Mactrix
Forum Replies Created
-
The problem is that this Sony VTR has really bad analogue
outputs. We just tested capturing over the component
outputs using a DeckLink Multibridge in uncompressed,
and it looks even worse than the native HDV.

The Miranda converts IEEE 1394 HDV to HD-SDI? This
should be better, but in my opinion you won't gain
much. Maybe the Miranda does a nice 4:2:0 to 4:2:2
chroma upsampling, but I would follow Graeme's suggestion
to convert in FCP … -
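On the 4:2:0 to 4:2:2 point above: both formats keep chroma at half horizontal resolution, so upsampling means restoring full vertical chroma resolution. A minimal sketch of what such a converter does, assuming plain linear interpolation between chroma rows (real hardware uses better filters, and the function name is mine):

```python
def chroma_420_to_422(chroma):
    """Double the vertical resolution of a 4:2:0 chroma plane.

    `chroma` is a list of rows of integer samples. Each input row is
    kept, and a new row is interpolated halfway to the next one
    (the last row is simply repeated).
    """
    out = []
    for i, row in enumerate(chroma):
        out.append(row)
        nxt = chroma[i + 1] if i + 1 < len(chroma) else row
        out.append([(a + b) // 2 for a, b in zip(row, nxt)])
    return out

# A 2-row chroma plane becomes 4 rows; inserted rows are averages.
plane = [[100, 100], [140, 140]]
print(chroma_420_to_422(plane))
# → [[100, 100], [120, 120], [140, 140], [140, 140]]
```

Whether a box does this or something smarter, the missing chroma detail can only ever be estimated, which is why the gain is limited. -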
A friend of mine is having the same trouble.
FCP 5 is very accurate now when capturing or
playing out to tape – except with HD-D5 …

We have now switched to the Panasonic RS-422 protocol
and tried several offsets, but this doesn't help.
The offset is never the same. They handle the
problem by rebooting and using a standard
offset. In one of three cases this works.

It seems that no one has tested HD-D5 yet, as it is
a rare system … -
interlaced where? on your computer monitor or video monitor?
-
Which VTR are you using?
Do you mean the starting and ending timecode or is there a general shift?
-
No, only downconversion.
For a high-quality and fast upconversion use
this freeware: https://www.squared5.com/svideo/mpeg-streamclip-mac.html

It's very fast and looks better than Shake's
new conversion feature. -
In your first post you asked about HD 1080i …
Here the pixels are square, with a full image size of 1920 x 1080.

HDV however is 1440 x 1080 using non-square pixels.
It's squeezed into a 4:3 frame. FCP handles HDV natively
at 1440 x 1080 with the corresponding pixel format. However,
QuickTime displays HDV .MOVs with square pixels,
and so does After Effects … you might lose a bit of quality
by resizing the footage in AE and back to FCP, but I
think this won't be visible. -
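The resize math above is easy to check. A small sketch (the numbers are from the post, the function name is mine):

```python
def display_width(stored_width, par):
    """Width a frame occupies on a square-pixel display,
    given its stored width and pixel aspect ratio (PAR)."""
    return round(stored_width * par)

# HDV stores 1440x1080 with a 4:3 pixel aspect ratio, so on a
# square-pixel display (QuickTime, After Effects) it spreads
# out to the full 1920x1080 raster.
hdv_par = 4 / 3
print(display_width(1440, hdv_par))  # → 1920
```

So the AE round trip resamples 1440 → 1920 → 1440 horizontally, which is where the (probably invisible) quality loss would come from. -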
Try this free tool as well:
https://www.dharmafilm.com/sebskytools
The BWF conversion works perfectly and
a friend of mine is just doing a film job
in Italy using this workflow with FCP … -
Yes, you're right that most filters in FCP were limited to 8-bit.
In version 5 the color correction filters were enhanced to 32-bit.
Anyway, I did all my tests in After Effects and Shake; FCP was
just for I/O … 🙂

On mactrix.org you can get an idea of the HD video. The whole
postprocessing was done in AE, and we tried to use a difference
key for the masking jobs … (better turn off the music, the German
rap is horrible).

Also, we shot in cinema gamma mode to get more range for
color correction, and that's how I tested the difference
between 8 and 10 bit inside a 16-bit project so intensively.
It was one of many other tests …

And yes, film transfer should be in more than 8-bit, but because
I tested DigiBeta and found that luma is 8-bit and chroma
something between 8 and 10, I would prefer file transfer such
as DPX … also because of its logarithmic color space. Instead of
DigiBeta I would try D5 … -
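To put numbers on the 8- vs 10-bit comparison: 10-bit gives four times as many code values per channel. A quick sketch quantizing a smooth ramp at both depths (illustrative only, not any particular codec's quantizer):

```python
def quantize(values, bits):
    """Map normalized [0, 1] samples to integer code values."""
    top = (1 << bits) - 1
    return [round(v * top) for v in values]

ramp = [i / 9999 for i in range(10000)]  # a smooth gradient

print(len(set(quantize(ramp, 8))))   # → 256 distinct code values
print(len(set(quantize(ramp, 10))))  # → 1024 distinct code values
```

Each extra bit doubles the number of steps, so 8 → 10 bit quadruples them; whether the difference survives tape, monitor, and eye is the real question. -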
Again, I am not talking about film. That is a totally
different story.

And the article refers to recording to tape. Tape is limited
to 8-bit with DVCPRO HD. We captured LIVE and TAPELESS
from the CCD straight to the 10-bit HD-SDI output.

And no, it's not a codec issue. The Mac has handled true 10-bit
for years. I can produce banding artefacts in seconds
with CG, but not with captured video. This has nothing
to do with internal rendering inside applications …

And there are 12-bit projectors out there for 40,000 USD.
We've tested some in a digital cinema … a DVI connection
with 8-bit showed no difference. It is marketing in most
cases. You're running your OS with 10-bit? Your graphics
card supports 10-bit? You're using the EIZO to proof it?
Your eyes can resolve the 1024 steps? You're using a
professional CRT, but not class 2 or 1?

This sounds like too much theory and brochure claims. It makes
no sense to discuss this in a forum without running tests
on your equipment together … -
I don't trust computer monitors for several reasons:
1) Computer gamma is different from video gamma.
2) The operating system and graphics card are limited to 8-bit.
3) Almost all CRT and LCD displays can't even show the
whole 8-bit range, depending on the monitor profile. So
there is always an interpolation of color values.

Because we are talking about video, we should look at
a video monitor … class 1 of course. You want to say
there are differences between 8- and 10-bit? If one day
there are displays that can show more than 8-bit, then
we would profit from that, but I would already be happy if
they offered true 8-bit …

And the example with the horizon is again one
you find in every manufacturer's brochure … I never had
a situation with real-life shots where this mattered.
The video noise, as low as it may be, is one reason
why you never get those kinds of gradients like in
computer graphics. Apply a dither or jitter to a CG
gradient and you will see the same result. It's not the
technical point! 10-bit gives four times the steps of 8-bit,
but it's our eye … so easy to outwit.
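The dither/jitter point is easy to demonstrate. A sketch on a very shallow CG gradient, assuming plain random dither (not error diffusion; all names are mine):

```python
import random

def quantize(v, top=255):
    """Hard 8-bit quantization of a normalized [0, 1] sample."""
    return round(v * top)

def quantize_dithered(v, rng, top=255):
    # Add up to half a code value of noise before rounding, so the
    # output flickers between neighbouring codes instead of banding.
    return max(0, min(top, round(v * top + rng.uniform(-0.5, 0.5))))

rng = random.Random(0)
gradient = [0.500 + 0.000002 * i for i in range(600)]  # barely changing

plain = {quantize(v) for v in gradient}
dithered = {quantize_dithered(v, rng) for v in gradient}
print(sorted(plain))     # a single hard code value: a visible band
print(sorted(dithered))  # neighbouring codes mixed together
```

The noise in a real camera signal acts like that built-in dither, which is why captured footage rarely shows the clean bands that synthetic gradients do.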