Charles Caillouet
Forum Replies Created
-
Well, I still don’t know why it didn’t install, but I was able to copy an earlier version from PS CC2014 and it seems to work.
crc
-
Darren,
I just got a 1 M/E panel in and loaded the 3.0 Beta 1 Mac software and firmware update.
Thanks to BMD for building this unit. It provides a lot of performance in a small package.
Everything seems to work as expected in this rev, with the exception of cuts in the A/B Direct mode.
While Auto transitions keep the same source on each bus and change the color of the active source button, Cut transitions still toggle the busses. It is very confusing to have Cuts and Auto transitions behave differently.

One thing that I miss on the ATEM is a drop shadow on keys. While it is possible to create Media Player graphics with alpha channels, or to feed live key channels along with fill sources into the switcher, the small number of inputs on the 1 M/E makes it difficult to make simple live-changing supers (like lyrics from a presentation package) stand out. On the upstream keyers, it is also possible to use chroma keys to create shadows from properly formatted graphics on the fly, but not downstream. The best solution that I have so far is a 10% luma level shadow on the incoming graphics and the clip set tight for downstream keys. Then I can set the gain to soften the edge.
A few simple options for outline, thin and thick drop shadow would go a long way toward improving the keys.
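To illustrate how clip and gain shape a luma key (a sketch of the general technique only, not the ATEM’s exact internal math; the function name is mine):

```python
def luma_key_alpha(luma, clip, gain):
    """Soft luma key: alpha ramps from 0 to 1 as luma rises past the clip level.
    A tight clip with high gain gives a hard edge; lowering the gain widens
    the ramp and softens the edge of the key."""
    alpha = (luma - clip) * gain
    return max(0.0, min(1.0, alpha))

# Luma below the clip level is fully transparent, well above is fully opaque
assert luma_key_alpha(0.30, clip=0.40, gain=10.0) == 0.0
assert luma_key_alpha(0.60, clip=0.40, gain=10.0) == 1.0
# A lower gain leaves the same mid-level pixel partially transparent (softer edge)
assert 0.0 < luma_key_alpha(0.45, clip=0.40, gain=4.0) < 1.0
```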
Thanks, again.
C. R. Caillouet
Vision Unlimited/LA
HD Production Technical Support since 1987
…searching for the right tool for the job… -
Kristian,
Thanks for the response.
That’s about what I had figured out.
And the settings hold well. I swapped cables back and forth between two units on my last job. I will just have to add a USB switch to the mix.
crc
-
George,
Thanks for that complete report.
Here is a related event that I can offer.

I was on a job a few weeks ago and we were using four Tripp-Lite power strips, two of which were plugged into a third one and the fourth one straight into another outlet. Both strings were apparently on the same circuit. After checking out the temporary installation, we went home for the day, but a couple of hours later we got a call from someone in the office near the installation that she smelled something burning, saw some smoke coming from one of the strips, and unplugged the whole thing. Post mortem showed that filter caps in all four strips were blown out; the strips were still working, but presumably the surge protection function was gone. Nothing else in the office was damaged, not even a printer and phone plugged into one of the strips.
I suspect that the failure of these surge protectors was related to some interaction among the components but it is still not clear to me what really went on. We did not have power monitoring so we don’t know if there was a surge but all indications are that there was not. We were on shore power and apparently nothing changed. There were no known thunder storms in the area.
This brings up a couple of issues:
First, the safety office at the large facility where this event occurred recommended that power strips not be cascaded, even if the total load is well under the breaker ratings in the strips. They were a bit vague about the reasoning but were adamant that it was against their policies. After some discussion, we found that they had experienced a similar unsolved event some time prior to ours, so they didn’t give us a hard time; they just recommended that we not do that. Kind of “if it hurts when you laugh, don’t laugh” advice.
Second, in the past, I have found that surge protectors often are the source of ground current leakage, which is a big pain in analog video systems. It is second only to neutral/ground reversals in causing ground loops and audio hum. So I would leave your surge protectors at home and get some good quad boxes with no electronics in them; you can cascade those till the breaker blows. Of course, don’t overload the circuits and don’t use underrated wire or receptacles when you draw heavy loads.
Anyone else have experience on this subject?
cheers,
C. R. Caillouet
-
Thanks for the response.
I am on the road, so it will be next week before I can test it again.
The projector is a Panasonic PT-AX200, 1280 x 720.
crc
-
I have an update to the first post.
Apparently, the power supply that came with the HD Link Pro cannot support it. With the Multibridge power supply, the HD Link Pro seems to work. I was able to get LUT-corrected video from the SDI out spigot, which is what I was trying to get when I started this project.

I am still having problems with the HDMI output. It seemed to work fine before I updated the firmware, but now it doesn’t talk to my projector.
Can anyone give me an update on what might have changed in the behavior of the DVI/HDMI port between the previous rev and this one, and what I should do to make it work? I notice that the EDID override switch in the prefs is no longer accessible.
Thanks,
crc
-
Gary,
The gamma stuff is all useful, but I read the initial post, and Ben mentioned output from Sorenson Squeeze. I spent a lot of time last year working with Squeeze on a project, and I found problems way beyond monitor gamma. There are related problems in many apps that are obviously interpreting gamma and/or levels incorrectly for one reason or another. My problems were with 720P video in various codecs. I finally created an FCP output filter set to scale the video down for export, and a corresponding set to rescale on input to Squeeze encoding. It was a trial-and-error process. I found a similar problem with the MainConcept MPEG encoder for Mac, but on a much smaller scale – just a minor, but visible, gamma change.

These problems were verified by decoding and evaluating the video from the files, not just looking at them on monitors. So the errors that Ben is talking about could be real processing errors and not just monitoring ones.
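For illustration only (the 16–235 numbers are the standard 8-bit video-range code values, not necessarily the exact settings in my filters), a levels rescale like the one described is just a linear remap between full range and video range:

```python
def scale_to_video_range(v, black=16, white=235):
    """Remap an 8-bit full-range value (0-255) into video range (16-235)."""
    return black + v * (white - black) / 255.0

def scale_to_full_range(v, black=16, white=235):
    """Inverse remap: video range back to full range."""
    return (v - black) * 255.0 / (white - black)

# Applying the export filter and then the matching input filter cancels out,
# so the encoder sees the levels it expects without a net shift.
for code in (0, 64, 128, 255):
    assert abs(scale_to_full_range(scale_to_video_range(code)) - code) < 1e-9
```

If an app misreads the range in only one direction, that round trip fails, which shows up as exactly the kind of level/gamma shift described above.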
cheers,
crc
Vision Unlimited/LA
Prairieville, LA
HD technical support since 1987
…searching for the right tools for the job -
> …graphics created in After Effects for my 720p/23.976fps stutters in my sequence. …
> You can really tell with the graphics because there is a lot of panning and camera movements. Any ideas how I can smooth this out?

Jeremy,
When cinematographers shoot at 24 fps, they have to take care with camera movement because of the slow update rate. When you shoot at 60 frames or fields per second, the eye sees motion as continuous from frame to frame, with no “blank” time between captured frames. This is a result of keeping the shutter off (fully open) for most 60 fps capture; you capture all the motion during each frame.

If you don’t use a shutter when shooting at 24 fps, two things happen. The blur in any given image is increased, because the exposure time is increased from 1/60 to 1/24 second. And the clip doesn’t have the same look as a typical film clip, where the shutter is typically 1/48 second. Obviously these two results are related: the shorter the exposure time for each frame, the less blurry the image, but the larger the jump between the motion captured in each frame. You don’t capture all the motion that occurred in that frame time.
So you have to decide how much blur to add. In the camera, you select a shutter speed or shutter angle. In AE you have to decide how much motion blur to crank in.
Then you control the speed of the camera motion to impart an emotion to your audience. Traditional video has not had all the options of film in this area because we did not have such a wide range of exposure times available to us. We couldn’t practically go longer than 1/60 second in most cameras. That also meant that we had fewer issues with camera motion. If we did a swish pan, the result was almost continuous motion blur, not judder.
In 24 fps, if you do a swish pan, you can decide whether you want it to blur like video (360-degree, or 1/24 second, shutter) or judder harshly (a narrower or shorter shutter). The judder can make a viewer very uncomfortable, and that can be very useful in a thriller, or very distracting in a love scene.
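To put numbers on the shutter math (a minimal sketch; the function name is mine, not from any camera spec):

```python
def exposure_time(fps, shutter_angle=360.0):
    """Exposure per frame: the fraction of the frame interval the shutter
    is open. 360 degrees means no shutter (the whole frame time)."""
    return (shutter_angle / 360.0) / fps

# 24 fps with no shutter (360 degrees): the full 1/24 s of motion blur
assert abs(exposure_time(24, 360) - 1 / 24) < 1e-12
# 24 fps with a film-style 180-degree shutter: 1/48 s
assert abs(exposure_time(24, 180) - 1 / 48) < 1e-12
# 60 fps with no shutter: 1/60 s, the traditional video look
assert abs(exposure_time(60, 360) - 1 / 60) < 1e-12
```

Narrowing the shutter angle shortens the exposure, so each frame is sharper but the gap in captured motion between frames grows, which is where the judder comes from.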
You have the power.
crc
-
>I was wondering how many P2 cards and/or P2 Stores
>were assigned to each shooter per day?
Each of the four HPX2000 crews started out with about 20 cards.
They estimated that they would shoot between five and ten a day.
That gave them time for the cards to be transferred at a checkpoint and returned to the crew, as well as allow for logistic problems.
There are no P2 stores on the trail. The idea is to keep moving parts to a minimum.
The HVX200 crews got about the same number because they could record in 24PN mode, which fits the same amount of 24P on a card as the HPX2000 records without PN mode.
The HPX2000 should have PN this summer, but the early models do not.

> And what format was being shot?
All the HPX2000 footage is 24P, ingested through P2 Drives onto a 24P timeline, with duplicate frames removed.
The HVX200 footage is 24PN, with the advantage of no duplicate frames to start with.
24P is working well for most of the motion sequences. They can shoot 60/24 to preserve quick action. The 24P format is especially useful for the web uploads because it reduces the loading on the encoder, the servers, and the viewing machines.
Editing is done on MacBook Pros with FCP, to mirrored G-RAIDs through 1394b interfaces.
crc
-
aaronowen,
I am sorry that your thread turned sour, and I hope that you got your problem solved. I have not been monitoring the forum for a while, but I ran across this discussion today and thought that it deserved some clarification. So here are some comments on the misunderstandings running rampant in this and other threads about 720p24.
1) 720P23.98 as a fake format
It is a real format in Final Cut Pro and in Quicktime files.
It is, in fact, defined along with 720P24 in ATSC Table A3 (which was not approved by the FCC).
It is not defined across firewire or HDSDI or in any SMPTE standard and no video recorders can handle it.
You just have to move it around in ways other than those two, or convert it to some other format.

2) 720P23.98, or 720P24 (the common terminology for 720P23.98, which corresponds to 24 when the camera is running at a system frequency of 59.94), does not use advanced pull down or 2:3:3:2. That is a cadence used for interlaced video to make it easier for non-linear editors to pull out the duplicate fields, assuming that they know how to deal with it. Some don’t.
Also, advanced pull down should never be used for distribution, only from a camera to the editor, and only then, with care.
At any rate, it has nothing to do with progressive recordings, so that is a red herring in this discussion.

By the way, most recordings are made at 59.94 or 23.98 to simplify editing and audio layback, even if a film-out process is anticipated. The conversion to 24.00 is made at the very end of the chain. When I write 24, 30 or 60, I mean 23.98 (actually 23.976), 29.97 and 59.94.
3) The metadata that you refer to is simply a string of bits in the user data part of the SMPTE time code associated with the clip in question. It is carried with the time code numbers but the editor needs to be cadence-aware to know to put it in a time code stream.
4) When you drop a clip onto a sequence track, you retain a link to the original time code for edit decisions but you assign a new time code for the finished sequence. If the sequence is 60 (or 59.94) fps, then a cadence is implied for all 24 fps material included and it is incumbent on the editor to keep track of the 2,3 cadence. It is possible to do this on a 60 fps sequence but it is not trivial. The accepted standard for doing this is to keep track of “A” frames, the start of the 2,3 cadence.
Many editors and software utilities know about A frames and can extract a 24P sequence when the cadence is intact (no edits that change the cadence), even without other flags. This is commonly referred to as “reverse telecine” and has been used for film transfers through telecines. A smart application can even look at the images and detect the cadence by finding repeated frames.
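As a rough sketch of the idea, frame-level as with 24p carried in 60p (the function names are mine, not from any editing package):

```python
def pulldown_2_3(frames_24):
    """Expand 24p frames to 60p by repeating frames in a 2,3 cadence.
    The 'A' frame is held for 2 ticks, the next for 3, and so on:
    4 unique frames become 10 output frames."""
    out = []
    for i, frame in enumerate(frames_24):
        out.extend([frame] * (2 if i % 2 == 0 else 3))
    return out

def reverse_telecine(frames_60):
    """Recover the unique 24p frames, assuming the stream starts on an
    'A' frame and no edits have broken the 2,3 cadence."""
    out, i, hold = [], 0, 2
    while i < len(frames_60):
        out.append(frames_60[i])
        i += hold
        hold = 5 - hold  # alternate 2, 3, 2, 3, ...
    return out

clip = ["A", "B", "C", "D"]
sixty = pulldown_2_3(clip)
assert sixty == ["A", "A", "B", "B", "B", "C", "C", "D", "D", "D"]
assert reverse_telecine(sixty) == clip
```

This is exactly why edits on the 60 fps sequence matter: a cut that does not land on an A frame shifts the cadence, and a simple fixed-pattern extraction like this one no longer lines up with the duplicates.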
If you convert the clip to 24 (or 23.98), through any number of methods, then you can drop the clip on a 24 fps sequence track without regard for cadences. All cadence information is lost in a 24 fps sequence because there is no cadence. Even the off-speed effects clips have been converted before they are dropped into the sequence.

5) If you want to export the sequence at 24 fps, you have several options.
One is to keep it in file form as 720p24, without cadence issues. You can move this over a network or on a drive to another system or application that knows about that format and the compression scheme that is used to represent it. This works for compressed or uncompressed clips. The receiver just has to know how to decode it.
Another is to export it as a 1080p24 clip to a recorder that can handle that format. The most obvious are HDCam and D5. Note that not all HDCam and D5 decks can record and play 24 fps. 1080P24 does not contain a pull down cadence, only the unique frames. It may be actually recorded and displayed as segmented frames for operational reasons, but there are no duplicate fields or frames.
You can also choose to recreate a cadence and export the clip as 720p60, 1080i30 or 480i30. When you do this, you are creating a new cadence, not retaining the original one.

6) The comment about many workflows is very true. There are as many as there are users and edit systems. Panasonic will suggest some options based on your configuration, but when you mix systems, you are required to learn something about the capabilities of the systems and the signals that can move among them.
Ranting on either side is not helpful.
cheers,
Vision Unlimited/LA
Prairieville, LA
HD production technical support since 1987
…searching for the right tools for the job…