This article originally appeared in Creative COW Magazine’s August 2010 issue, Divergence.
A South African team applies the robotic motion-control expertise it has gained in live-action production to stop-motion shooting of stereographic motion plates on a 14:1 miniature set.
Reflex Motion Control had been thinking about 3D, and how we could use motion control in the production process, but we had never had the opportunity to work on a project with a client who had a vision of what they wanted, and a production company able to execute it.
When the opportunity came, it was through Cape Town-based animation and post specialists BlackGinger, and production company Shy the Sun, who had been contracted by Ogilvy Johannesburg to develop two scripts for M-Net, the major satellite pay TV provider in South Africa. These were to be two 60-second spots that would air in theaters before 3D feature films, one called Firefly, the other Ladybug. Firefly was a night scene where a firefly’s source of light (a lamp) goes out, and he goes on a search through a forest for a new light. Ladybug was a similar idea, but in daylight.
The producers found that producing photorealistic backgrounds would take way too long, and be very expensive. Shooting miniature sets was going to be much more cost effective.
Our job was to create a 3D motion path for the background plate, as we followed this little firefly through trees and under brambles and bushes. We would export the path we created as three-dimensional data that identified keyframes and some of the frames between them. The move was pulled into Houdini, where it was evaluated, and any changes decided.
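As a minimal sketch of what that exported path data might look like, the snippet below serializes keyframes as plain text. The field layout, names and format here are assumptions for illustration, not the actual interchange format used between the rig and Houdini on this production.

```python
# Hypothetical sketch of exporting a motion-control path as keyframe data.
# The 'frame tx ty tz rx ry rz' layout is an assumption, not the real format.

def export_path(keyframes):
    """Serialize keyframes as lines of 'frame tx ty tz rx ry rz'."""
    lines = []
    for kf in keyframes:
        frame, pos, rot = kf["frame"], kf["pos"], kf["rot"]
        lines.append(" ".join(str(v) for v in (frame, *pos, *rot)))
    return "\n".join(lines)

path = [
    {"frame": 1, "pos": (0.0, 1.2, 0.0), "rot": (0.0, 90.0, 0.0)},
    {"frame": 25, "pos": (0.5, 1.1, -0.3), "rot": (0.0, 85.0, 0.0)},
]
print(export_path(path))
```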
To create our path, we used a single Canon 5D Mk II on a motion control arm to photograph one frame at a time, first one eye and then the other. We used motion control to precisely move to each position. This allowed us to also animate convergence, and to create a result that we believe is groundbreaking.
Shooting motion control with a Canon 5D proved to be the best way to efficiently shoot what ended up being more than three weeks of 18-hour days.
Here is the story of how it came together…
ABOUT REFLEX MOTION CONTROL
South Africa has one of the oldest film industries in the world, but as recently as 13 years ago, motion control didn’t exist here.
We looked at the technology that was available in those days, and we made the leap. Since 1996, Reflex Motion Control has been South Africa’s only permanently based motion control provider, and has worked on numerous TV commercials and some features too. Reflex runs its own rigs but has also partnered with VFX London to hire the MILO and MINI rigs to the South African market.
We had seen in the UK that motion control rigs were mainly film-based. Video was something to be avoided. However, all of our new directors were cutting their teeth on video, first Betacam, and later Digi-Beta. We chose to look at a technology that would work equally well with film and with video.
As we began, it was clear that motion control had gotten a bad name. Many of the rigs weren’t reliable, and failed on set for one reason or another.
This is why we looked at normal robotics. We chose the kind of robot arm that you have seen welding cars around the world, because those robots are bulletproof, built for working in factories under terrible conditions, 24 hours a day, 365 days a year.
We had to modify our robot to make it more suitable for the film industry: some hardware and software tweaks, and also compensating for the fact that these things tend to get bolted down in factories. For example, we had to figure out how to make them portable, and how to quickly change rigs for a shoot.
Shooting Notes From DP Johan Horjus
The M-Net job was a great challenge, and also very rewarding.
My first concern was defining the area in focus. We were shooting at a scale of 14:1, and then later life-size action. The size of the 5D chip gave me a wider angle of view, which helped me get wider and closer. This was very helpful when shooting at the 5D’s maximum resolution.
Even on a wide lens, focus is very critical when you are so close to objects. To avoid the possibility of exposure drift, I used Zeiss prime lenses, and taped down the focus and iris rings. I also used longer exposure times, which allowed me to close the iris and increase my depth of field.
In fact, between the long exposure times and the wait to transfer data to the computer, we had to be careful that the robot arm didn’t command the Dragon software to move to the next frame too quickly. Like Peter, I believe that faster data transfer would be a great help.
The only other issue we had was minor. Once we had done our set-up and planned all our passes, the arm and camera started their work, slowly clicking away. After one or two passes, the camera would just stop and display a miscellaneous error message. We simply had to wait a bit, then start it up again.
That was it. The 5D has a great relationship between the lens and chip that produced great stills. I did not have to push the dynamic range, but from previous use, I know that its ability to do that is impressive.
The 5D was great to use and I achieved great results using it.
The big hurdle that we needed to get over for these M-Net commercials was to establish what the parameters were that would make for successful 3D.
The CG foreground elements were to be provided by animation experts BlackGinger and Director Ree Treweek (Shy the Sun). VFX Supervisor Hilton Trevis (BlackGinger) and Director Jannes Hendrickz (Shy the Sun) supervised the 3D stereo shoot of the backgrounds, and we allocated a full week of testing before shooting commenced.
The two parameters that we had to focus most carefully on were interocular distance and convergence. We had to determine where we wanted the pieces to appear, whether in front of the zero plane using negative parallax, or if we wanted them to go behind the screen.
This was more complicated than usual, because we were working on 14:1 scale sets. So, traditional thinking is that about 65mm is normal interocular distance, the distance between adult eyes. Divide that by 14, and it comes to about 4.5 mm IOD for the shoot. In the end, we were shooting IODs in the 2 to 3 mm range, because at 4 to 4.5, the stereoscopic effect was actually a little bit too much.
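The scaling arithmetic above is simple enough to sketch. The 65mm eye spacing and 14:1 set scale are from the text; the code itself is just illustrative.

```python
# Scaling interocular distance (IOD) for a 14:1 miniature set, as described.
# 65 mm is a typical adult eye spacing; dividing by the set scale gives the
# naive starting point for the shooting IOD.
ADULT_IOD_MM = 65.0
SET_SCALE = 14.0

scaled_iod = ADULT_IOD_MM / SET_SCALE
print(round(scaled_iod, 1))  # ~4.6 mm; the crew ended up shooting 2-3 mm
```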
We had a number of people and all the gear in the studio to check this out as we ran the test. We were able to play the footage after each set of shots, and see what everybody thought. As their brains tried to resolve it, some people found it uncomfortable. Some couldn’t even watch.
I think that it is important to understand that it takes care to create 3D that works well. You’ve got to be conservative, because 3D can be really good, but get it wrong and it can be very bad. The test phase was invaluable. We had to change the variables to get the correct interocular distances and peak convergences for each shot. We experimented until we knew the parameters to run for the shoot, and then we started laying down the actual footage.
The first few times, we actually tried to fire frames manually. The DP was firing, and I was moving the arm from camera left to camera right, and then back to camera left for the next frame. After only 50 frames, we lost track of where we were!
Most sequences were over 300 frames each, so we decided to automate the process, a critical decision. It made the production run much more smoothly, and it allowed the crew to concentrate on the creative aspects of the shot.
The bracket that attaches the 5D to the motion control rig lets you reference the nodal point of the camera lens in the rig, so the rig can then execute nodal pans, tilts and rolls. This allows one to easily change camera or lens. We sometimes had to undersling the Canon 5D to get really close to the set – a quick reprogram of the nodal point enabled this.
More difficult was getting the 3D graphics platform to talk to our robot. It’s not like they spoke the same language, and “Here we go!”
The mathematics experts at BlackGinger wrote scripts to do this, and we turned to software called Dragon Stop Motion to run the camera, using a hardware keypad for control, combined with Dragon’s DMX controller as the hardware interface with our robot arm.
This system can be configured in one of two ways: either Dragon tells the rig when to capture a frame and move to the next camera position, or the rig controls the software. We chose the latter because it was more practical for us. The rig moved the camera – frame 1: left, frame 1: right, frame 2: left, frame 2: right, and so on. It fired the camera at each of these positions, and then waited for Dragon Stop Motion to capture the frame.
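The rig-driven capture order can be sketched as a simple loop. All function names here (`shoot_pass`, `capture`) are hypothetical stand-ins for the robot-arm and Dragon Stop Motion interfaces, which are not public.

```python
# A sketch of the rig-driven left/right capture order described above.
# 'capture' stands in for: move the arm, fire the camera, wait for Dragon
# Stop Motion to confirm the frame was written.

def shoot_pass(num_frames, capture):
    """Capture left then right eye for each frame; returns the shot order."""
    order = []
    for frame in range(1, num_frames + 1):
        for eye in ("left", "right"):
            capture(frame, eye)
            order.append((frame, eye))
    return order

log = []
shoot_pass(2, lambda f, e: log.append((f, e)))
print(log)  # [(1, 'left'), (1, 'right'), (2, 'left'), (2, 'right')]
```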
Instead of capturing to the internal CF card, we ran a USB cable out of the 5D, and captured to an external drive. We chose hi-res JPEGs as our format, because RAW files were a bit too big, and would have made things even more tedious. As it was, each file took on average 7-10 seconds to be written to disk before we could fire the next frame.
In fact, if we had any major request to make of Canon, it would be for higher data transfer speeds.
Production Details From Shy the Sun
New options were made possible by the decision to shoot stop motion, using a digital stills camera mounted on a motion control rig that was programmed to shoot one eye at a time, repeating the exact move to shoot multiple passes.
The practical passes included: a few matte passes, allowing us to take out the back blue screen, and have the characters move in between set elements; a firefly light pass for the light being emitted by the fly’s tail on the direct environment; a moonlight pass to imitate the light of the moon; and a UV pass for the UV lights seen around the pond.
A few additional passes were shot depending on the actual specific scene set-up, and what it required. For example, the last shot of the firefly had 9 different passes, which took an hour and a half each to shoot, adding up to two days of shooting with set-up times included.
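The per-pass timing follows directly from the numbers in the text: roughly 300 frames per sequence, two eyes per frame, and a 7-10 second write per exposure. The midpoint value below is an assumption; the exact totals naturally varied shot to shot.

```python
# Rough arithmetic behind the "hour and a half per pass" figure.
frames = 300          # typical sequence length, per the text
eyes = 2              # left and right stereo positions per frame
write_seconds = 8.5   # assumed midpoint of the 7-10 s file-write wait

minutes = frames * eyes * write_seconds / 60
print(round(minutes))  # ~85 minutes, consistent with "an hour and a half"
```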
Besides the actual shooting equipment and sets that we assembled in BlackGinger’s basement, there were a few computers dedicated to specific functions crucial to the pipeline. BlackGinger wrote a script that allowed us to take the data of the manually programmed motion control rig, and feed it into Houdini. Within Houdini we were able to finesse the curves and ease-in/out points of the path. We would then feed the data back into the motion control rig and perform the new, smoothed move.
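In the spirit of that curve finessing, here is a minimal sketch of easing a move between two keyframed positions. The actual BlackGinger Houdini scripts are not public; the smoothstep easing used here is a generic stand-in for the ease-in/out adjustments described.

```python
# Generic ease-in/ease-out resampling of a linear move, as an illustration
# of smoothing a motion-control path. Not the production code.

def smoothstep(t):
    """Ease-in/ease-out interpolation factor for t in [0, 1]."""
    return t * t * (3 - 2 * t)

def ease_move(start, end, steps):
    """Resample a straight move into an eased sequence of positions."""
    return [start + (end - start) * smoothstep(i / (steps - 1))
            for i in range(steps)]

positions = ease_move(0.0, 100.0, 5)
print([round(p, 1) for p in positions])  # [0.0, 15.6, 50.0, 84.4, 100.0]
```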
STOP-MOTION AND 3D
In the old days, 3D feature films were a novelty. Now they are becoming something a little bit more than that. And with 3D TVs coming into the home, what was once the race for hardware is now becoming a race for content: features, documentaries, educational programs and more. The animation guys are going to love this, but I think that producers will also find the benefits of shooting miniature sets for stereoscopic backgrounds, not just to reduce costs, but to add creative options.
You can do so much more with real sets and motion control – lighting passes, blue screen passes, and so on – allowing the guys in post to add even more value.
There are wonderful artists doing miniatures, such as Steven Saunders, who worked on this project, and I hope they are going to see a lot more business now. And from my side, obviously we would like to tap into that as well.
As we start to move into 3D live-action shooting, I do not yet see the 5D having as big an impact for us, as motion control rigs can be adapted to carry full-size stereo camera rigs. However, for miniatures, it has been amazing. Working as closely as we did with the miniature set, and the tedium of moving from frame to frame, something like a Red camera would have made no sense, even if cost had not been a factor.
Of course budget was a factor, but it was the only compromise we had to consider. The Canon 5D’s size worked in our favor, and it was able to accommodate high-quality lenses that we were easily able to rent, and we got amazing images at 21MP – more than enough for the guys in post to use for their 2K project. The compromises that some motion picture makers have to deal with, such as having no timecode, were not an issue for us.
For stop motion, though, the 5D was a dream for us. That’s why I believe that, with our Reflex/BlackGinger system, we have a complete working solution for the 3D stop-motion market. In fact, our companies are currently busy working on a joint venture to offer a packaged solution that will quickly enable a complete 3D stop-motion studio system to be set up anywhere in the world, with minimal equipment freight costs, and local support for the robotics hardware.
Reflex Motion Control
A recent project we shot was for Invisible Children, recreating part of a scene in which, years ago on Christmas Day, the congregation of a small rural church in Uganda was burnt alive by rebel forces.
The shot started on the burnt church, and as the camera tracks out, one goes ‘back in time’ to see the people, in slow motion, getting up from the pews and running to escape, only to have the doors shut in their face. The shot ends with the rebels setting the whole place alight. Here, Reflex’s motion control rig was used to shoot with a Vision Research Phantom at 300fps, tracking out of the church as it was being set alight. It was very emotional to shoot, but the footage we saw off the Phantom was awesome!
Directors Laren Poole and Jason Russell, and DP Yon Thomas, were all from California. Invisible Children is an important project, and we were privileged to work with them in South Africa.
On Peter Constan-Tatos…
As Peter tells Creative COW, “After many years of traditional electrical engineering experience, primarily in nuclear power station construction and high voltage cable design, I followed my then-part time interest in robotics and photography to eventually establish ‘Reflex Motion Control’ with friend Campbell Stuart-Watson, who had extensive video production experience.” Over the past dozen or more years, they have run South Africa’s only locally based motion control service.