Early in the planning stages of “The Longest Night,” we realized there were going to be some big changes to how we approached production. Typically we work with animated CG characters and environments: we can dictate every action, and we control a camera with an essentially unlimited range of motion.
With this production we knew we would need to capture live performances and integrate that footage into some sort of digital environment. An example of what we thought would stylistically work with the look and feel of a typical Paperhand Puppet Intervention show is the Modest Mouse video “Float On”.
The flat, stylized treatment of that video seemed flexible in the sense that it didn’t have to make physical sense and could be executed with static-camera green-screen shots. The footage could be mapped onto flat cards and moved around in the scene. We shot some test footage and made a quick proof of concept.
We found this technique to be functional but limiting.
We eventually realized that we could put movement into the green-screen footage and match-move it. Match-moving is a post-production technique in which software tracks feature points across the footage and uses them to reconstruct, digitally, what the camera was doing in real life. We could then use that data to animate a fisheye camera in 3D space, letting the footage relate to the camera in more dynamic and realistic ways. This was a theory that we decided to test out.
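To make the idea concrete, here is a toy numpy sketch of the core math behind match-moving. It is an illustration only, not the pipeline we actually used (real match-move software does full feature tracking and bundle adjustment): given points tracked between two frames of a purely panning camera, the two frames relate by the homography H = K·R·K⁻¹, so solving for H recovers the camera rotation. All the numbers (intrinsics, pan angle, scene points) are made up for the example.

```python
import numpy as np

def project(K, X):
    """Project 3D points (N,3) through intrinsics K; returns (N,2) pixels."""
    p = X @ K.T
    return p[:, :2] / p[:, 2:3]

def solve_homography(pts1, pts2):
    """Direct Linear Transform: homography mapping pts1 -> pts2 (up to scale)."""
    rows = []
    for (x, y), (u, v) in zip(pts1, pts2):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 3)

# Assumed camera intrinsics: focal length in pixels, principal point.
K = np.array([[800.0, 0, 640], [0, 800.0, 360], [0, 0, 1]])

# Ground-truth camera move: a 5-degree pan about the vertical axis.
a = np.radians(5.0)
R_true = np.array([[np.cos(a), 0, np.sin(a)],
                   [0, 1, 0],
                   [-np.sin(a), 0, np.cos(a)]])

# Static scene points in front of the camera (stand-ins for tracked markers).
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(-2, 2, 12),
                     rng.uniform(-2, 2, 12),
                     rng.uniform(4, 8, 12)])

pts1 = project(K, X)             # "tracked" points in frame 1
pts2 = project(K, X @ R_true.T)  # the same points after the camera pans

# Solve the frame-to-frame homography, then peel off K to recover the rotation.
H = solve_homography(pts1, pts2)
R_est = np.linalg.inv(K) @ H @ K
R_est /= np.cbrt(np.linalg.det(R_est))  # fix the unknown scale and sign

print(np.round(R_est, 4))
```

The recovered `R_est` matches the true pan, which is the data a match-move solver hands off to animate the matching virtual camera. A real shoot adds camera translation, lens distortion, and noisy tracks, which is why dedicated solvers use far more elaborate optimization than this two-frame sketch.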
You can see the final result of our test below.
Below is an earlier version where you can see the original footage before it was keyed out.
We matched the real-world footage to a digital background. This rough test became the foundation we would build on in designing the rest of the show.