When in Dome…

The Morehead Planetarium & Science Center Production Blog

Archive for August, 2009

Looks like the new version of Final Cut supports 4K resolution and RED camera footage natively.

http://www.apple.com/finalcutstudio/finalcutpro/digital-cinema-workflows.html

If we weren't already using After Effects for our final edit, I'd move us over to Final Cut, since we're doing our sound design with Logic and Soundtrack. But we may soon have access to a RED camera, so it'll be nice to pull footage into Final Cut for editing. I'd love to hear from people who have used Final Cut for their 4K footage. What do you think?

The full schedule for this year’s DomeFest is up on the ArtsLab website:

http://artslab.blogspot.com/2009/08/domefest-returns-to-new-mexico-925-27.html

September 25-27 at the University of New Mexico in Albuquerque. We’re going to miss it this year due to budget cuts, but we’re already in the planning stages for Morehead’s first entry into the competition the year after.


We've been running into some issues stitching together frames that have varying opacity, namely clouds and particle systems. Originally, when using a sequence of PNGs, we'd find ourselves with a seam around the stitched border. This was due to the alpha values being added together at the seam line, creating a 1-2 pixel border with a combined opacity greater than that of the pixels around it.

[Image: badseams2]
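
To make the arithmetic concrete, here's a minimal sketch in NumPy, with made-up 50%-opaque tiles and a one-pixel overlap, showing how the summed alpha produces that brighter seam:

import numpy as np

# Two neighboring tiles of a semi-transparent cloud, each 50% opaque.
left_alpha  = np.full((4, 4), 0.5)
right_alpha = np.full((4, 4), 0.5)

# Stitch into a 7-pixel-wide strip with a 1-pixel overlap at column 3.
stitched = np.zeros((4, 7))
stitched[:, :4] += left_alpha          # columns 0-3
stitched[:, 3:] += right_alpha         # columns 3-6
stitched = np.clip(stitched, 0.0, 1.0)

print(stitched[0])
# [0.5 0.5 0.5 1.  0.5 0.5 0.5]  <- the overlap column is twice as opaque,
# which reads as a visible seam line on the dome.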

I realized the problem came from the stitching software not understanding the alpha channel, and that if I controlled the alpha myself rather than leaving it to the code, I could remove that variable from the equation. So by outputting an opaque color pass and an opaque alpha pass, I could use one to cut out the other as a luma matte in After Effects.

[Image: opaque_color1]

[Image: opaque_alpha1]

[Image: aftereffects_menu1]
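
For anyone curious what that luma-matte step amounts to numerically, here's a rough sketch in Python, assuming the two stitched passes have been saved out as images; the file names are placeholders, not our actual setup:

import numpy as np
from PIL import Image

# The stitched opaque passes: RGB color, plus the alpha rendered as a
# grayscale (luma) image.
color = np.asarray(Image.open("stitched_color.png").convert("RGB"))
alpha = np.asarray(Image.open("stitched_alpha.png").convert("L"))

# The luma matte: the alpha pass becomes the transparency of the color pass.
rgba = np.dstack([color, alpha])
Image.fromarray(rgba, mode="RGBA").save("stitched_rgba.png")

Since both passes are fully opaque when they go through the stitcher, it never has to interpret an alpha channel at all.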

This removes the seam issues and leaves an alpha channel that can be independently manipulated.

[Image: noseams1]

True, this creates more files, but it really doesn't increase render time, as the alpha information is calculated during a render anyway and either mixed into a 32-bit frame or simply discarded in a 24-bit frame. And if you select Alpha Split in the .tga file setup when outputting, then rather than discarding that information it will save it as "A_[filename].tga", giving you the two opaque frames you need for stitching.

[Image: alpha_splitsetup]
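
And here's a quick sketch of pairing the Alpha Split output back up by its naming convention before a batch recombine; the directory layout is just an assumption for illustration:

from pathlib import Path

# Match every color frame to its "A_"-prefixed alpha frame.
frames = sorted(Path("render").glob("*.tga"))
pairs = [(f, f.with_name("A_" + f.name)) for f in frames
         if not f.name.startswith("A_")]

for color, alpha in pairs:
    print(color.name, "<-", alpha.name)   # e.g. shot_0001.tga <- A_shot_0001.tga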


Hope this is helpful. For us it was a great discovery, and kind of a "why didn't I think of that before?" moment. I also realize that stitching isn't the best solution, but sometimes it's necessary.

In our conversion of The Magic Tree House, there is a sequence of shots whose visuals are being redone. One part of that sequence is when we are on the surface of Mars following the Sojourner rover, but we ran into a hitch. There were two goals we wanted to achieve for this section: to have the rover exit the lander, and to end with an impression of the rover exploring the surface of Mars. Since the audio commentary was to remain unchanged, we were fairly constrained in how we could visually tell the story. To keep the number of shots to an absolute minimum so we could fit them into the already predetermined sequence length, we had to turn to some film techniques we weren't sure would translate to a dome.

Needing to show a passage of time so the following shot of the rover driving off into the Martian sunset would make sense, we lowered the sun over a series of dissolves while keeping the same camera dolly-in. The reason we felt it would translate well to the dome is that the continued forward motion gives us ongoing parallax against the rocks and boulders to show distance, and the growing length of the shadows, combined with the sky's change in hue and saturation, really helps create some immersion. Check out the video below:

[Video: shot04]


I was reading the fulldome Yahoo listserv today (the "My farm's bigger than yours" thread) and saw that a couple of people mentioned producing for 8K systems. Wow. Already? Hmmmmm. I'm wondering if we're jumping the gun a bit.

Now, for a minute, forget about the technical issues, like the fact that After Effects can't easily handle anything larger than 4K, and that we'd need a render farm 4x bigger than our current one to handle the processing. After all, we've got Moore's law working for us, and sooner rather than later the hardware and software will catch up.
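
For what it's worth, that 4x figure is just pixel math, assuming square 4096 and 8192 dome masters:

pixels_4k = 4096 * 4096          # 16,777,216 pixels per frame
pixels_8k = 8192 * 8192          # 67,108,864 pixels per frame
print(pixels_8k / pixels_4k)     # 4.0 - four times the pixels to render and store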

What I'm wondering is whether the average Joe Planetarium visitor will appreciate the difference. After all, 4K looks great, and I even think 2K looks pretty damn good on a large dome. And being part of the industry, I'm probably much more discriminating than 99% of the general public out there. I haven't yet seen any 8K demos or been to any of the installations that Sky-Skan has done in China, but I've been assured by Steve Savage over at Sky-Skan that it looks phenomenal and that even 4K content looks better on an 8K system (which I don't really understand). And yes, it's supposed to rival the image quality of large-format 70mm film. So OK, maybe it'll look fantastic and we'll sit back and marvel at our own magnificence.

However, think about this: in that same thread on the fulldome listserv, Paul Mowbray over at NSC Creative mentioned that their "Centrifuge" scene in Astronaut was "rendered at 2400×2400 and then scaled up to 3600×3600" and it still looked amazing on a large dome with a 4K system. In fact, it looked good enough that it picked up a Domie at the 2008 DomeFest.

He also said this: "… don't get caught up with pure resolution … 4k doesn't = high quality. If you have a big enough render farm/budget/time/patience then the higher res the better but at the moment very few domes can even show 4k so by the time they can you'll probably be making a new show so in the meantime focus on the content itself."

If we spent as much time worrying about storytelling and compelling content as we do about resolution, we’d have a lot more people excited about going to their nearest dome.

[Video: Centrifuge – ASTRONAUT – Fulldome, from NSC Creative on Vimeo]

I'm going to discuss some potential issues I've been mulling over about blending live action and CG on a dome. The following links discuss in further detail some of the terms I'll be using:
Chroma Keys (a.k.a. Green Screen)
Match Moving

Generating live-action footage for a dome has been an ongoing challenge for anyone producing content larger than 2K. The current resolution standards on most HD cameras only allow us to create the bottom half of a 4K fisheye master. This means, of course, that part, if not all, of the environment that live actors interact with will need to be computer generated. Also, when shooting live action, you're somewhat limited in how much motion you can incorporate into a shot.

The challenge of shooting a moving-camera shot is needing to match that motion in the digital 3D world. You'll need to be able to record the camera's position and orientation for each camera move, and replicate it so that your filmed and separated actors stay rooted to the scene. You could achieve this using a motion-control rig that the camera sits on: with every take you can program the camera's move so that human error is removed from the situation. The downside is that the cost of renting and operating such equipment can be excessive.
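
As a thought experiment, here's a minimal sketch of replaying a recorded camera move in CG. The CSV export format is entirely hypothetical; in practice you'd key the resulting transform onto a camera through your 3D package's own API.

import csv
import numpy as np

def euler_to_matrix(rx, ry, rz):
    """Rotation matrix from XYZ Euler angles given in degrees."""
    rx, ry, rz = np.radians([rx, ry, rz])
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

# One row per frame: frame,x,y,z,rx,ry,rz (a hypothetical rig export).
with open("camera_move.csv") as f:
    for row in csv.DictReader(f):
        m = np.eye(4)
        m[:3, :3] = euler_to_matrix(float(row["rx"]), float(row["ry"]), float(row["rz"]))
        m[:3, 3] = [float(row["x"]), float(row["y"]), float(row["z"])]
        # m is the camera's world transform on this frame.
        print(row["frame"], m[:3, 3])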

Another option is to try syncing the camera up using match-move software and tracking markers. Most of that software, though, has been developed to track XYZ positions in relation to a single plane of footage, and has yet to be calibrated for the unique distortion of a fisheye lens. A workaround would be to lock down the camera during filming and then move the actors' image in 3D, but that would be limited in its ability to recreate complex camera moves.

Hopefully, as fulldome video becomes more mainstream, camera companies will develop hardware that makes live action a more plausible solution for smaller studios. The benefits of using real actors and building on existing sets lead to a more believable experience for audiences. It also makes production a little simpler, because practical solutions can be used rather than leaning on everything being created in post.