New Panel at DCS – The Case Study

This year, SMPTE’s DCS 2011 Conference, held at NAB, offered something a little different than it has in the past.  Among panels that reviewed the latest advancements in color grading and bit depth, and the effective use of higher frame rates for theatrical and television releases, was the addition of case studies.


Wrapped within layers of technical issues involved in stereoscopic production for cinema and television was the case study of “Yogi Bear 3D.”  Directed by Eric Brevig, whose previous work on “Journey to the Center of the Earth 3D” helped redefine complicated 3D rig setups, allowing more flexible camera movements through complicated set layouts, “Yogi Bear 3D” was the first stereoscopically shot 3D film to blend animation and live action.

Joined by fellow “Yogi” collaborators Betsy Paterson, Visual Effects Supervisor at Rhythm and Hues, and John Nicolard, Head of Digital Production at Fotokem, Brevig discussed the project from initial conception to finished piece, sharing insights into what worked well and what caused problems along the way.


After expressing his belief that comedy, especially slapstick-style comedy, is much more fun to watch in 3D, Brevig discussed the issues that arose from his choice to shoot the film entirely on location.


“We shot on location in the woods of New Zealand,” said Brevig.  “With our shooting schedule in December, we chose this location because we had to avoid snow.  What we weren’t told by the film commission was that it would be raining almost every day.  A lot of work had to occur in post to place in a blue sky.”


With two equipment trucks outfitted with the engineering and processing equipment necessary for converting shots on set, the team wasn’t technically restricted from shooting.  However, because nature doesn’t offer direct paths for the 3D rigging structure, Brevig and his team had to find creative ways to operate the 3D camera.  One way this was accomplished was by rigging cables between two helicopters.  The camera would then swoop in between trees and could be lifted or lowered into the desired placement.

Although these types of techniques require a great amount of choreography, Brevig stressed that shooting 3D is a practice built on choreography.

“You don’t want the audience to strain their eyes trying to figure out where they should be looking, you want to guide their eyes, and help them understand where to look,” said Brevig.  “The camera (set between the two helicopters) naturally guided the eyes to the exact point of reference.  Find the quiet passages before the impactful ones.  Choreographing your camera movements helps the audience continue to enjoy the 3D experience.”


Paterson employed digital illustrations during the creation of the animated characters in “Yogi.”  Preparation for animation began on set, where Kiwi actors were employed as stand-ins so the human actors would have the correct eye lines. Because of the complexity of adding the CG characters into the 3D field, it was important to get these initial details correct to maintain believability.

“Proper placement of the bears for 3D was the most complex,” said Paterson.  “We start with flat drawings before we add the full animation.  These flat drawings have to be placed at the proper depth or else you’re not going to get the correct effect, even though they are just placeholders.”
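The depth placement Paterson describes can be illustrated with basic stereo geometry. The sketch below is illustrative only (the function name and all numbers are assumptions, not figures from the panel): for a shifted-parallel rig, a placeholder at the convergence distance has zero parallax and sits on the screen plane, nearer objects have negative parallax (in front of the screen), and farther objects have positive parallax (behind it).

```python
def parallax_mm(interaxial_mm, focal_mm, convergence_m, depth_m):
    """Sensor parallax (mm) for a point at depth_m with a shifted-parallel
    stereo rig converged at convergence_m.
    Zero = screen plane; negative = in front of screen; positive = behind.
    (Hypothetical helper for illustration only.)"""
    return interaxial_mm * focal_mm * (
        1.0 / (convergence_m * 1000.0) - 1.0 / (depth_m * 1000.0)
    )

# Example: 65 mm interaxial, 35 mm lens, converged at 5 m.
on_screen = parallax_mm(65, 35, 5, 5)     # placeholder at screen plane -> 0
behind    = parallax_mm(65, 35, 5, 10)    # farther than convergence -> positive
in_front  = parallax_mm(65, 35, 5, 2.5)   # nearer than convergence -> negative
```

This is why a flat drawing parked at the wrong depth breaks the effect: its parallax no longer matches where the finished character will sit in the scene.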

The pipeline for gathering information for stereoscopic animation is the same as for regular animation; however, the data gathering is much more extensive.  All the camera positions have to be recorded, and all the conditions on set have to be measured, so that the lighting sphere built for the 3D animation models matches the conditions found within the live-action shot, including the details of the ground on which the characters are standing.

“You must get every bump on the ground, all the camera distortions,” said Paterson.  “In many visual effects films there is a lot of cheating going on.  In stereoscopic, you cannot cheat.  Everything must be exact.”

In the tracking stage, Paterson’s team treated the second eye as an entirely different offset while keeping in mind that the end goal was to create a singular world, not two worlds put together.   To cut down on rendering time during tests, Paterson’s team developed tools that provided animatronic checks to ensure highly detailed areas, such as around the eyes, would have the correct 3D conversion.  They also used what they called “2½ D”: rough models of the characters’ mouths and bodies in 3D space.  They created full 3D for the elements closest to the action, and matte paintings for the deep background, to ensure the details were being handled correctly.

Nicolard found his biggest challenge to be wrangling the massive amounts of data within a very limited timeline.


“This was my first time working on this type of movie,” said Nicolard.  “This movie contained a lot of data – you’re working with 48 frames instead of 24, so I brought many terabytes extra.  A critical component was to preview the movie, which I did by taking the cut from Avid to HD tape, then bringing that into Pablo.  At that point I could focus on 3D sweetening – taming the 3D to make it a comfortable experience to watch.”
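Nicolard’s point about data volume is easy to quantify with a back-of-envelope estimate. The resolution, bit depth, and frame rate below are assumptions for illustration, not figures from the panel; the takeaway is simply that shooting two eyes doubles the frame count, and uncompressed storage scales linearly with it.

```python
def gb_per_minute(width=2048, height=1080, bytes_per_px=4, fps=24, eyes=2):
    """Uncompressed data rate in gigabytes per minute of footage.
    Assumed defaults: 2K frames stored at 4 bytes/pixel, 24 fps per eye.
    (Hypothetical helper for illustration only.)"""
    frame_bytes = width * height * bytes_per_px
    return frame_bytes * fps * eyes * 60 / 1e9

stereo = gb_per_minute(eyes=2)  # roughly 25 GB per minute under these assumptions
mono   = gb_per_minute(eyes=1)  # exactly half the stereo figure
```

Over a feature-length cut, that doubling is the difference Nicolard describes as “many terabytes extra.”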


Brevig concluded the panel with the following advice:

“Learn how to get what you want on the set instead of relying on others to provide it.  Many stereographers themselves don’t know how to achieve the effect.  It’s a new field; there just aren’t enough people who have the experience.”

For more information about DCS 2011 and SMPTE, please visit: