Civil War Museum 360 Production

CreativeCOW presents 360 Degrees of Historical Immersion – Art of the Edit Editorial


Boston Productions, Inc.
Boston, Massachusetts, USA
©2014 CreativeCOW.net. All rights reserved.

IMAX, 35mm or digital movie theater, broadcast, streaming, iPad, iPhone. What do all these viewing options have in common? They all display the product on a screen or frame somewhere in front of the audience. All of our production and post-production decisions are based on that fact. But what if the screen is in front of the audience, to their left and right, and behind them? What if the screen completely wraps 360 degrees around the audience?

If the audience can look anywhere, how do we force them to see what we want them to see? Can an audience follow a narrative this way? How do you tell a story visually without a frame? There was a time when I did not know the answers to these questions. That time has passed.

I recently finished post-production on a 360-degree film for The Civil War Museum in Kenosha, WI. Produced by BPI and entitled “Seeing the Elephant” (a term Civil War soldiers used to describe the experience of battle), the 11-minute show was created to honor all the men from the Midwestern states who fought for the North during the Civil War.

The story follows three men and their experiences in the Union Army – the endless monotony of marching and training and waiting, punctuated by the horrors of battle. In “Seeing the Elephant,” the 360° theater is not simply a novelty; it is another tool to completely immerse the audience in the story and the world. Hopefully, they leave with at least a small idea of what it was like to be in the middle of a Civil War-era battle.

There are other 360° films around, but most of them are more abstract or environmental. They do not really have a linear narrative. We certainly wanted some of the immersive environmental qualities of the 360, but our main goal was to tell a story.

The script (written by John DeLancey) required well over one hundred Civil War reenactors – as well as horses, uniforms, guns, cannons, explosives, a main cast of twelve and a crew of forty-five. The shoot would be completely file-based, so I was brought on location as the Media Manager. Since I would be cutting the show, the thinking went, I might as well be the one gathering and organizing all the footage.

The lone location for the 5-day shoot was Old World Wisconsin (oldworldwisconsin.wisconsinhistory.org), a living-history museum in Eagle, WI, that completely re-creates the farmsteads and settlements of late-1800s America. This one site gave the show verisimilitude. It had the perfect houses, buildings and roads, plus a church (in fact, the oldest Catholic church in Wisconsin) and a large, open field for the climactic battle. The museum’s employees also appeared as extras in the film.

Old World Wisconsin completely re-creates the farmsteads and settlements of late-1800s America – perfect for the project. Since there is no “behind the camera,” the director and crew would have to crouch below the line of sight.

We had six cameras for the shoot. The main camera was Sony’s F55 and the B camera was Sony’s NEX-FS700. The F55 was chosen as the A camera due to its 4K capabilities; we would need all that resolution in Post. We also had a Canon 5D, a Canon 7D and two GoPros.

In addition, to take full advantage of the 360° screen, we rented a 360° camera rig from Paradise FX (www.paradisefx.com) in LA. An entire article could be written about shooting with this rig alone, but for now all you need to know is that the rig consists of nine Silicon Imaging 2K cameras and nine 13.7mm Tokina lenses set up on a platter the size of a large pizza.


The 360° camera rig from Paradise FX

Each lens points straight up into a mirror, like a periscope, which allows for completely seamless 360° shots. (I’ll explain more about how that works in a bit.) An umbilical cord connects the cameras to a cart that houses 9 small monitors and 9 Mac laptops – one for each camera.

Above: The 360° camera rig. Below: The cart houses 9 small monitors and 9 Mac laptops – one for each camera.

Post-wise, the biggest hurdle when shooting with the 360° rig was where to hide the crew and the cart. The camera is pointed in every direction, so there is no “behind the camera.” It was funny to see the camera crew and Director crouched down underneath the camera rig, as well as the various other crew members hiding behind trees or bushes. Some were better at hiding themselves than others. My rotoscoping and cloning skills got quite a workout painting out the tops of heads, knees and elbows, or sometimes entire bodies. Luckily, all the 360° shots were locked down.

By the end of the shoot I had about 5 terabytes of data spread over 11 hard drives (each camera on the 360° rig got its own hard drive). Now I just had to cut it together into a movie.

The post-production challenges began immediately. These first hurdles were creative and story-telling exigencies created by the technology. The seamless 360° effect would be achieved by using eight digital projectors to project eight individual 1920x1080 MPEG-2 files on the screen. (For the remainder of the article, I will refer to these projections as “screens,” but remember that the final result is a seamless projection with no hard edges or frame lines.)


Eight digital projectors would be used to project eight individual 1920x1080 MPEG-2 files on the 360-degree screen.
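
To make the geometry concrete: eight projections covering a full circle means each “screen” owns 45 degrees of wall. Here is a minimal Python sketch of that mapping; the screen numbering and the choice of 0 degrees as the front are my own assumptions for illustration, not the theater's actual configuration:

SCREENS = 8
DEG_PER_SCREEN = 360 / SCREENS  # each projection covers 45 degrees of wall

def screen_for_azimuth(deg):
    """Map a viewing direction to a screen index 0-7.

    0 degrees is taken as the center of the "front" screen (screen 0),
    increasing clockwise; the numbering is an assumption for illustration.
    """
    return int(((deg + DEG_PER_SCREEN / 2) % 360) // DEG_PER_SCREEN)

print(screen_for_azimuth(0))    # 0: dead ahead, the front screen
print(screen_for_azimuth(180))  # 4: the middle of the rear screens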

I did all the editing using an Avid DS system. (Rest in peace, DS.) I have been cutting on the DS for several years now, and its ability to do so many different tasks, both offline and online, without leaving the box is unequaled. But the DS couldn’t natively play the F55 HD material. I had to use Media Composer as a middleman to get that footage into the DS. Media Composer was able to transcode Sony’s XAVC files to DNxHD. The footage from all the other cameras came in perfectly.
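
(As an aside for anyone re-creating this step today: a comparable XAVC-to-DNx transcode can be scripted with ffmpeg. This is a hypothetical sketch, not the Media Composer workflow the show actually used; the file names and profile choice are placeholders.)

import subprocess

def xavc_to_dnx(src, dst):
    """Transcode a Sony XAVC .mxf to an Avid-friendly DNxHR HQ .mov.

    A stand-in for the Media Composer middleman step described above;
    the paths and profile are assumptions.
    """
    subprocess.run([
        "ffmpeg", "-i", src,
        "-c:v", "dnxhd", "-profile:v", "dnxhr_hq",  # DNx family encoder
        "-pix_fmt", "yuv422p",
        "-c:a", "pcm_s16le",                        # Avid-safe PCM audio
        dst,
    ], check=True)

xavc_to_dnx("A001C001.mxf", "A001C001_dnx.mov")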

Setting up multi-screen shows is a breeze on the DS. To represent the eight screens, I created an 8-layer composite with five screens arranged in a semi-circle on the upper part of the frame and three more arranged horizontally in a row on the lower half. This way I could see how all the imagery worked together while I cut.
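
For readers without a DS, the same monitoring layout is easy to mock up in code. A minimal sketch with Pillow, assuming eight rendered frames; the five-over-three arrangement mirrors the composite described above, and the scale factor is arbitrary:

from PIL import Image

def build_preview(frames, scale=0.2):
    """Tile eight 1920x1080 frames into one monitoring mosaic:
    five across the top (the front arc) and three below (the rear).

    frames: a list of eight PIL images. Layout and scale are
    assumptions mimicking the 8-layer DS composite described above.
    """
    w, h = int(1920 * scale), int(1080 * scale)
    thumbs = [f.resize((w, h)) for f in frames]
    canvas = Image.new("RGB", (w * 5, h * 2))
    for i, t in enumerate(thumbs[:5]):       # screens for the front arc
        canvas.paste(t, (i * w, 0))
    for i, t in enumerate(thumbs[5:]):       # rear screens, centered below
        canvas.paste(t, ((i + 1) * w, h))
    return canvas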

Inside the 360° theater there really is no “front” or “back.” The audience can look wherever they want. The Director, Bob Noll, did not want any black areas on the screen. I had to make sure there was always something to see on every part of the screen.

However, we are trying to tell a story, so there had to be a main focus, a part of the screen that is a little more important than the others. How did we achieve this? By taking advantage of one of the oldest techniques in cinema – the opening crawl. (If it’s good enough for George Lucas, it’s good enough for me.) We were also lucky enough to have the great Bill Kurtis provide all the voice-over narration.

The show begins with some exposition about the timeline and an explanation of the unusual title. Reading the crawl compels the audience to look at one particular area of the screen. This became the “front screen.” Having established this forward position, we expanded the “front” to include the two screens on either side as a continuation of that central, forward area.

We figured that the peripheral vision of the audience would allow them to take in the visual information on these five screens without too much trouble. That left three screens behind their heads that would require them to turn around completely to see. This became the “rear” of the theater.


Reading the crawl compels the audience to look at one particular area of the screen.

So, we decided that for the majority of the show all the important story-telling and character bits would occur on the five “front” screens, but occasionally we would force the audience to turn to watch the “rear” screens. We wanted the audience to be an active participant in the show, but we didn’t want to give them whiplash.

We also quickly learned that audio would be extremely important, even more so than usual, to guide the audience to look at the parts of the screen we wanted them to see. The theater has a total of 11 speakers – one above each projection and three pendants hanging from the ceiling – as well as a subwoofer and a “butt-kicker” underneath the floor to shake the audience whenever the cannons go boom.

When each of the main characters appears for the first time, their dialogue comes directly from the speaker mounted on the same screen that holds their image. This forces the audience to look to that area so they learn who each character is and what they sound like. Once that relationship is established, the audience will always know who is speaking even if they don’t see the character on screen.
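
The underlying logic is nothing more than a lookup table. A toy sketch, with the character-to-screen assignments invented purely for illustration:

# Speakers 0-7 sit above screens 0-7; 8-10 are the ceiling pendants.
# The assignments below are invented; the real show establishes them
# when each character first appears on a particular screen.
CHARACTER_SCREEN = {
    "captain": 0,       # front center
    "immigrant": 7,     # front left
    "abolitionist": 1,  # front right
}

def dialogue_speaker(character):
    """Route a line of dialogue to the speaker above the screen
    associated with that character."""
    return CHARACTER_SCREEN[character]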

These rules that we came up with were like nothing I had dealt with in the edit suite before. While creating the story in the front, I also had to make sure there was always relevant and interesting imagery in the rear in case someone decided to look back there. The 360° set-up completely changed the way I dealt with rhythm and montage and pace, with the length of shots and the selection of shots. So many things that have become instinctual over the past 18 years of editing were new and different. It was exciting and scary at the same time.

And, as if I didn’t have enough flies in the ointment on this show, I also decided to cut the show without a temp music track. I knew we were going to be hiring a composer to create an original score, so I wanted to give her the freedom to create music that hit all the right emotional beats without being tied to a pace or tempo created by another piece or pieces of music. This was something I had only done once before, and that was for a simple one-screen documentary. Award-winning composer Ruth Mendelson (www.reverbnation.com/ruthmendelson) was extremely happy to have a completely clean musical slate to work from, and she created an incredible score for the movie.

Getting back to the visual – how did the seamless 360° shots work? From conception, Director Bob Noll wanted the 360° shots to exist for more than the simple “Wow!” factor. He designed them to appear at very specific points in the narrative to help tell the story, immerse the audience in the world of the story and give them something they haven’t seen before.

The first shot we see after the crawl is a sunrise that surrounds the audience. This was not a true seamless 360° shot. I created it in Photoshop by stitching together a series of stills that Bob shot one early morning in the hotel parking lot. I also added a flock of birds created in After Effects just to have some movement in the shot. This sunrise remains for the first few minutes as we see sequences of young men as they proudly sign up to fight the Southern Rebellion. We watch as they leave their families and friends.

The 4K resolution of the F55 material allowed me to stretch those shots across three screens without losing quality. All this imagery fades up and down, layered on top of and blended into the sunrise. Yes, we see multiple images on the screens, but no hard edges ever. It always had to seem, well, seamless.

Using this same technique, we introduce our three main characters: a Captain in the Union Army, a veteran soldier, one of the few who has seen battle before and knows the costs; a young Irish immigrant eager to have an adventure; and an Abolitionist who fights for a cause. We first see the Abolitionist in church, blended into the sunrise like all the other shots, but then the church interior slowly unwraps across the entire theater to reveal the complete congregation. This is the first full, seamless 360° shot, and it puts the audience right in the middle of the church. Hopefully, at this point, the audience realizes this show is going to be different.

But, how do the seamless 360° shots work?

Basically, each lens captured an image area that included overlap from the lenses on either side. Each camera was fed directly into a separate Mac laptop and all the files were transferred to hard drives at the end of the day. For all the 360 shots every individual “take” consisted of 9 separate files.

Once I had decided on the takes I wanted in the show, I had to stitch all the files for that shot together in After Effects. The original 2K files from the 360° rig were encoded with the Cineform codec, but AE doesn’t really play nice with too many Cineform files at the same time so I had to export .png sequences for every file and bring those back into AE for the stitch. It took a unique combination of extra large compositions, distort and offset filters, scaling and nudging as well as many masks to get the pieces to fit together.
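
The real stitch lived in After Effects, built from distort and offset filters and hand-drawn masks, but the principle behind hiding the seams is a cross-fade across each overlap zone. Here is a toy NumPy sketch of that blend, assuming nine equally sized tiles that each overlap their right-hand neighbor by a fixed number of pixels:

import numpy as np

def stitch(tiles, overlap):
    """Cross-fade a row of overlapping tiles into one panorama.

    tiles: list of HxWx3 float arrays, each overlapping its right
    neighbor by `overlap` pixels. A toy model: the real stitch also
    corrected lens distortion and wrapped the last tile onto the first.
    """
    h, w, _ = tiles[0].shape
    step = w - overlap
    out = np.zeros((h, step * len(tiles) + overlap, 3))
    weight = np.zeros(out.shape[1])
    ramp = np.ones(w)
    ramp[:overlap] = np.linspace(0.0, 1.0, overlap)   # fade in on the left
    ramp[-overlap:] = np.linspace(1.0, 0.0, overlap)  # fade out on the right
    for i, tile in enumerate(tiles):
        x = i * step
        out[:, x:x + w] += tile * ramp[None, :, None]
        weight[x:x + w] += ramp
    # Normalizing by the accumulated weight restores full brightness at
    # the outer edges, where only one tile contributes.
    return out / np.maximum(weight, 1e-6)[None, :, None]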


Each file had to be individually color-graded to get the entire shot to match.

There was no all-purpose solution, so every shot had to be tackled on its own. And, of course, each file had to be individually color-graded to get the entire shot to match. Once that was done, I then took my final large comp and broke it up into eight 1920x1080 HD comps to be rendered out and cut into the show.
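
The splitting itself happened in After Effects, but the operation is simple slicing arithmetic. A sketch, assuming the stitched panorama has already been conformed to eight screen-widths:

import numpy as np

def split_into_screens(panorama, n=8, screen_w=1920, screen_h=1080):
    """Cut a stitched panorama into n side-by-side HD frames.

    Assumes the panorama has already been scaled to screen_h pixels
    tall and n * screen_w pixels wide, as in the show's final comps.
    """
    h, w = panorama.shape[:2]
    assert (h, w) == (screen_h, n * screen_w), "conform the comp first"
    return [panorama[:, i * screen_w:(i + 1) * screen_w] for i in range(n)]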

The process was like putting several puzzles together, then breaking them up to be put back together in a different way somewhere else. I was worried at first that the nine 2K files wouldn’t play nicely in a show destined for eight screens, but with the overlapping and distorting, it worked out extremely well. We ended up having seven seamless 360° shots in the show, plus two “faux-360” shots that were created by taking several static shots from the F55 4K camera and stitching them together in a similar way. As long as there was nothing crossing in front of the camera, the illusion was complete.

Yet another issue sui generis to this project was the review of rough cuts. Our theories about peripheral vision, front and rear, etc., all made sense, but they were still theories. We had to see if they actually worked in the real world. There was only one way to do this – we had to watch the show in a 360° theater. Since those are rare, we decided to build a half-scale prototype in our studio.

We (and by “we” I mean Pete Does, a Senior A/V Installation Technician at BPI) built a structure to support eight Digital Projection Inc. HIGHlite Cine660 projectors, each one weighing 85 pounds. This contraption was hung from the actual building supports and allowed us to project the show on a ring of bedsheet-screens.

Pete Does, a Senior A/V Installation Technician at BPI, built the structure to support the eight eighty-five-pound HIGHlite Cine660 projectors.

Dataton’s Watchout multi-display software (www.dataton.com/watchout) was the choice to properly sync and drive the eight projectors. Once again, an entire article could be written about Watchout, but we were using the software for two of its more impressive capabilities: projection edge-blending and geometry correction to account for projecting onto a curved surface. Both of these can be adjusted in real time, on the fly, in Watchout.
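
To give a flavor of what edge-blending involves (a generic sketch, not Watchout’s actual algorithm): each projector’s image is attenuated across the overlap zone so that two overlapping projections sum to an even brightness, with a gamma term so the cross-fade is linear in light rather than in pixel values. The overlap width and gamma below are assumptions:

import numpy as np

def blend_ramp(width, overlap, gamma=2.2):
    """Per-column brightness multipliers for one projector.

    A linear-light cross-fade over the overlap zones, re-encoded for
    the projector's gamma; a generic sketch, not Watchout's math.
    """
    ramp = np.ones(width)
    fade = np.linspace(0.0, 1.0, overlap)
    ramp[:overlap] = fade           # fade in where the left neighbor overlaps
    ramp[-overlap:] = fade[::-1]    # fade out into the right neighbor
    return ramp ** (1.0 / gamma)    # compensate for display gamma

# Two neighbors' ramps sum to 1.0 in linear light across the seam:
left, right = blend_ramp(1920, 200), blend_ramp(1920, 200)
seam = left[-200:] ** 2.2 + right[:200] ** 2.2
assert np.allclose(seam, 1.0)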

After each major revision (and there were a lot), we would screen the show on the prototype to see if our ideas were working. We brought in staff and friends to see if the story made sense and was easy to follow. Creating and using the prototype was an absolutely essential part of the Post process for this show.

In fact, we went a few steps further. I mentioned the immense importance of the audio mix for the show. Since we had the prototype set up, we also hung all the speakers in their correct positions. The BPI Pro Tools system is designed to be mobile, so Audio-Mixer Extraordinaire Mike Rafferty simply wheeled his cart into the studio and was able to mix the show exactly the way it was to be heard in the theater. And then (whew!) we also set up the final aspects of the show.

The theater includes a few different lighting cues to enhance the experience, so we put all the lighting equipment up so the lights could be programmed. And the final feature of the theater that really adds something special to the experience is a large air cannon. Engineered and built by 5Wits Productions (www.5witsproductions.com/) the air cannon is inconspicuously mounted in the wall of the theater. It blasts the audience with large puffs of air to match cannon shots, explosions and the like. We mounted that in our studio, too.

On February 27, 2014, the museum held a premiere celebration for “Seeing the Elephant.” I was invited to attend. Personally, I might have preferred a Wisconsin-based premiere in, say, August, but nevertheless it was great to see the final show in situ. Bill Kurtis was there as well, so it was fun to get the reaction of someone like him, a guy who has made well over 500 documentaries himself and voiced 500 more. He loved it. The Museum says that they’ve had groups as diverse as fourth-grade field trips and Korean War veterans – and everything in between. The feedback has been universally positive.

All in all, a ton of work. But worth it, I think. “Seeing the Elephant” was a chance to do something very different on both the creative side and the technical side. And those don’t come around that often.