As the in-house graphics department of SVT (the Swedish equivalent of the BBC), we do everything from high-end visual effects for drama and trailers to motion graphics branding and daily news graphics. Our range of talents can do anything from 3D modeling to building video play-out software.
In early 2011 we were asked whether we were equipped to do all the visual effects and on-screen graphics for an upcoming 10-hour drama called Äkta människor (Real Humans.) The show is something as unusual as a Swedish sci-fi series, set in a parallel universe where the humanoid robot evolution has had the same exponential progress as computers have had in our society. Like all good sci-fi, it focuses on what happens to us if we change one major factor, in this case: what happens when every middle-class family can suddenly afford a robotic servant that takes care of the chores, and that is human-like enough to get involved with, both mentally and physically…
Although the effects planned at that stage weren’t tremendously complicated, it looked to be a big job by our broadcast standards. The visual effects budget was really tight: initially only 60 man-hours in total per one-hour episode.
When shooting started in April 2011, we had done quite a number of tests to nail down a look for the recurring effect of the sub-skin lights underneath the eyes that indicate a “deep charging” robot (called a “hubot” in the series.) Other than that, there was a very vague list of shots that might need effects, but not much more.
As the visual effects supervisor, I knew that planning ahead was essential if we were to keep to the tight budget, and at the outset I insisted on being given time on set for the shooting of some of the trickier effects scenes. Initially, the plan was to lock the edit for each episode before we started work on any of the effects shots, minimizing the risk of working on effects that were later cut or changed. However, as the eight-month shooting schedule progressed, delays in shooting and editing started to creep into our allotted time, so that for the last half of the episodes we had to start work before the edit was locked.
Since most of the show was shot on a Steadicam for quick setup and turn-around, the first stage of our VFX pipeline was almost always a planar track in “mocha AE” that was then brought into Adobe After Effects CS5.5 via the excellent script “mochaImport,” which lets the artist create a stabilized and undistorted precomp of the areas that need treatment. This was essential, especially for the sub-skin lights that needed to be tracked to faces that move around and turn. Adding effects to this undistorted shot, and then having the distortion applied back to the composite downstream, made for quick turn-around on shots that would have been impossible just a few years ago. I was so confident in our ability to track everything that when the DoP repeatedly asked if I wanted the camera locked down for the trickier shots, I always replied, “I actually prefer if you move the camera; it helps sell the effect!”
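Conceptually, a planar track yields a homography (a 3 × 3 perspective transform) per frame; stabilizing means warping the frame by the inverse matrix so you can work on a static plate, and re-distorting means warping the result back by the original matrix. A minimal numpy sketch of that round trip, with a made-up example matrix (this illustrates the underlying math only, not mochaImport’s actual code):

```python
import numpy as np

def warp(H, pts):
    """Apply a 3x3 homography to an array of (x, y) points."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
    out = pts_h @ H.T
    return out[:, :2] / out[:, 2:3]

# Hypothetical per-frame track: the tracked plane has rotated and drifted.
H_frame = np.array([[0.98, -0.05, 12.0],
                    [0.04,  0.97, -8.0],
                    [0.0,   0.0,   1.0]])

face_pts = np.array([[100.0, 200.0], [150.0, 220.0]])

# "Stabilize": warp back to the reference pose, paint the sub-skin lights
# on the stable plate, then "re-distort" with the frame's homography.
stabilized = warp(np.linalg.inv(H_frame), face_pts)
restored = warp(H_frame, stabilized)

print(np.allclose(restored, face_pts))  # the round trip is lossless
```

The key property is that the stabilize/re-distort pair cancels out exactly, so anything painted on the stable plate lands back on the moving face with no sliding.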
As the shooting progressed through the summer of 2011, the script kept evolving and the director’s initial trepidation about doing the effects in-house evaporated. This meant that the list of scenes that could and would need VFX grew quickly. At the same time we were producing a lot of logos, symbols, posters and around 15 touch interfaces for disguised iPads used to change settings and firmware in the robots. The interfaces were built for interaction by the actors, so the result was shot in camera to avoid lengthy screen replacements, as were the roughly 30 screen animations designed for display on computers and TVs. In the end we did only eight screen replacements, for shots where the video to be shown on the screens didn’t yet exist at the shoot.
One of the biggest effects sequences was the explosion of a store front. The location featured a big parking space with a nicely lit exterior, but no natural opening for an entrance other than a giant garage opening with doors that could not be removed. Instead of building a real neon sign and glass doors that could be blown up (at a cost of around $25,000), we offered to do a digital replacement. With TV production, the schedule is so tight that we couldn’t pre-plan more than really rough camera angles. We devised a single Steadicam shot showing a stunt man walking out of the store and, in sync with a special effects arsenal of propane burners and burning-debris cannons, being pulled by a wire into the side of a car. The car was rigged with squibbed windows and a hydraulic jack that made it jump at the moment of the explosion. The camera would then follow behind the stunt man as he turned around and looked at the devastated and burning store front, at that point represented by a semi-lit green screen behind lots of smoke.
The shoot went fine, but we quickly realized that this shot required more advanced tools than the combo of “mocha AE” and After Effects could offer. The large amount of smoke and the parallax from the moving camera made manual wire removal painstakingly complex and slow. After a few days we gave up and bought “mocha Pro” to get access to the “Remove” module, which made it possible to finish the wire removal the same day we got the licenses. When it came to isolating the stunt man from the smoky green screen, “mocha Pro’s” tracking-assisted rotoscoping capabilities and variable-edge feathering also helped save the day.
Everything was shot on Arri Alexa at 1080p25 to the Apple ProRes 4444 codec as LogC. We processed everything in a 32 bpc workflow with Rec709 LUTs used only for viewing, and output offline copies for the editors as DNxHD 36 or 120 with baked-in Rec709 for import into their Avid Media Composer 5 suites. The final delivery was done as 1,920 x 1,080 16 bpc LogC TIFF sequences that were conformed in a Nucoda grading suite.
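For reference, the first step behind those Rec709 viewing LUTs is ARRI’s published LogC-to-linear conversion. A sketch using the EI 800 constants from ARRI’s LogC white paper (these are ARRI’s published values, not anything specific to this production):

```python
import math

# ARRI Alexa LogC (V3, EI 800) curve constants, per ARRI's LogC white paper.
cut, a, b = 0.010591, 5.555556, 0.052272
c, d, e, f = 0.247190, 0.385537, 5.367655, 0.092809

def logc_to_linear(t: float) -> float:
    """Map a LogC code value (0..1) to scene-linear reflectance."""
    if t > e * cut + f:
        return (10 ** ((t - d) / c) - b) / a
    return (t - f) / e

def linear_to_logc(x: float) -> float:
    """Inverse mapping: scene-linear reflectance to LogC code value."""
    if x > cut:
        return c * math.log10(a * x + b) + d
    return e * x + f

# 18% grey encodes to roughly 0.391 in LogC at EI 800.
print(round(linear_to_logc(0.18), 3))  # 0.391
```

Working in 32 bpc means these log values can be converted to linear, composited, and converted back without clipping or quantization, which is why the Rec709 transform could stay a view-only LUT until final delivery.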
Almost all of the 290 VFX shots in season 1 were comped entirely in After Effects, with a few exceptions that were done by a freelancer in The Foundry’s “NukeX.” For a broadcast shop, After Effects is the natural choice and our go-to software, but we have now bought a Nuke and a NukeX license that we will start using more on similar projects in the future.
The first season (9.5 hours in total) contains 26 minutes and 24 seconds of visual effects. All in all, we spent an average of 55 minutes per delivered and aired second of visual effects, which I believe is pretty efficient even by broadcast standards. Add to that around 100 hours of on-set supervision, and a few hundred hours for the other assets we created.
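The totals implied by those figures are easy to check; the numbers below are derived purely from the ones stated above:

```python
# 26 min 24 s of aired VFX across the season
vfx_seconds = 26 * 60 + 24

# average compositing effort: 55 minutes per aired second
total_hours = vfx_seconds * 55 / 60

print(vfx_seconds)         # 1584 seconds of effects
print(round(total_hours))  # 1452 hours of compositing for the season
```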
It’s pretty amazing to be able to do high-end, seamless and photo-real HD effects on hardware that costs less than $2,000 per machine, and with software for even less than that. Being an in-house shop, I know that we need to be extremely efficient, but I know of similar TV shows that have paid more for one tenth the amount of similar effects done “on the outside.” And ten years ago, the budget for this would have had to be ten times that!