Virtual Reality Lighting with Andrew Shulkind

A Brave New World: Feature-film, commercial, and VR cinematographer Andrew Shulkind on the unique demands of lighting for 360 degrees of coverage.

Known for an unstoppable interest in staying abreast of the most applicable trends in filmmaking and technology, Andrew Shulkind has heard his work behind the camera described as a “painterly use of lighting.” Fittingly, one of his best-known commercial projects put him at the helm of Google’s 2016 Tilt Brush commercial, a visual feast designed to showcase the burgeoning possibilities of the 3D and virtual reality mediums, which has racked up nearly 2 million views on YouTube and elsewhere. A virtual reality project for Nike, meanwhile, required Shulkind to design his own 360-degree camera array system: the uncompressed, 32K-resolution, 12-bit Headcase Cinema Camera, now available for rent in Los Angeles through Radiant Images.

The skilled and economical cinematographer has done plenty of standout commercial work in the 2D space as well, including an installment of the famous Old Spice series with Terry Crews for directors and comedians Tim & Eric. His client roster includes Super Bowl advertising for McDonald’s and Budweiser, plus plentiful VR work for clients like Samsung, the NBA, and FX Networks, which asked his team to put together a VR experience for Comic-Con 2015 centered on its series The Strain, from producer and writer Guillermo del Toro. After graduating from New York University’s Tisch School of the Arts, he started in cinematography as a tech in the film days, working as a lab technician and liaison for the Kodak/Panavision PreView System with some of the biggest names in cinematography.


He credits his time at Panavision as the ultimate training ground; it often fell on his shoulders to maintain color consistency and lighting levels for a number of ultra-big-budget, new-millennium blockbusters, working with Darius Khondji, ASC, on David Fincher’s Panic Room; Janusz Kamiński on Spielberg’s A.I.; and Don Burgess, ASC, on director Jonathan Mostow’s Terminator 3: Rise of the Machines. These cinematographers and others were very encouraging, he says, giving him camera operator duties on several major films while he made his own way as lead DoP on lower-budget music videos and commercials. Now an accomplished cinematographer in his own right, he has had narrative films and documentaries screened at renowned festivals like Cannes, Sundance, Camerimage, and Tribeca.

With work profiled in American Cinematographer, British Cinematographer, The Hollywood Reporter, and Variety, he has also become well known in the industry for his sensible integration of new technology. He frequently presents at trade shows and industry summits: last year he spoke at NAB in Las Vegas about the ACES color space for Canon with cinematographer Curtis Clark, ASC, and this year at NAB he led a panel on next-generation imaging with Adobe and Microsoft, among many others. He has an informed sense of the future of content origination, and has lately been consulting on content, strategy, process, and workflows with studios, brands, manufacturers, and the US military. He recently wrote an article on virtual reality for the International Cinematographers Guild, which awarded him an Emerging Cinematographer Award in 2013. Shulkind also won a Studio Daily Prime Award in 2014 for his outstanding work as a cinematographer in media and production, in addition to a Studio Daily Top 50 Award for Creativity and Innovation in 2016.

“I came up in a funny place in time: right as the industry was transitioning from film to digital, so I had a mechanical appreciation for quality and how we got here,” he explains of his background, “but an early fluency with these new tools to find ways to make the old system more efficient. I was fortunate enough to work closely with digital imaging experts at Kodak, Cinesite, and Superdailies to establish color-accurate digital workflows for studio features and commercials, and then shot my first two movies on film and finished them photochemically.” His fluency with visual-effects-heavy workflows was hard-earned and well established; working with everything from motion control, miniatures, high-resolution digital sensors, and stereo 3D rigs to game-engine previsualization and HDR has led to close, communicative collaborations with visual effects supervisors. Shulkind notes, “Nowadays, almost every commercial and movie that we do has some visual-effects aspect, so I guess it was a good foundation. And we speak the same language.”

Since then, he has teamed with new virtual reality companies like Facebook, Oculus, and Headcase VR to create everything from high-end live-action virtual reality and interactive experiences down to branded cardboard VR goggles. He considers the lighting constraints of VR the hardest part of the art form, as his goal in lighting design is always to direct the viewer’s eye in a meaningful way. That becomes an added challenge when the viewer can see in all directions, and he is always finding new ways to push the medium so that everything is not simply top-lit, as is common in VR films.

“You have to continually remind people that in VR you see everything,” he explains. “The camera always needs a mounting point, so we will put it in the most invisible place possible, depending on the application—above, below, or at a rear 45-degree angle, meaning you can look up and down but not back through your body. Virtual reality is all about virtually transporting the viewer to convince them that they are somewhere that they are not. A panorama is central to that illusion. The viewer should have the freedom to look in any direction. We did some early tests shooting only 180 degrees so that I could cut down on the number of cameras required and also hide my Steadicam operator, but it wasn’t a full enough experience. I don’t think that every project has to capture 360 [degrees] by 360 [degrees], though that does offer the most complete immersive experience, and you do need to fill your field of view and periphery to complete the effect.”

“In one of our tests, we explored how to cover a dialogue scene in 360-degree VR between two people. We found that an over-the-shoulder shot feels creepily voyeuristic, and a wide shot feels alarmingly distant. By far, the most natural way to shoot was to put the camera right in between the characters so that the viewer could choose who to watch. It was an unexpected discovery. So in terms of building camera rigs, I’ve been focused on having the capability to shoot 360 degrees by 360 degrees, and we pull cameras when they are not needed. And when mounted on a drone, or in some other rigging circumstances where the perspective is only a plain blue sky or green grass, we just paint in that hole digitally.”

He often works with the Chimera brand, saying he has used the Lightbank softboxes “forever,” and loves the ease of Chimera speed rings for fast lighting setups and swaps. He is also excited about the recent Chimera and Zylight partnership, which has resulted in a new Active Diffusion technology: a variable-opacity soft-light screen whose level of diffusion can be dialed in remotely over DMX, with up to 512 addresses available per universe. Shulkind gushes that one of his mainstay lighting instruments is the Zylight iS3/c, an RGB panel that provides a vast spectrum of thousands of tunable colors with wireless connectivity and synchronization. But he says that his very favorite light is the Chimera Birdcage, a mountable lantern light with a compact, circular profile that can carry up to 500 watts.
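For readers curious about what “remote control over DMX” means in practice: fixtures like dimmers or the Active Diffusion screens listen on channels of a 512-slot DMX512 universe, each carrying an 8-bit level. The sketch below builds one standard DMX512 data frame; the channel assignments in the example are illustrative assumptions, not details from the article.

```python
# Minimal sketch of a DMX512 data frame: a null start code followed by
# 512 channel slots, each an 8-bit level (0-255). Which channel maps to
# which fixture is set on the fixture itself; the mapping here is made up.

NULL_START_CODE = 0x00  # start code for standard level/dimmer data


def build_dmx_frame(levels: dict) -> bytes:
    """Build a 513-byte DMX512 frame from {channel: value} pairs.

    Channels are 1-based (1-512); unlisted channels default to 0.
    """
    slots = bytearray(512)  # all channels start at zero (off / clear)
    for channel, value in levels.items():
        if not 1 <= channel <= 512:
            raise ValueError(f"DMX channel out of range: {channel}")
        slots[channel - 1] = max(0, min(255, value))  # clamp to 8 bits
    return bytes([NULL_START_CODE]) + bytes(slots)


# Hypothetical example: a diffusion screen on channel 1 at full density,
# a dimmer on channel 2 at roughly half.
frame = build_dmx_frame({1: 255, 2: 128})
```

In a real rig this frame would be sent continuously over an RS-485 line or an Ethernet protocol such as Art-Net; the point here is only the shape of the data a board or iPad app is actually transmitting.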


“I own two Birdcages, and I don’t own very much equipment, because I like to use too many different things,” he laughs, “but I use the Birdcage all the time! On a recent shoot in Romania, I had four of them with me. It’s the kind of light that I can hide easily, using reflectors or blacks on the Velcro to manipulate it. It’s another tool that allows me to work quickly without having to spend a bunch of extra time adjusting. I can rotate it ten degrees and I’ve totally changed its purpose. On some occasions I’ll clip in my own diffusion. I also use China lanterns, Pancakes, and the Barger light in this way. But I use that Birdcage all the time. I bring them all over the world. My gaffer gave me one as a gift and now I give them to others as gifts! Coupled with a dimmer, it’s my secret weapon.”

He says he embraced LED quite some time ago, and it has become a growing part of his kit. He’s a big fan of ARRI SkyPanels as well as Kino LEDs, which he says are lighter in weight though a little less efficient. He has also been using plasma fixtures from HIVE Lighting, out of Los Angeles, when he needs a lot of output from a small draw. “Technically, it’s only drawing a thousand watts,” he says, “but the output is like a 10K. It’s really punchy! I’m a big fan of the K5600 stuff, so I use a lot of Joker-Bugs. I use a lot of bi-color, RGB LEDs, anything high-CRI, but I can roll with anything as long as we’re above CRI 90 or so, especially if it isn’t on a face. Usually I want DMX-controllable, because my gaffer can control it through Luminair on an iPad or a small board. I always carry Lekos with me. I was doing a movie last year and I needed a lot of output deep in an old basement, and we had a limited cable run, so I landed a couple of Joker-Leko units with 19° lenses really deep, plugged them into a 20-amp circuit, and was able to get the equivalent of a 4K.”
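The basement anecdote is ultimately a load calculation: watts available on a circuit versus watts drawn by the heads. A quick back-of-envelope sketch, assuming a 120 V circuit and roughly 1,000 W of draw per ballasted head (figures chosen for illustration, not taken from the article):

```python
# Back-of-envelope circuit-load check for running HMI heads on house power.
# Assumes US-style 120 V branch circuits; real practice also derates
# continuous loads (commonly to 80% of breaker rating), which this
# simple sketch deliberately omits.

def circuit_capacity_watts(amps: float, volts: float = 120.0) -> float:
    """Nominal wattage available on a branch circuit (P = V * I)."""
    return amps * volts


def fixtures_fit(draw_watts: float, count: int, circuit_amps: float) -> bool:
    """True if `count` fixtures at `draw_watts` each fit on the circuit."""
    return draw_watts * count <= circuit_capacity_watts(circuit_amps)


# A 20 A / 120 V circuit offers 2400 W nominal, so two ~1000 W-draw
# heads fit; a third would not.
print(circuit_capacity_watts(20))   # 2400.0
print(fixtures_fit(1000, 2, 20))    # True
print(fixtures_fit(1000, 3, 20))    # False
```

The efficiency point in the quote is the flip side of the same math: a plasma or HMI source drawing 1,000 W can put out far more light than a tungsten unit of the same draw, which is why a limited cable run stops being the bottleneck.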

He breaks lighting for virtual reality into two different approaches. The first is shooting spherically with a 360-degree camera, like the Headcase camera he designed, which uses seventeen Codex Action Cams to capture a RAW 360-degree field of vision at 12-bit color depth, far more than a typical GoPro system. That much dynamic range gives him a way to use practicals within a scene, which he will often enhance with lighting from above. “The other way, which I usually prefer to shoot,” he continues, “is to shoot nodally, which gives everyone much more versatility. Basically, you back the camera up and shoot the scene in angles, like slices of a pie. It has its own challenges, but this makes for the cleanest, easiest stitch in post. When you shoot with a big camera ball, there’s a perspective disparity between lenses, just like your eyes, which makes close objects more complicated to stitch. But by pulling the camera back to the same point in space, you can shoot successive slices of the pie separately, and it becomes much easier to reassemble as a complete viewing sphere. And from a lighting perspective, we can basically light from outside that slice, so it’s much easier to shape for talent. Soft light is the best way to light this kind of photography, because of the way that we have to manage shadows.”
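The “slices of a pie” arithmetic is simple to sketch: each nodal pass covers the lens’s horizontal field of view, and the stitching software needs some overlap between neighboring slices, so each slice contributes only its FOV minus that overlap. The FOV and overlap numbers below are illustrative assumptions, not figures from the article.

```python
import math

# Sketch of nodal "pie slice" coverage: how many angular slices a single
# camera, rotated about its nodal point, must shoot to cover a full
# 360-degree panorama with enough overlap for stitching.


def slices_needed(h_fov_deg: float, overlap_deg: float) -> int:
    """Slices to cover 360 deg when each slice adds (fov - overlap) degrees."""
    effective = h_fov_deg - overlap_deg  # unique coverage per slice
    if effective <= 0:
        raise ValueError("overlap must be smaller than the field of view")
    return math.ceil(360.0 / effective)


# e.g. a lens with a 90-degree horizontal FOV, stitched with 15 degrees
# of overlap per seam, contributes 75 unique degrees per slice:
print(slices_needed(90, 15))  # 5
```

This also shows why the nodal approach lights so cleanly: while any one 75-degree slice is being shot, the remaining 285 degrees are open real estate for soft sources and crew, to be removed in the stitch.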


“My crusade as a DoP is to bring artful, cinematic ways of keeping the level of quality and nuance of narrative that we’ve refined in cinema for the past hundred years into this new VR space.” Shulkind says that, when he’s moving fast on a production, eighty percent of lighting design is positioning, and he’ll use any available time to tweak levels from there. Diffusion is hard to alter without access to the fixtures, however, so the ability to make final touches and refinements remotely, through DMX or technology like Active Diffusion, is highly desirable.

“When I was a young camera operator coming up in the business,” he remembers, “John Seale, ASC, told me that the second most important thing a cinematographer needs to be is fast. Moving really quickly is of top importance. So finding versatile tools and efficiencies of technology is something that I’m always pursuing, whether it’s plasma lights, LEDs, or new dimmer systems like this Zylight Active Diffusion. Versatility is a big thing for me, because working quickly at a high level of exactitude has always been one of my biggest assets. With modern sensors, it’s all about capturing as much range as we can on the day, and getting the directionality that we need, and the intent that we need, and the kind of lighting ratios that we need. In terms of color and density, I can stretch and squeeze that to some extent in post, but I can’t change the quality of the light, what the softness looks like.”


“The scale of moviemaking and the number of films that are being made is changing,” Shulkind says. “And the speed at which we have to work to meet those needs is changing. The budgets and schedules are relentless. On commercials, we used to have a week, and now we have three days. On movies, we used to have a hundred days; now we have forty. So we’re always having to do more with less. I’m always looking for tools that allow me to do that more efficiently, to cut corners without seeing those cuts on the screen. Active Diffusion allows you to do more with less and stay nimble when an actor’s performance or position changes. The cool thing is that you can program it remotely for different densities. So it’s not just some whiz-bang new thing; it literally means that I don’t have to take the time to build a whole grip jungle around each light. Now I can make minor, last-minute, performance-driven improvements to the shot from off to the side with my team without disrupting the director and actors. It falls in line with the way that I have always turned to Chimera for that efficiency of softness. It feels like another chapter has been written in that lineage of versatile light control.”