
ZACH JOPLING Q&A

BMW

Synapse Virtual Production’s chief innovation officer and resident cinematographer, Christopher Probst, ASC, sat down with director/DP Zach Jopling to discuss his recent experience shooting his first virtual production commercial, for BMW’s i5, featuring Jay Shetty.

 

The duo covered a lot of VP ground, both artistic and technical, examining the process, the nitty-gritty details of the shoot, and what Jopling learned along the way.


Christopher Probst, ASC:  Thanks for jumping on for a little postmortem on your first experience shooting with virtual production for BMW. Since we talked prior to going into your shoot, I thought it’d be great to examine what you assumed it would be like, what you learned through the process, and any takeaways you came out with.

 

Zach Jopling:  Thanks!  Overall, I think the experience was everything I hoped it would be. I could certainly talk about the good things all day! It was pretty incredible.

 

CP:  That’s awesome to hear.  Perhaps you could begin by talking about your experience working with Synapse’s VAD team and the flexibility of creating your own custom environments.


ZJ:  Yeah, that was the number one reason why we went with Synapse for the project. Every aspect was fully integrated: the VAD team, the VP supervision and the support every step of the way. You were all able to offer a level of confidence in each phase of the process that the other VP companies we approached simply couldn’t.

 

Being able to tell us exactly what something was going to cost and give us a realistic picture of what we could expect was crucial. And not just in terms of the timeline and what was possible in the weeks leading up to the production, but also what could practically run on the volume on the actual production day. Synapse was able to keep us on task, which I think a lot of other virtual production companies weren’t willing, or weren’t able, to do.

 

They were able to quote a concrete number up front for what it would take to develop our environments, without having to go five steps down the road in previs and asset design just to get to that bigger-picture quote. Those other companies weren’t really equipped to guide us, or even consult with us and say, “You have this amount of money; this is the path you can go down to expect this result…”

Synapse really showed us the way, kept us on time and on budget, and gave us the most value on screen at the end of the day.

 - Zach Jopling


CP:  Your production had a bit of a hybrid scenario where Synapse created the VAD and was also present on set supervising the VP process for your shoot, but it wasn’t shot on our VP stage. That’s one degree of separation from our ideal, where we create the assets and then bring them onto our own stage – with our own design advantages – and can vet and optimize those assets, often weeks in advance, as they’re being created.

 

Having the environments created and run in Unreal is one thing, but optimizing those assets is also a big part of ensuring a VP project runs smoothly. Yeah, it may run great in the engine… but when you bring it onto the LED screens, you often take a [performance] hit that needs to be accounted for. Building environments so that they play well on a virtual stage is a critical part of the process/pipeline that often doesn’t get discussed. Did you like the ability to dictate your own reality with the environments you were creating?


ZJ:  Well, I wouldn’t say it was exclusively my reality. It was a negotiated reality between many parties and cooks. But that’s kind of the benefit too, you know. We were able to create this world that appeased the agency, the brand and, of course, me as the director, and we had all the expertise of Synapse’s VAD and VP teams. The benefit of creating the environments from scratch is that it allowed for feedback from the clients early on. They could see the assets evolve and say, “We want more greenery in the rocks,” or, “Yeah, that one rock in the distance… it kind of looks like a house. Can you just remove it?” Those types of changes were near-instantaneous. If someone asked for a moon to be put right above the car, again, it just took a few minutes and was no big deal. Whereas if we had shot that practically, it may never have lined up, or it would have ended up as a VFX shot in post, which would have added cost on the backend and complicated our workflow if we didn’t already have a VFX pipeline in place.

 

CP:  As a cinematographer, one aspect of virtual production I especially appreciate is the ability to control time of day… having authorship over the placement of the sun.  I can talk to our Unreal artists and say, “I want that sun right there and have it stay there the whole shoot!”  You have perpetual control over the conditions of the shoot, something you never have in real life!
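
[Ed. note: To put that “authorship over the sun” point in perspective, here is a minimal Python sketch of the simplified solar-elevation formula a real location is stuck with; the latitude and declination values are assumed for illustration. In Unreal, by contrast, the directional light’s angle is simply pinned wherever you want it.]

    import math

    def sun_elevation_deg(lat_deg, decl_deg, solar_hour):
        """Simplified solar elevation (ignores refraction and the equation
        of time): sin(el) = sin(lat)sin(decl) + cos(lat)cos(decl)cos(H),
        where H is the hour angle, 15 degrees per hour from solar noon."""
        lat, dec = math.radians(lat_deg), math.radians(decl_deg)
        h = math.radians(15.0 * (solar_hour - 12.0))
        s = math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(h)
        return math.degrees(math.asin(s))

    # Assumed: Los Angeles (lat ~34.05) near the June solstice (decl ~23.4).
    for hour in (8, 12, 17):
        print(hour, round(sun_elevation_deg(34.05, 23.4, hour), 1))
    # The real sun sweeps from ~37 deg up to ~79 deg and back over the day.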


ZJ:  Absolutely. I think controlling the time of day may seem like kind of a mundane use of the volume, but it’s so valuable, because you could waste half a day just waiting for a sunset that may never happen, right? I will add that I was super surprised by the amount of light that the screens actually gave off. It’s just something you really can’t comprehend until you’re actually there on the volume.

 

CP:  You’re typically not starved for photons in the majority of scenarios and rarely need the panels at 100%. And if you do put up a white screen, it’s blinding!
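
[Ed. note: As a hedged back-of-napkin sketch of why the volume throws so much light: an LED wall behaves roughly like a large Lambertian source, and in the limit where it fills the subject’s view, incident illuminance approaches pi times the wall’s luminance. The nit level and coverage fraction below are assumed, not measurements from this stage.]

    import math

    def wall_illuminance_lux(luminance_nits, coverage=1.0):
        """Crude estimate of illuminance from an LED wall modeled as a
        Lambertian emitter. `coverage` approximates the fraction of the
        subject's view the wall fills (1.0 = effectively infinite wall,
        where E = pi * L)."""
        return math.pi * luminance_nits * coverage

    # An assumed 1,500-nit wall filling ~80% of the subject's view:
    print(f"{wall_illuminance_lux(1500, coverage=0.8):.0f} lux")  # ~3770 lux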

 

ZJ:  Yeah, that was truly incredible. But one aspect I did not fully account for going in was the quality of light from the panels. Their CRI wasn’t super high, and with the panels on that stage, that was kind of an issue. Sometimes it was fine for the car to have a bit of a [magenta] color cast, but for skin tones it was actually terrible. It required a secondary color-correction pass in post, focused on the skin tones, just to make sure Jay looked healthy and not skewed toward magenta or green.

 

CP:  Did you mix in a little bit of on-set lighting to help with that? Were you trying to combat it on the day, or was it something you realized more in post?

 

ZJ:  We used a lot of the ambient light of the actual volume. I think I realized the issue more in post, because I was using my own [on-camera] LUT on set. In general, I think the RED Raptor tends to skew a little bit more green sometimes, so I have a LUT I use that corrects for that magenta/green imbalance. I think the LUT reduced a lot of it in camera, so I wasn’t really seeing it on set. But once I saw the footage at Company 3 when we were grading, the cast was amplified a lot more than it had been on the day.
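
[Ed. note: Jopling’s actual LUT is his own; purely to illustrate the kind of magenta/green trim he describes, here is a minimal Python/NumPy sketch with assumed values. In linear light, a tint trim can be as simple as a gain on the green channel ahead of the rest of the grade.]

    import numpy as np

    def tint_trim(rgb_linear, green_gain):
        """Crude magenta/green trim on scene-linear RGB.
        green_gain > 1 pushes toward green, < 1 toward magenta.
        rgb_linear: (..., 3) array."""
        out = np.array(rgb_linear, dtype=float, copy=True)
        out[..., 1] *= green_gain
        return out

    # Example: nudge a slightly green skin sample back toward neutral.
    skin = np.array([0.42, 0.34, 0.27])        # assumed linear RGB values
    print(tint_trim(skin, green_gain=0.96))    # [0.42, 0.3264, 0.27]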

CP:  We definitely stress that while bringing up a scene in an LED volume does throw some photons around, with tones indicative of the scene, the panels themselves are not designed to be light sources. In fact, as a display they perform better if their three color channels – R, G and B – do not have emitted wavelengths that cross over one another. So by definition, they produce a limited spectrum of light. That deficiency can cause all sorts of issues when that light interacts with physical objects on set, and can even produce completely different colors than you would expect from an object – a phenomenon known as metamerism.

To address this, for our stage at LA Center we are building a downstage “brow” – a long overhead strip of RGBW panels angled to provide a sort of three-quarter frontal/top light that helps create better skin tones. The panels are still driven by the environmental content, but they also mix in a white diode source to emit a continuous spectrum of light that renders skin tones better.
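
[Ed. note: To make the limited-spectrum point concrete, here is a small, entirely synthetic Python sketch: the same reflectance curve integrated against a narrowband RGB illuminant versus a flat broadband one yields different channel ratios on camera. Every curve here is made up for illustration; real colorimetry would use measured spectra and CIE observer functions.]

    import numpy as np

    wl = np.arange(400, 701, 5, dtype=float)   # wavelength samples, nm

    def gaussian(center, width):
        return np.exp(-0.5 * ((wl - center) / width) ** 2)

    # Assumed narrowband LED primaries vs. a flat broadband source.
    led_spd = gaussian(630, 10) + gaussian(530, 12) + gaussian(465, 10)
    broad_spd = np.ones_like(wl)

    # A made-up, skin-like reflectance rising smoothly toward red.
    reflectance = 0.25 + 0.45 / (1 + np.exp(-(wl - 580) / 30))

    # Crude camera channel sensitivities (Gaussians, illustration only).
    cam = np.stack([gaussian(600, 40), gaussian(540, 40), gaussian(470, 40)])

    def render(spd):
        """Integrate illuminant * reflectance against each camera channel."""
        rgb = cam @ (spd * reflectance)
        return np.round(rgb / rgb.max(), 3)    # normalize for comparison

    print("under LED wall:  ", render(led_spd))
    print("under broadband: ", render(broad_spd))
    # The ratios differ: the same skin reads a different color on camera.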


ZJ:  That sounds amazing. I’d be interested to see a project that uses high-key lighting. I’m sure it could work fine if the LED volume is big enough, but on the stage where we shot BMW, I couldn’t imagine doing a high-key look with how small it was.

 

CP:  Well, the LED panels themselves will never project a “beam” in the air, you know? So that has to be achieved with supplemental lighting…

 

ZJ:  No, I’m talking about the idea of projecting a light straight at the talent, for example. If you don’t have a large enough volume, or enough depth on the stage to keep them further from the walls, that light will hit the screens and wash them out.

CP:  Ah, okay, you’re talking more about flashing, where the contrast of the screens gets milked out by stray light hitting them… Well, there are pluses and minuses between every different panel product out there. As the pixel pitch gets coarser, meaning there’s more gap between the pixels, there is more negative area to absorb stray light rather than reflect it back, so you at least get good contrast. But as pixel pitches get finer and finer, those gaps between the diodes get smaller and smaller and no longer act as a baffle to absorb the stray light hitting the panels. So much so that a product like Sony’s 1.5mm Crystal LED panels, for example, has a continuous sheet of plastic on its face. So even though their pixel pitch is great, they’re really bad in terms of reflections, and you lose contrast, which can be really problematic. There’s a sweet spot right around a 2mm pixel pitch, where you still have a little bit of bezeling between the pixels to absorb incident light. You can also use the trick of putting a polarizing filter on the camera to dial out some of that flashing on the screens.


This brings up a point to stress about having a successful experience shooting virtual production. At Synapse, we have a support structure where somebody consults with and shepherds people on set to help guide the process. It’s also partially why we’re having this conversation with you right now. With you also handling the cinematography, for example, it was important for me to talk to you up front and say, “Here’s what you should look out for…”: wild walls for reflections, mixed light, moiré issues, where it’s best to stage the talent, depth of field… all of those considerations.

ZJ: Things that I think Synapse has already solved for, like the gaps in the ceiling. That was something I was constantly having to deal with, and it really slowed us down. I had to keep asking, “Okay, can you feather out [the content displayed on] the gap in the ceiling? Because I’m seeing it… this windshield’s reflecting 270 degrees right now. Feather out this gap, then feather out that gap.”
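
[Ed. note: The “feathering” being requested here is, conceptually, just an alpha falloff applied to the displayed content near the seam. A toy Python version with assumed widths follows – an illustration, not Synapse’s on-set tooling.]

    import numpy as np

    def feather_mask(width_px, feather_px):
        """1-D brightness mask that eases content to black toward one edge,
        using a smoothstep so reflections show no hard line at the gap."""
        x = np.clip(np.arange(width_px) / max(feather_px, 1), 0.0, 1.0)
        return x * x * (3.0 - 2.0 * x)          # smoothstep(0, 1, x)

    # Multiply the rows of ceiling content nearest the gap by this mask
    # to fade the image out before it reaches the seam.
    mask = feather_mask(width_px=200, feather_px=80)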


CP:  Indeed. Your situation was also exacerbated by the very limited size of that stage. On the Synapse stage at LA Center, for example, the ceiling marries seamlessly into the walls, so you get clean reflections without any big gaps where the ceiling and walls don’t overlap. Nor are there any tracking cameras in shot to paint out. The stage you shot BMW on was such a small volume that you had to be physically closer to the walls, which kind of breaks our suggested rule of thumb of pulling everything at least 10’ to 15’ away from them. Had you shot on our stage, it would have been a lot easier to put the car on go-jacks and just turn it to dial in the reflections on the windscreen. Now, that’s not to diminish the final result of your work. I think BMW came out absolutely gorgeous.

 

ZJ:  Well, thanks, but there are so many things that I pixel-peep where I’m like, “Yeah, that could have been better…” I was having to deal with a lot of moiré issues as well. That was definitely a problem, because the car was often on the same [curved] plane as the screen at certain points. You couldn’t really escape it. I feel like we did a pretty good job of getting around it by using [vintage Canon] K-35 lenses, as they’re pretty soft and helped minimize the moiré.

CP:  And with the ROE panels sitting a foot or so off the ground, I thought you guys found a clever solution putting that parking border around the ground to make it look like a parking lot. That [practical feature] tied the foreground to the virtual environment really well, but it also put the car in closer proximity to the panels, creating more issues with moiré. At LA Center we have a much larger volume and the panels are only about an inch off the ground, so it’s a lot easier to make that transition disappear with a little bit of set dressing. Having a larger footprint to work within also allows you to pull subjects further away from the walls. Additionally, our panels’ pixel pitch is 2.3mm, compared to the nearly 2.9mm most stages are using. With virtual production, every millimeter above 2mm creates an almost exponential increase in the propensity to encounter moiré. And for me, that factor alone can make or break the success of a VP shoot.
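
[Ed. note: A rough heuristic, with assumed numbers, for why pitch and distance interact: under a thin-lens approximation, one LED pixel images onto the sensor at roughly pitch × focal length / distance, and moiré risk peaks when that lands near the camera’s own photosite pitch. This Python sketch is an illustration, not a Synapse formula.]

    def projected_led_pitch_um(led_pitch_mm, focal_mm, distance_m):
        """Approximate size of one LED pixel as imaged on the sensor, in
        microns, using thin-lens magnification f / (d - f)."""
        magnification = focal_mm / (distance_m * 1000.0 - focal_mm)
        return led_pitch_mm * 1000.0 * magnification

    # Assumed setup: 50mm lens, ~5um photosites, 2.3mm vs 2.9mm walls.
    for pitch in (2.3, 2.9):
        for d in (4, 10):
            p = projected_led_pitch_um(pitch, 50, d)
            print(f"{pitch}mm wall at {d}m -> {p:.1f}um on sensor")
    # As the projected pitch falls toward a few photosites, the grid sits
    # near the camera's resolving limit and moire appears; distance, softer
    # lenses and defocus all push it below that limit.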

Again, as panel pixel pitches get finer and finer with each generation of the technology, these issues will begin to resolve themselves. It’s also important to communicate these factors to all departments, including unit photography and cameras shooting for print. That being said, I don’t feel there’s anything in the spot that makes you think, “That looks fake,” or, “That looks like a VP shot…” You did good and definitely deserve kudos! Based on this first experience, are you excited to shoot more with virtual production?


ZJ:  I’m definitely letting people know that I did this BMW spot – book me again for more VP jobs! In fact, I just got reached out to about another car ad for the volume. So I will definitely be name-dropping Synapse as the production partner they need, because you have the studio, you have the ability to develop the assets they’d be shooting with, and you can execute each step of the production.

CP:  That actually brings up a good point. For you as a director, thinking ahead, when pitching jobs for virtual production you can really start to use Unreal as part of your deck-building and storyboarding/previs process. And the crazy thing is, all of those assets wind up porting over to the physical production as well. Our chief creative officer, Rich Lee, recently did a VP shoot for Lexus, and he created all of the storyboards in Unreal. All of those assets evolved into the actual assets used for the environments. Even the reflections created on the car’s windows for the boards matched the reflections on the physical car’s windows once they put the real car on the volume. The boards matched identically. It was insane.

 

ZJ:  Was that with something like CineTracer?  Or was he doing that with a more sophisticated program?


CP:  No, it’s just within Unreal! And by having the mesh of your VP stage in the nDisplay config within Unreal, when you get a matching virtual car model, drop it into your environment and start blocking out your shots, your environments and reflections are being generated in real time. So what was displayed and reflected from the actual panels onto the car on set was exactly the same as what was reflecting on the virtual car in Unreal; because the geometry of the car and the staging/placement were the same, it matched exactly.
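
[Ed. note: The geometric reason the boards match: a reflection depends only on the viewpoint, the reflecting surface and what surrounds it, so an identical stage mesh, car model and staging must produce identical reflections. A minimal Python sketch of the mirror-direction math, with hypothetical vectors:]

    import numpy as np

    def mirror_direction(view_dir, normal):
        """Reflect a view direction about a surface normal:
        R = D - 2 (D . N) N. Both vectors assumed unit length."""
        return view_dir - 2.0 * np.dot(view_dir, normal) * normal

    # Hypothetical: camera ray hits a windshield raked back 25 degrees.
    d = np.array([0.0, 1.0, 0.0])                       # looking down +Y
    theta = np.radians(25)
    n = np.array([0.0, -np.cos(theta), np.sin(theta)])  # windshield normal
    print(np.round(mirror_direction(d, n), 3))          # [0, -0.643, 0.766]
    # The same ray in Unreal and on stage lands on the same LED panel,
    # so the windows reflect the same content in both.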

 

ZJ:  Okay, I just need to download Unreal and learn it!

 

CP:  Yeah, it’s pretty nuts. And the tool sets are getting even better… With the assets that are created to build your boards and show the clients your shots, when the client says, “Well, we wish the camera was lower, or that building was different…” you can drop the camera down live, say, “There you go,” and hit print. There’s your board!


ZJ:  I'm sure with Polycam now I could actually use this for location work, too.  Just go scan the location, bring it into Unreal…

 

CP:  It all already exists! Unreal is using Google Earth imagery now, so you can scout locations in the engine. It converts Google Earth into 3D maps you can move around in. It’s not high-res or anything, but it’s still pretty fucking nuts. We location-scouted Paris in Google Earth in Unreal for a job coming up next week, and it’s accurate! It’s crazy.

 

ZJ:  Okay… I just downloaded Unreal Engine, so that'll be a thing I'll be learning!
