How to Build (And Film) a Virtual Production Stage

May 11, 2022

Richard Cormier is Executive Producer of Virtual Production at MELs Studios in Montreal, Quebec, Canada – one of the largest virtual production stages in the world.

MELs marries physical stage design with cutting-edge virtual sets, often working with files of up to 12K resolution rendered on a 20-foot-high LED wall in the background alongside a physical set in the foreground. Its state-of-the-art technical facilities include:

  • A 10,000 sq. ft. virtual production studio
  • A concave wall configuration with a 65-foot diameter (and 20-foot height)
  • A fully computerized and motorized ceiling for custom configurations

We caught up with Richard recently to pick his brain on all things virtual production, and the discussion was fascinating. Here’s what we learned.

File Transfer for Filmmakers

Send footage and large production files over the cloud, no matter the size.

Note: The following transcript has been edited for length and clarity.

How Does One Become an Executive Producer of Virtual Production?

The MELs studio logo displays over top of a cityscape on a virtual production stage

I have two roles: I’m Vice-President of Digital Creative Services, and Executive Producer for Virtual Production. My background has always been in post-production, VFX, and 3D animation: the entire thing.

When I was 23, I started my first business, called Buzz Image Group. We were the largest post house in Canada during the early days of VFX. We developed a new technique with the auxiliary scanner, where we printed video frames onto physical film.

And then later on I started another company with Bell Canada called Jazz Media Network, which was a global broadband collaborative network that allowed professionals around the world to work in real time on whatever tools we could remotely control. We had video conferencing and annotation tools – stuff that looks normal today, but we had it uncompressed and in real time back in the mid-90s.

Most recently, I worked at Moment Factory in a consulting endeavour to help them scale up their global production department. And then that led to being here, at MELs.

What is Virtual Production?

I’m glad you asked, because virtual production now is not exactly what we thought it would be. It’s like a Swiss Army knife, right? You think it does one thing but then you start opening the knife and there are all these things it can do that you didn’t envision.

The initial thought was to do a lot of game engine and 3D set extensions. The reality is, that is 10 percent of what we do. We do a lot of green screen work and video plates for car shoots. We also produce a ton of music videos; we probably did 12 music videos last year, using tools like Notch or TouchDesigner in the background, or even pre-rendered motion graphic material. We even use the LED stage as just a pure performing environment.

We did a massive 3D set extension for a movie with Joaquin Phoenix called Disappointment Boulevard. We built a 3D cruise ship and the performers were here on stage on a 32 by 32-foot platform acting on the deck of the boat. We had an amazing marriage between the physical set and virtual set extension.

So, in a nutshell, virtual production is just a fantastic, versatile tool.

And we’re just scratching the surface as we go. When we first built the stage, we thought we would display final CG materials on-screen for performers to react to. That will happen, but we believe working with plates will still be around for a long time. In that case, the CG might be anywhere from 50 to 100 percent ready, but we will use it for lighting, for reflections in the glasses, or for an actor’s performance.

And then, we are developing a technique which is called ghost framing, which allows us to do green screen and real camera action capture at the same time from the same camera. So it’s going to transform the world of compositing in a good way.

That’s how we see virtual production for the time being, and we see other tools joining the band soon. Virtual production will evolve into a bigger bag of tricks built around electronic displays.

Related: Creating the Metaverse with Volumetric Video and Virtual Production

Unlimited Transfer for Unlimited Creativity

Share up to 15 TB with a single file when you use MASV. Sign up today for 20 GB free.

How Long Has MELs Been Doing Virtual Production?

It was a pandemic initiative. MELs kickstarted this stage as a proof of concept before I came on, and that’s when they invited me. And then we went on to do R&D for over a year, and production at the same time.

There was a lot of uncertainty about virtual production, so we tried all kinds of different tech. We went through the gamut on all systems: tracking systems, LED systems, processor systems, the size of the LED wall, flat versus concave, ceiling or no ceiling, etc.

We established what the minimum would be for us to be in business with the stage. Our goal was to have an international presence, with the scope, breadth, and ambition that entails. That comes with a minimum spec, and if you go on our website you can download all the specs of our studio.

A cliffside ocean view is displayed on a virtual production stage

Can You Talk About the Upgrades to MELs Virtual Production Stage?

We said to ourselves,

“Hey, we need distance. We need a diameter of 65 feet minimum. Then we can play with the big productions.”

Distance is important. For example, we used to have a 15-foot-high wall. We quickly realized that we needed a minimum of 20 feet, because when a DP tilts their camera up, 20 feet comes quickly. You don’t think about it, but it comes super fast.

That’s what all the R&D and trial and error did for us. We did that until the fall of 2021, after which we knew what we were doing. We learned quite a bit the hard way.

From there, we purchased what is considered the world standard in virtual production technology. We didn’t reinvent the wheel. We went for all the best-of-breed or the best-in-class technology:

  • ROE, a well-recognized LED manufacturer.
  • Brompton, the processor that controls the LED wall and the color pipeline.
  • Disguise as the playback engine.
  • Unreal Engine, as the only game engine that can be put on a wall.

There were some choices that were easy to make.

Something we love to talk about though is our computerized, motorized ceiling. It allows us to control reflection and lighting down to the millimeter. With this ceiling, if you do something in the morning and you want to go back to precisely where you were 10 hours ago, or three days ago, you just recall the time and place and the ceiling will move accordingly. It’s magical.

MELs magic ceiling in action.
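
To make the recall idea concrete, here is a minimal sketch, in Python, of saving and restoring a ceiling configuration by label. The data model (panel heights and tilt) is invented for illustration and is not MELs’ actual control software.

```python
from datetime import datetime

# In-memory store of saved ceiling configurations (hypothetical schema).
ceiling_presets: dict[str, dict] = {}

def save_preset(label: str, panel_heights_mm: list[int], tilt_deg: float) -> None:
    """Record the current ceiling configuration under a human-readable label."""
    ceiling_presets[label] = {
        "saved_at": datetime.now().isoformat(timespec="seconds"),
        "panel_heights_mm": panel_heights_mm,
        "tilt_deg": tilt_deg,
    }

def recall_preset(label: str) -> dict:
    """Return a stored configuration so the motion control can move back to it."""
    return ceiling_presets[label]

save_preset("scene_12_morning", [4800, 4800, 5200, 5600], tilt_deg=12.0)
print(recall_preset("scene_12_morning"))
```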

It’s incredible that you pulled all this off in a matter of a year and a half or so, from concept to having a full set.

We had full support from everyone involved.

One of my requests to my boss was to be serious about this. We had full-time, permanent staff dedicated to making our virtual stage the best it could be, which isn’t typical.

Because of our dedicated resources, our knowledge grew by a factor of five every week.

Thankfully, the stage has proven to be very valuable to us. We see it every time we have a meeting with a client. The client feels great and, most importantly, they feel secure that their production is in good hands, because we cover all angles: the tech, the creative, the camera, the lens, etc.

Can You Explain the On-Set Roles at a Virtual Production Studio?

At MELs we have something called the ‘Brain Bar’, which is a collective of roles on-set to facilitate virtual productions. In the Brain Bar, we have:

Tracking Specialist

The tracking specialist has to fully understand the scope of the physical set because that determines what kind of tracking device we’re going to use, what systems we’re going to use, and where to position (or reposition) so we can follow the camera in the virtual space and physical space.

That person is a very sensitive touch point with the crew, especially the Director of Photography, since we do physically put some hardware on the camera.

It’s easy to set up or calibrate a tracking system when nothing is moving — but when we’re in a shoot, every aspect of the set is moving every minute: a person, a piece of furniture, the core lighting equipment, camera equipment, etc. The tracking specialist requires a commanding presence and a knowledge of where everything will be during the shoot.

They also package the data so it makes its way to the compositing department in decent shape. It’s easy to end up with bad data, so they are responsible for the integrity of the tracking data that makes its way to other departments.
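
To make the packaging idea concrete, here is a minimal sketch of what a per-frame camera tracking record handed to compositing might look like. The field names and the JSON format are assumptions for illustration, not MELs’ actual pipeline.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class TrackingSample:
    """One frame of camera tracking data (hypothetical schema)."""
    timecode: str            # e.g. "01:02:03:12" at the project frame rate
    position_m: tuple        # camera position (x, y, z) in metres, in stage space
    rotation_deg: tuple      # camera orientation (pan, tilt, roll) in degrees
    focal_length_mm: float   # focal length reported by the lens encoder
    focus_distance_m: float  # focus distance reported by the lens encoder

def export_take(samples: list[TrackingSample], path: str) -> None:
    """Write one take's worth of samples to JSON for the compositing department."""
    with open(path, "w") as f:
        json.dump([asdict(s) for s in samples], f, indent=2)

# Example: a single frame's record.
sample = TrackingSample("01:02:03:12", (1.20, 1.65, -3.40), (12.5, -3.0, 0.0), 35.0, 2.8)
export_take([sample], "take_042_tracking.json")
```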

A man waits for the train, except he’s not actually at a subway track.

Reliable File Transfer of Heavy Data

Stop worrying about dropped files and incomplete transfers.

Chief Operator

Then you have the Chief Operator, who is responsible for the match, the playback, and color timing. We combined a few roles to create the Chief Operator.

Our Chief Operator is basically like a DJ in the middle of a production set. They are shoulder to shoulder with the director and the first AD — who should be their best friend. The Operator is the one that says,

“If you want this change, I can get it done, but we have to work on it now so I can render it later.”

In our case, this person is also responsible for Disguise, which controls playback, and for some key adjustments that need to be made for tracking.

Image Specialist

The third player we have is the Image Specialist. This person guides the rest of the crew. They know everything about lenses, cameras, sensors, etc. They are the expert generalist, but more from the old-school world.

So if the DP wants a more compressed picture, a more open picture, the Image Specialist will give the right advice. They are the personal adviser to the DP.

Tech Artist

If the scene was built somewhere else (i.e. not at MELs), the client will typically bring their tech artists. If creative changes are required, they’ll be here on set beforehand and may be tasked with making them.

How Do You Guide Filmmakers Through the Virtual Production Process?

Getting involved early on

The majority of filmmakers – 98 or 99 percent of directors – have not experienced virtual production. Neither have set designers. So, as the virtual production experts, we intervene early on — I’m talking months in advance.

For example, if a scene is being created somewhere else in the world, we will be part of the creation process months before the crew comes to shoot with us. We don’t comment on the creative aspects of the scene but we advise on delivery methods, formatting, etc.

The analogy I like to use is, ‘don’t deliver us a flat file in Photoshop.’ We want all the layers separate and organized in a certain way. That way, when it’s time to shoot and the director decides to get rid of an element in the background, my team can say, ‘That’s layer seven, no problem.’ Click, delete, done.
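
To picture that layered delivery, here is a toy sketch; the layer names and structure are invented for illustration and are not a real delivery spec.

```python
# A catalogue of delivered scene layers, so an element can be switched off on request.
scene_layers = {
    1: {"name": "sky",            "enabled": True},
    2: {"name": "ocean",          "enabled": True},
    3: {"name": "distant_ships",  "enabled": True},
    7: {"name": "deck_umbrellas", "enabled": True},
}

def disable_layer(layers: dict, layer_id: int) -> None:
    """'That's layer seven, no problem.' Click, delete, done."""
    layers[layer_id]["enabled"] = False

disable_layer(scene_layers, 7)
```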

Script Review

We start the education process at the script level, and that’s for films that will be shot a year from now.

This is where we sit down with a director; we discuss where they are coming from. We’re trying to understand what the director really wants to accomplish. Then we slowly introduce them to the tech and show examples.

Set Walkthrough

We meet with the director a lot before we get on set (we’re open 24/7). We take them through a walkthrough of the set so they’re not so intimidated. Over the course of those meetings, they learn all the terms they need to understand, and we make sure they get the lingo and get comfortable.

It’s important to note that virtual production artists are not just some tech guys in the background. We’re part of the filmmaking process and we want filmmakers to have the best possible creative outcome. I often tell directors,

“I’m your partner today. I’m your very best friend. Let’s rock and roll.”

And that’s how, we hope, they have a fantastic experience.

Performers

For performers, it’s a far more immersive experience than using a green screen. Without virtual production, a performer has to imagine they are standing at the base of a volcano. If it’s a sci-fi movie, let’s say, the actor has to envision a tall alien creature without knowing the actual height.

In a virtual set, they can look at the tall alien on the screen and make eye contact at the perfect height.

Josh Brolin wears a Thanos head extension on-set

Josh Brolin wears a head extension so co-stars have a visual eye line.

Is Building a Virtual Set the Same as Creating a Traditional Set?

No, that’s changed, for the better. For us on the creative side, this is the coolest part of virtual production. Let’s take the cruise ship example I mentioned above.

The set designer calls us and says,

“Hey, for this scene, this is the deck chair I want to use, this is the umbrella and its color and texture. This is the material I’m going to be using.”

So, our job is to digitize all of this. And then the set designer wants to put chairs in the background. Since our virtual set is the background to their physical foreground, we need to start working with them to create a layout in the background that will match the physical foreground. And we’ve got to work together at the same time.

This is contrary to a traditional visual effects workflow, where the footage is already shot, and the VFX artists have to make do. There’s no interaction with the set designer at all.

This same process also works vice versa.

Sometimes, we’ll create stuff that the set designer needs to replicate in the real world. So, we will create 3D models and send them off to the designer for construction. Or, if the script calls for something special, we’ll 3D print the item.

It’s important to be agile. If we need to change a utensil two days before a shoot, we need to be able to digitize or get access to the 3D assets. The lines of communication need to be open and direct.

And then filmmakers need to decide — and this is kind of a new thing to manage — do they create assets physically or digitally? If a table is required on set, will the foreman build another physical table, or does the virtual production studio duplicate an existing table and add it in digitally?

Related: Photogrammetry, Virtual Sets, and the Future of Digital Data

What Type of Software/Hardware is Required to Bring the Sets to Life?

Unreal Engine

Software-wise, for 3D scenes, there’s only one game in town: Unreal Engine, from Epic Games. And it’s not just a game engine we happen to leverage. Unreal Engine has a specific focus on virtual production, with several tools put together for real-time virtual production work.

Through Megascans we have access to amazing textures.

Disguise

The second layer, which is actually half software, half hardware, is the Disguise platform, which does the hard work: pushing high-res video playback onto a wall.

You need very powerful software and hardware for that. It’s expensive, but that’s the price to pay for such high resolutions. Disguise is intelligent enough to take into account what is a mesh and where to broadcast the material; it understands the physical environment on which things will be displayed.

Color Grading

We also do live color grading because what you see on screen is what you get in-camera. We need to be able to grade live and at a moment’s notice. We don’t have an hour to go back and tinker with the gamma range as there are a hundred people waiting for us on-set.
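
In practice, a live grade boils down to adjustments like lift, gamma, and gain applied before the image hits the wall. The sketch below is a generic lift/gamma/gain example in Python with NumPy, not the specific color pipeline used on the stage.

```python
import numpy as np

def lift_gamma_gain(img: np.ndarray, lift: float = 0.0, gamma: float = 1.0, gain: float = 1.0) -> np.ndarray:
    """Apply a simple lift/gamma/gain grade to a float image with values in [0, 1]."""
    graded = np.clip(img * gain + lift, 0.0, 1.0)  # gain (multiply) then lift (offset)
    return graded ** (1.0 / gamma)                 # gamma as a power curve

# Example: lift the mid-tones slightly without moving the black or white points.
frame = np.random.rand(1080, 1920, 3).astype(np.float32)
graded = lift_gamma_gain(frame, gamma=1.2)
```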

Brompton Processor

Otherwise, the Brompton processor is key to elegantly and rapidly switching from 24, to 30, to 60 frames per second. It’s a fundamental component, especially if a director wants to do slow-mo or work with the color temperature of the LEDs.

How are Real-Time Renders Possible?

A virtual production tech team behind-the-scenes

It does take time but this is where MELs, in particular, shines. The trick with the game engine is to get the frame rate to be real-time. With Unreal, we can have the most amazing and beautiful scene, but it’s not going to be so good if it does not play in real time.

It’s a trade-off: either the scene doesn’t look as good in order to hit the real-time frame rate, or we have to bake stuff into the scene. So, we will bake lighting, or we will bake shadows. We’ll decide how to enhance the performance of that scene.

If the director or the DP requests a change, we tell them what’s possible in a given amount of time. Most of the things they ask for, we can do in real-time because of how we plan for the shoot.

Since we get involved early on, we know the scene by heart, so we’ll intervene ahead of time with the DP.
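
The real-time constraint is, at its core, simple arithmetic: every frame’s render has to fit inside the frame period of the chosen shooting frame rate, or something gets baked or simplified. Here is a quick sketch; the scene cost figure is made up for illustration.

```python
# Per-frame render budget at common shooting frame rates.
for fps in (24, 30, 60):
    print(f"{fps} fps -> {1000.0 / fps:.1f} ms budget per frame")

scene_cost_ms = 45.0        # hypothetical measured render time for one frame
budget_24_ms = 1000.0 / 24  # roughly 41.7 ms
print("fits at 24 fps:", scene_cost_ms <= budget_24_ms)  # False: bake lighting/shadows or simplify
```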

Are There any Special Pieces of Equipment Required to Record a Virtual Set?

Lenses

We need to know what lens the camera crew will use because we need to map the lens to our system. It’s not difficult, mind you, we just need to know the lens type. Most of the different lens types out there are already mapped into our system.
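
Conceptually, lens mapping is a lookup: once the lens model is known, its previously measured profile can be pulled from a library and applied. The sketch below is purely illustrative; the lens names and distortion coefficients are invented, not real calibration data.

```python
# Hypothetical library of measured lens profiles, keyed by lens model.
LENS_PROFILES = {
    "vendor_a_prime_35mm": {"k1": -0.012, "k2": 0.003, "sensor_fit": "full_frame"},
    "vendor_b_prime_50mm": {"k1": -0.008, "k2": 0.002, "sensor_fit": "full_frame"},
}

def profile_for(lens_model: str) -> dict:
    """Return the stored mapping for a lens, or flag that it still needs to be mapped."""
    try:
        return LENS_PROFILES[lens_model]
    except KeyError:
        raise KeyError(f"{lens_model} has not been mapped yet; schedule a calibration pass")

print(profile_for("vendor_b_prime_50mm"))
```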

Frame Rate

The frame rate is something we discuss way ahead of time. As mentioned, the frame rate of the game engine dictates the output. So, if a DP wants to change the frame rate to a higher speed without notice, it’s not possible. We make sure this conversation is part of the pre-production process.

If the client is adamant about doing a very last minute change, we can flip our virtual production stage into a green screen in 1.2 seconds. Then it’s regular VFX comp.

What Happens with a Virtual Set After it Has Been Filmed?

Every physical asset belongs to production. The digital asset will eventually be shared with the visual effects house.

On a related note, there are grey areas around rights management.

For example: we use a forest scene, and the client says it’s their forest, but the VFX house says no, it’s public domain. Unless we heavily change the scene – by transforming it into a mystical, magical forest, let’s say – it becomes a point of contention.

It does raise the issue of what is public domain and what is actually an asset the production owns.

Send Hi-Res Assets

Use MASV to send video & images, audio files, production files, or entire project folders.

What Advancements in Virtual Production Do You See in the Immediate Future?

We know it’s just going to get better. It’s kind of a set answer, but we know that GPUs are going to get better. We’re using a 6000 series but we’re waiting for a 12,000. And after that, it’s going to be a 24,000 or the next crazy GPU. And that allows us to do even more photo-real material.

And as time progresses, people will begin to understand virtual production and the power of it, and its popularity will increase.

I actually see virtual production being used in television more.

I see virtual production tackling daily shoots and stuff like that. I can see physical decor construction shrinking by 80 percent in the next five years, which obviously impacts the set design industry.

I also think it’s going to have an impact on the visual language sooner rather than later. We see the language changing; things we thought were up-and-coming are de facto now, right? We watch stuff on TikTok, on Instagram, on YouTube, on streaming devices, etc. We’re now getting used to a new visual language. And I think virtual production is going to have a huge role to play.

We’d like to thank Richard Cormier for taking us through the ins and outs of virtual stage production, from the tech itself, to the people on set, to the changing face of filmmaking thanks to real-time, photorealistic renders of physical locations.

Virtual production has revolutionized how we create and enjoy media — and studios like MELs are at the forefront of this revolution. The key to a successful virtual production shoot is proper pre-planning, fluid communication, and of course, super high-resolution video and image assets.

Moving these heavy media files, for example, from a virtual set to a VFX team for compositing requires a robust file transfer solution that can handle the delivery of large videos.

MASV is a secure large file transfer solution capable of sharing terabytes of data over the cloud, fast and without fail. We are trusted by production studios, post teams, live broadcasters, and freelance video teams to send and receive copyrighted material across all stages of their respective workflows, from hours of RAW footage to hi-res proxies, DCPs, and large file formats such as DPX.

When you sign up for MASV, you get access to 20 GB of free data credits to use as you see fit. Click the button below to get started. 👇

MASV File Transfer

Get 20 GB to use with the fastest, large file transfer service available today, MASV.