
Author: Caroline Chauvet

Image-Based Lighting for Virtual Production

What You’ll Learn in This Series

  • What image-based lighting (IBL) is and how it delivers superior visual effects quality and color fidelity.
  • How virtual production techniques supported by best-in-class lighting technologies can deliver a streamlined workflow for filmmaking and support increased cost savings and efficiencies in post-production.
  • How Kino Flo’s cutting-edge technology unlocks previously impossible workflows through new techniques, such as Camera Mode, which features subframe synchronization.
  • A look at recent projects that successfully used IBL and Kino Flo solutions.
  • How directors, cinematographers and others can achieve greater creative control via ‘what you see is what you get’ filmmaking tools.

Introduction to Kino Flo MIMIK 120 and Matchmakker Technology

One of the key challenges in virtual production is creating realistic full-spectrum light in LED volumes. Your audience’s eyes want to believe the light falling on the actors and props within a scene emanates from the background, as it does naturally in real-world locations. However, achieving a precise match while preserving full-spectrum color is a crucial issue in LED volumes and cannot be achieved by the panels on the wall alone.

The MIMIK 120 hybrid video tile addresses these challenges by deriving precise color and intensity values from a video signal via an innovative video processor-based workflow. This new series of creative blog posts will detail some of the creative challenges inherent in virtual production lighting and explore Kino Flo’s cutting-edge solutions.

Core Technology: MIMIK 120

MIMIK 120 is Kino Flo’s pioneering hybrid video tile engineered to create accurate and controllable full-spectrum light. Featuring a carbon fiber frame for enhanced mobility and flexibility, MIMIK offers cinematic color fidelity and extended spectral bandwidth for lifelike lighting effects.

Each MIMIK 120 consists of 7200 pixels at a 10mm pixel pitch. Unlike the standard three RGB emitters in conventional on-camera LED panels, MIMIK offers five LEDs per pixel, adding warm and cool white emitters (RGBWW) to fill in the portion of the spectrum absent in RGB panels.

In addition to an improved color spectrum, MIMIK 120 is also remarkably bright, with a maximum calibrated output of up to 10,000 nits. MIMIKs can be stacked and linked to create nearly any desired array, covering scenes at various scales.
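To get a feel for what those numbers mean in physical terms, here’s a rough back-of-the-envelope sketch. It assumes a 120 x 60 pixel grid per tile (which yields the stated 7200 pixels at a 10mm pitch); the exact grid layout is an illustrative assumption, not a published spec.

```python
# Back-of-the-envelope tile geometry, assuming a 120 x 60 pixel grid
# per MIMIK 120 tile (an illustrative assumption) at the stated 10 mm pitch.

PIXELS_WIDE, PIXELS_HIGH = 120, 60      # assumed grid: 120 * 60 = 7200 pixels
PITCH_MM = 10                           # stated pixel pitch

tile_width_m = PIXELS_WIDE * PITCH_MM / 1000    # -> 1.2 m
tile_height_m = PIXELS_HIGH * PITCH_MM / 1000   # -> 0.6 m

# A 3 x 4 stacked array for a mid-sized setup, for example:
rows, cols = 3, 4
print(f"Single tile: {tile_width_m:.1f} m x {tile_height_m:.1f} m, "
      f"{PIXELS_WIDE * PIXELS_HIGH} pixels")
print(f"{rows} x {cols} array: {cols * tile_width_m:.1f} m x "
      f"{rows * tile_height_m:.1f} m, {rows * cols * PIXELS_WIDE * PIXELS_HIGH} pixels")
```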

Core Technology: Matchmakker

Matchmakker is the patented, custom software solution that empowers MIMIK 120 to translate video signals into full-spectrum light. The software intelligently leverages video processor hardware to convert an incoming RGB signal into drive values for the fixture’s five individual emitters.

By harmonizing the background on-camera imagery with interactive light for the foreground actors and set, Matchmakker automatically creates a synchronized and highly accurate image. Starting with a video signal and ending with a full-spectrum lighting source provides a best-of-both-worlds approach. The video signal matches the intensity and color present in the background while delivering a broad-spectrum ambiance that preserves accurate skin tones and costume appearance.
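Matchmakker’s conversion is proprietary, but the underlying idea of expanding an RGB sample into five RGBWW drive values can be sketched in a few lines. The white-extraction and warm/cool split below are deliberately simplified heuristics for illustration, not Kino Flo’s actual calibrated algorithm.

```python
def rgb_to_rgbww(r, g, b):
    """Expand a normalized RGB sample (0..1) into five emitter values.

    Illustrative heuristic only: the shared grey component is moved onto
    the white emitters and split warm/cool by a crude red-vs-blue balance.
    Matchmakker's real conversion is proprietary and calibration-driven.
    """
    white = min(r, g, b)                                    # component all three channels share
    r_res, g_res, b_res = r - white, g - white, b - white   # residual saturated color

    # Warmth proxy: more red than blue -> favor the warm white emitter.
    warm_ratio = r / (r + b) if (r + b) > 0 else 0.5
    ww = white * warm_ratio          # warm white drive
    cw = white * (1 - warm_ratio)    # cool white drive
    return r_res, g_res, b_res, ww, cw

# Example: a warm, tungsten-ish street-light sample from a background plate
print(rgb_to_rgbww(1.0, 0.72, 0.42))
```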

Core Technology: Video Processor-based Workflow

MIMIK’s control via video processor unlocks previously unheard-of levels of speed and synchronization beyond the capabilities of DMX or LED panel-based lighting equipment. MIMIK is closely integrated with Megapixel VR’s HELIOS LED processor, a leading solution in the virtual production market.

In addition to converting video signals to lighting commands, the video processor-based workflow integrates with next-generation techniques such as subframe mapping. With its 30 kHz response time and subframe-accurate synchronization, MIMIK can produce multiple lighting setups simultaneously. For example, one camera could see an ICVFX background, while another sees only a green screen, and each receives perfectly matched lighting.
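As a conceptual illustration of subframe mapping, the sketch below divides one playback frame into slices and assigns each slice both a wall source and a matching lighting state, with some slices reserved for what the crew sees. The frame rate, slice count, and labels are assumptions for illustration, not a HELIOS or Ghostframe configuration.

```python
# Minimal sketch of subframe multiplexing: one playback frame is divided
# into slices, and each slice carries both a background source and a
# matching lighting state. All timing values are illustrative only.

FRAME_RATE = 24                      # project frame rate (assumed)
SLICES_PER_FRAME = 4                 # subframe slots (assumed)

slice_duration_ms = 1000 / FRAME_RATE / SLICES_PER_FRAME

# Each camera is genlocked to one slice and sees matched wall + lighting.
schedule = {
    0: {"camera": "A",  "wall": "ICVFX background", "light": "IBL from background"},
    1: {"camera": "B",  "wall": "green screen",     "light": "flat green-screen key"},
    2: {"camera": None, "wall": "ICVFX background", "light": "IBL from background"},  # crew-visible slot
    3: {"camera": None, "wall": "ICVFX background", "light": "IBL from background"},
}

for slot, cfg in schedule.items():
    start_ms = slot * slice_duration_ms
    viewer = cfg["camera"] or "crew/monitors"
    print(f"slot {slot} @ {start_ms:5.2f} ms -> {viewer}: "
          f"{cfg['wall']} + {cfg['light']}")
```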

Although MIMIK can be set to automatically mimic the color and intensity of any desired portion of a video signal, it can also work in conjunction with a traditional lighting desk. Depending on the desired usage, this flexibility means MIMIK can seamlessly integrate into nearly any workflow and crew configuration as a plug-and-play solution.

A single HELIOS processor supports up to 38 million pixels for highly intricate lighting setups. It also enables precise synchronization of lighting with video plates and other sources. For convenient cabling, the processor workflow uses long-distance connectivity that preserves signal quality and offers flexibility when the stage is remote from the processor and brain bar.
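For a sense of scale, a quick bit of arithmetic divides the quoted 38-million-pixel budget by the 7200 pixels in one MIMIK tile. The result is only an upper bound from the pixel count alone; real builds are also constrained by port counts, bandwidth, and layout, which this sketch ignores.

```python
# Rough upper bound on MIMIK tiles per processor from the pixel budget
# alone (ignores ports, bandwidth, and real-world layout limits).

PROCESSOR_PIXEL_BUDGET = 38_000_000   # quoted HELIOS capacity
PIXELS_PER_TILE = 7_200               # MIMIK 120 pixel count

max_tiles = PROCESSOR_PIXEL_BUDGET // PIXELS_PER_TILE
print(f"Pixel budget alone allows up to ~{max_tiles} MIMIK tiles per processor")
```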

With the core technologies described, the following sections explore potential use cases for MIMIK 120.

Use Case: Driving Scenes

Although MIMIK isn’t limited to use with LED volumes, virtual production highlights its strengths. Driving scenes are a widespread use case for LED volumes due to the increased convenience and safety compared to shooting car scenes on the open road. The major challenge for driving scenes in virtual production, especially in nighttime settings, is providing convincing interactive lighting.

For example, every time a street light or other bright object appears in the background imagery, the eye naturally expects to see it both reflected in the car and influencing the lighting of the actors in the vehicle. The traditional approach requires bespoke lighting gags and instruments run manually or through DMX setups to approximate interactive lighting. However, these approaches are neither consistently accurate nor easily repeatable over multiple shoot days and takes.

By using the source video footage to drive interactive lighting through the video processor and Matchmakker, MIMIK 120 provides frame- and subframe-accurate synchronization with the background plates and precise, repeatable effects. The result is a much more efficient setup, accurate interactive lighting for virtual production, and continuously usable footage from the start to the finish of each take.
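Conceptually, image-based lighting for a driving plate boils down to sampling regions of each background frame and driving zones of the lighting array from those samples, in lockstep with playback. The NumPy sketch below averages a frame into a coarse grid of zones to show the sampling idea; it stands in for, and is far simpler than, the actual Matchmakker and video processor pipeline.

```python
import numpy as np

def sample_zones(frame, grid=(2, 4)):
    """Average a background video frame into a coarse grid of RGB samples.

    Conceptual illustration of image-based interactive lighting: each grid
    cell's average drives one zone of lighting tiles, frame by frame, so a
    passing street light in the plate becomes a moving pulse of light on
    the talent.
    """
    h, w, _ = frame.shape
    rows, cols = grid
    zones = np.zeros((rows, cols, 3))
    for r in range(rows):
        for c in range(cols):
            cell = frame[r * h // rows:(r + 1) * h // rows,
                         c * w // cols:(c + 1) * w // cols]
            zones[r, c] = cell.reshape(-1, 3).mean(axis=0)
    return zones

# Example: a dark night plate with one bright "street light" patch.
frame = np.full((1080, 1920, 3), 0.02)
frame[200:400, 1500:1700] = [1.0, 0.85, 0.6]   # warm practical in the plate
print(np.round(sample_zones(frame), 3))
```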

Use Case: Camera Mode with Ghostframe

Camera Mode with Ghostframe is a groundbreaking new technique that leverages video processor technology to enable multiple cameras to receive different content while capturing the same LED screen. For example, one camera could capture a live 3D animated background while another gets a green screen for post-production flexibility. Or two cameras with discrete, overlapping frustums could capture multiple in-camera VFX angles simultaneously. Ghostframe displays a dominant image to the viewer and suppresses or “ghosts” the alternate subframe images that the other cameras can see.

Camera Mode uses genlock and camera shutter speed to synchronize each camera to a different subframe slice of content within a second of playback. A second can be subdivided into as many as 30 different sources for each camera, while a dominant source can be selected to be the most visible to the studio crew. MIMIK further enhances the subframe workflow by providing illumination capable of synchronizing at high speed and with frame accuracy to ensure each camera receives the desired lighting effect appropriate to its synchronized image feed.
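One timing consequence worth noting: for a camera to capture only its own slice, its shutter can be open no longer than one slice. The short sketch below works out that limit for a few illustrative slice counts; the numbers are assumptions, not Ghostframe specifications.

```python
# Rough shutter-timing arithmetic for subframe capture. The slice counts
# below are illustrative; consult Ghostframe/HELIOS documentation for the
# actual supported configurations.

def max_shutter_for_slices(slices_per_second):
    """Longest shutter time (seconds) that still fits inside one slice."""
    return 1.0 / slices_per_second

for slices in (24, 30, 48):
    t = max_shutter_for_slices(slices)
    print(f"{slices} slices/s -> shutter <= 1/{round(1 / t)} s ({t * 1000:.1f} ms)")
```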

Camera Mode may sound somewhat complex, but it’s integrated into the video processor workflow with a straightforward setup. Once in use, the technique unlocks many valuable opportunities. For example, when time with key talent is limited, a single take can capture various backgrounds, significantly increasing the amount of completed footage that can be achieved in a given schedule.

It also overcomes one of the critical challenges to in-camera VFX: when the captured footage needs to be altered in post-production for creative or technical reasons. By capturing a live in-camera VFX background and a green screen version for safety in one take, subframe mapping represents a best-of-both-worlds solution for virtual production.

MIMIK also lends itself well to other use cases, which we’ll explore in more detail in upcoming posts in this series. The following sections compare image-based lighting with other existing virtual production lighting workflows.

MIMIK Compared to Conventional LED Panels

First-generation LED volumes were often built with ceilings to provide interactive lighting. Typically, the ceiling panels used lower pixel densities because the camera wasn’t intended to view them directly. While this approach provided a ballpark approximation of interactive light, the RGB emitters of conventional LED panels resulted in reduced color rendition, inaccurate skin tones and metamerism, where props and costumes don’t appear with accurate color on camera.

MIMIK 120 combines RGB and white LEDs to provide a broader color spectrum, delivering more accurate skin tones and proper prop and costume color rendition. MIMIK offers far higher precision than a standard LED panel, with 7200 individually addressable pixels per unit for deeply nuanced lighting effects. Compared to traditional LED panels, MIMIK delivers superior brightness and a refresh rate of up to 30 kHz for more dynamic scenes and lighting effects.

MIMIK Compared to Cinema Lights with DMX/Pixel Mapping

Before MIMIK, cinema lights with DMX control were the preferred solution for full-spectrum lighting in virtual production. Using pixel mapping applications, operators define areas of a video signal to sample and then send the resulting intensity and RGB, CCT or other color commands to the fixtures.

While pixel mapping cinema lights preserve full-spectrum color, DMX wasn’t designed for subframe accuracy or high-speed lighting changes. DMX is also limited to a maximum of 512 channels per universe. MIMIK overcomes these limitations via Matchmakker, addressing far more channels than an equivalent DMX universe could contain. MIMIK also offers subframe speed, unlocking Camera Mode with Ghostframe techniques that would be difficult or impossible with pixel mapping and DMX solutions.
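To put the 512-channel ceiling in perspective, the arithmetic sketch below compares the number of RGB zones a single DMX universe can address with the channel count implied by one MIMIK tile, assuming one control channel per emitter (an illustrative mapping, not the actual Matchmakker addressing scheme).

```python
# Channel-count comparison: one DMX universe vs. one MIMIK tile, assuming
# one control channel per emitter (illustrative mapping only).

DMX_CHANNELS_PER_UNIVERSE = 512
CHANNELS_PER_RGB_FIXTURE = 3          # simple RGB pixel-mapped fixture
PIXELS_PER_MIMIK_TILE = 7_200
EMITTERS_PER_PIXEL = 5                # RGBWW

rgb_zones_per_universe = DMX_CHANNELS_PER_UNIVERSE // CHANNELS_PER_RGB_FIXTURE
mimik_channels = PIXELS_PER_MIMIK_TILE * EMITTERS_PER_PIXEL
universes_needed = -(-mimik_channels // DMX_CHANNELS_PER_UNIVERSE)  # ceiling division

print(f"One DMX universe: at most {rgb_zones_per_universe} RGB zones")
print(f"One MIMIK tile implies ~{mimik_channels} channels "
      f"(~{universes_needed} DMX universes' worth)")
```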

Conclusion: Creativity at the Speed of Light

The key advantages of MIMIK 120 include full-spectrum, image-based lighting coupled with game-changing speed and synchronization capabilities. Please join us for the rest of this series to learn how MIMIK is being used and what creative possibilities it offers.

In subsequent posts in this series, we’ll cover topics including cinematic color fidelity, RGB to RGBWW conversion, and a primer for gaffers and lighting board operators. We’ll also explore case studies, including the hybrid in-camera/traditional VFX of Shrapnel with Sam Nicholson, ASC, and a virtual production music video workflow with director Snehal Patel. Anything is possible with creativity at the speed of light.