Paul Debevec presents “The Full Spectrum of Virtual Production” to the San Francisco Chapter of ACM SIGGRAPH on May 17, 2023.

Date and time: Wednesday, May 17, 2023 · 6 – 9pm PDT

Location: Online

Today’s virtual production stages surround actors with the light of millions of LEDs, creating real-world image-based lighting in addition to in-camera backgrounds. But these stages often fall short of lighting actors the way they would appear if they were actually in the intended environments. LED panels lack the dynamic range and maximum intensity needed for many lighting environments, especially those with direct sunlight, and stages rarely cover every angle light can come from. Most notably, today’s LED panels have poor color rendition: their spectrum consists of narrow peaks at red, green, and blue wavelengths and misses important parts of the spectrum in the orange, yellow, and cyan ranges. While the panels can display just about any color the eye can see, their spectrum is so different from the sources we are used to (such as daylight and incandescent light) that actors and sets exhibit strange color shifts when lit on a VP stage: skin tones shift to red or pink, oranges shift to red, yellows become dim and brownish, and cyans can shift to blue. In this talk, I’ll describe the underlying color science issues and show how both software and hardware improvements offer solutions, drawing upon lighting reproduction research at the USC Institute for Creative Technologies and Netflix Research.
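The color-rendition problem the abstract describes comes down to spectral multiplication: the light reaching a camera or eye is the illuminant spectrum times the surface reflectance, integrated against the observer's spectral sensitivities. Two illuminants can therefore be balanced to the same white yet render the same material to different colors. A minimal numerical sketch of this effect follows; every spectrum, peak wavelength, and matching-function shape here is an illustrative assumption, not measured data:

```python
import numpy as np

def gaussian(wl, mu, sigma):
    """Unnormalized Gaussian bump, used to sketch spectra and sensitivities."""
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

wl = np.arange(380.0, 781.0, 1.0)  # visible wavelengths, nm, 1 nm steps

# Crude single-Gaussian stand-ins for the CIE color matching functions
# (illustrative only -- the real CMFs have more structure).
xbar = gaussian(wl, 600, 40)
ybar = gaussian(wl, 555, 45)
zbar = gaussian(wl, 450, 25)

def to_xyz(spectrum):
    # Integrate against the matching functions (uniform 1 nm grid, so a sum).
    return np.array([np.sum(spectrum * cmf) for cmf in (xbar, ybar, zbar)])

def adapted_chromaticity(illum, refl):
    """xy chromaticity of a reflectance under an illuminant, after a crude
    von Kries-style white balance so the two illuminants compare fairly."""
    adapted = to_xyz(illum * refl) / to_xyz(illum)
    s = adapted.sum()
    return adapted[0] / s, adapted[1] / s

# A broadband "daylight-like" illuminant vs. a narrowband RGB LED panel
# (peak positions and widths are assumed, loosely typical of display LEDs).
daylight = np.ones_like(wl)
rgb_led = gaussian(wl, 630, 10) + gaussian(wl, 530, 15) + gaussian(wl, 465, 10)

# A hypothetical broadband skin-like reflectance, rising toward long wavelengths.
skin = 0.25 + 0.5 * gaussian(wl, 650, 120)

xy_day = adapted_chromaticity(daylight, skin)
xy_led = adapted_chromaticity(rgb_led, skin)
print("under broadband light: x=%.3f y=%.3f" % xy_day)
print("under RGB LED light:   x=%.3f y=%.3f" % xy_led)
```

Even after both illuminants are balanced to the same nominal white, the broadband and narrowband sources render the skin-like reflectance to different chromaticities, because the LED spectrum samples the reflectance only at its three peaks and skips the in-between wavelengths. This is the kind of shift the abstract attributes to RGB-only panels.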

Speaker Bio:

Paul Debevec is the Chief Research Officer of Netflix’s Eyeline Studios, where he oversees R&D in computer vision, computer graphics, and machine learning with applications in visual effects and virtual production. Paul’s 2002 Light Stage 3 system at the USC Institute for Creative Technologies was the first LED stage to illuminate live-action actors with imagery of digital sets for virtual production. Techniques from Paul’s work have been used to create key visual effects sequences in The Matrix, Spider-Man 2, Benjamin Button, Avatar, Gravity, Furious 7, Blade Runner 2049, Gemini Man, Free Guy, and numerous video games, and to record a 3D portrait of US President Barack Obama. Today, Paul’s light stage facial capture systems are helping numerous visual effects and technology companies create photoreal digital actors and advance ML datasets for facial appearance. Paul’s early work in high dynamic range imaging, image-based lighting, and light stage facial capture has been recognized with two technical Academy Awards, SMPTE’s Progress Medal, and a Lifetime Achievement Emmy Award. Paul is a Fellow of the Visual Effects Society, a member of the Television Academy’s Science and Technology Peer Group, a Governor of the Visual Effects Branch of the Academy of Motion Picture Arts and Sciences, and co-chair of the Academy’s Science and Technology Council. More info at: