Paul Debevec is a Senior Scientist at Google VR and Adjunct Research Professor of Computer Science in the Viterbi School of Engineering at the University of Southern California, working within the Vision and Graphics Laboratory at the USC Institute for Creative Technologies.
Debevec's computer graphics research has been recognized with ACM SIGGRAPH's first Significant New Researcher Award in 2001 for "Creative and Innovative Work in the Field of Image-Based Modeling and Rendering". With Tim Hawkins, John Monos, and Mark Sagar, he received a Scientific and Engineering Academy Award® in 2010 for "the design and engineering of the Light Stage capture devices and the image-based facial rendering system developed for character relighting in motion pictures". In 2017 he received the SMPTE Progress Medal in recognition of his achievements and ongoing work in pioneering techniques for illuminating computer-generated objects based on measurement of real-world illumination, and their effective commercial application in numerous Hollywood films. In 2014, he was profiled in Margaret Talbot's New Yorker article "Pixel Perfect: The Scientist Behind the Digital Cloning of Actors" and worked with the Smithsonian Institution to scan a 3D model of President Barack Obama at the White House.
Debevec earned degrees in Math and Computer Engineering at the University of Michigan in 1992 and a Ph.D. in Computer Science from UC Berkeley in 1996. In 1991, he combined techniques from computer vision and computer graphics to create an image-based model of a Chevette automobile from photographs. At Interval Research Corporation he contributed to Michael Naimark's Immersion '94 virtual exploration of Banff National Park and collaborated with Golan Levin on the interactive art installation Rouen Revisited.
Debevec's 1996 Ph.D. thesis with Prof. Jitendra Malik presented Façade, an image-based modeling system for creating virtual cinematography of architectural scenes using new techniques for photogrammetry and image-based rendering. Using Façade he directed a photorealistic fly-around of the Berkeley campus for his 1997 film The Campanile Movie whose techniques were later used to create the Academy Award-winning virtual backgrounds in the "bullet time" shots in the 1999 film The Matrix.
Following his Ph.D., Debevec pioneered techniques for illuminating computer-generated objects with measurements of real-world illumination. His 1999 film Fiat Lux rendered towering monoliths and gleaming spheres into a photorealistic reconstruction of St. Peter's Basilica, realistically illuminated by the light that was actually there. Techniques from this research, known as HDRI and Image-Based Lighting, have been used to dramatic effect in films such as The Matrix sequels, The Curious Case of Benjamin Button, Terminator: Salvation, District 9, and Avatar. Debevec led the design of HDR Shop, a high dynamic range image editing program for visual effects production, and co-authored the 2005 book High Dynamic Range Imaging, now in its second edition. Debevec's 2004 computer animation The Parthenon used 3D scanning, inverse global illumination, HDRI, and image-based lighting to virtually reunite the Parthenon and its sculptures, contributing to depictions of the Parthenon's history for the 2004 Olympics, NHK television, PBS's NOVA, National Geographic, the IMAX film Greece: Secrets of the Past, and the Louvre.
At USC ICT Debevec has led the development of several Light Stage systems that capture and simulate how people and objects appear under real-world illumination. Early Light Stage processes have been used by Sony Pictures Imageworks, WETA Digital, and Digital Domain to create photoreal digital actors in award-winning visual effects in films such as Spider-Man 2, King Kong, Superman Returns, Spider-Man 3, Hancock, and The Curious Case of Benjamin Button. The most recent Light Stage process, based on polarized gradient illumination, has been used in numerous films including James Cameron's Avatar, The Avengers, Oblivion, Ender's Game, Gravity, and Maleficent. This high-resolution facial scanning process was used in 2008's Digital Emily project, a collaboration with Image Metrics that produced one of the first digital facial performances to cross the "Uncanny Valley", and in Digital Ira, a collaboration with Activision that produced one of the earliest photoreal real-time digital characters.
Collaborating with virtual reality pioneer Mark Bolas, Debevec has developed glasses-free 3D displays involving spinning display surfaces and video projector arrays for applications such as 3D Teleconferencing and, in collaboration with USC's Shoah Foundation, preserving the testimony of survivors of the Holocaust.
In 2002 Debevec was named one of the world's top 100 young innovators by MIT's Technology Review magazine, and in 2005 received a Gilbreth Lectureship from the National Academy of Engineering. In 2005 Debevec received the Special Award for a Distinguished Professional Career in Animation/VFX from the Mundos Digitales Festival in A Coruña, Spain, and in 2009 received the "Visionary Award for VFX" at the 3rd Annual Awards for the Electronic and Animated Arts.
Debevec is a member of the Academy of Motion Picture Arts and Sciences, a Fellow of the Visual Effects Society, and a member of ACM SIGGRAPH. From 2012 to 2018 he served as a member and then co-chair of the AMPAS Science and Technology Council. He chaired the SIGGRAPH 2007 Computer Animation Festival and co-chaired Pacific Graphics 2006 and the 2002 Eurographics Workshop on Rendering. From 2008 to 2014, he served on the Executive Committee and as Vice-President of ACM SIGGRAPH.