Army-sponsored research used to produce first-of-its-kind models of a president
June 26, 2014
By Orli Belman, Institute for Creative Technologies; and Joyce M. Conant, ARL Public Affairs
- The Smithsonian's 3-D presidential portrait project represents the first deployment of a Light Stage system designed for mobile use
- 3-D printed presidential bust and life mask displayed at the first-ever White House Maker Faire
- ICT is an Army-sponsored, University Affiliated Research Center, managed by the U.S. Army Research Laboratory
The University of Southern California's Institute for Creative Technologies (ICT) was part of a Smithsonian-led team that created 3-D portraits of President Barack Obama. The portraits include a digital and 3-D printed bust and life mask. Both were on display at the first-ever White House Maker Faire on June 18.
The team scanned the President earlier this year using two distinct 3-D documentation processes. Experts from ICT used their Light Stage face scanner to capture high-resolution shape and reflectance properties of the President's face in seconds. Next, a Smithsonian team used handheld 3-D scanners and traditional digital cameras to record peripheral 3-D data to create an accurate bust. The data and the printed models are intended to become part of the collection of the Smithsonian's National Portrait Gallery.
"The Smithsonian's 3-D presidential portrait project represents the first deployment of a Light Stage system designed for mobile use and the fastest scanning session ever conducted by ICT's graphics laboratory," said Paul Debevec, ICT's chief visual officer and the inventor of the Light Stage technologies. "The Smithsonian Institution had an ambitious vision to create the first ever 3-D printed model of a president and it was an honor to contribute our technology to the process."
ICT is an Army-sponsored, University Affiliated Research Center, managed by the U.S. Army Research Laboratory (ARL) and devoted to advancing the art and science of simulation. Institute research and development efforts can be seen throughout the Army, including virtual reality therapy for treating post-traumatic stress, interactive virtual humans for training and education, and tools and techniques for creating low-cost, immersive head-mounted displays. The Army funds much of the basic research that goes into the development of the Light Stage systems.
Debevec said the graphics laboratory's involvement in the presidential scanning project represents several breakthroughs. It provides the basis for a new form of presidential portrait, one that can digitally recreate every skin pore and fine line and present it in 3-D. It also demonstrates that the Light Stage is portable, which he said opens up opportunities to scan people and objects all over the world.
"This collaboration is a great symbol of the imagination and innovation that the government, academia and industry can accomplish by working together," said Randall W. Hill, Jr., executive director of ICT. "The final result is amazing and shows the power that Army-sponsored research can have in developing technologies to preserve our present and past."
Hill said the ICT graphics laboratory has refined its facial rendering techniques in collaboration with Hollywood's visual effects industry, helping digitize the stars of movies including Avatar, Gravity and Maleficent to create computer-generated characters with the appearance of real people. Hill added that the Light Stage process has been used to help create believable digital characters for the Emergent Leaders Immersive Training Environment (ELITE), developed with ARL as a state-of-the-art, low-overhead training platform for interpersonal communication and leadership skills.
"The partnership between ARL and the ICT was initiated to leverage the creative talents of the entertainment industry while pushing the state-of-the-art in simulation and training technologies," said ARL's John Hart, who is the chief of the Human Research and Engineering Directorate's Creative Technologies Branch located in Orlando, Fla., and the program manager for ICT. "This effort is a result of our interest in creating photorealistic virtual humans to train Soldiers on social skills like negotiation, interacting with different cultures and developing leadership skills."
Current ICT projects include recording and projecting 3-D, life-sized depictions of heroes and historical figures, including recent Medal of Honor recipient Ty Carter and Holocaust survivor Pinchas Gutter, in collaboration with the USC Shoah Foundation and in partnership with Conscience Display.
"When combined with artificial intelligence algorithms from ICT's natural language group, these interactive projections can answer people's questions about their lives and experiences," said Hill.
In 2010, Debevec and his collaborators received a Scientific and Engineering Academy Award for the development of the devices, along with systems for creating realistic digital faces. In addition to Debevec, key ICT contributors to the presidential scanning project were Graham Fyffe, Xueming Yu, Maj. Paul Graham and Jay Busch. The recent Maker Faire was the first such event at the White House. It was designed as an opportunity to showcase and inspire American innovation.
ICT's Light Stage: How it Works
USC ICT's Mobile Light Stage has 50 light sources (each with a cluster of 12 bright white LEDs) arranged similarly to the stars on the American flag. Each light is its own custom-designed networked computer and can produce any brightness or polarization of light.
Each Light Stage scan is a flurry of 10 photos taken in about a second by each of eight cameras as the lights emit different gradients of intensity and polarization, generating 80 photographs tightly zoomed in on the face from all angles. At the end, six auxiliary cameras each take one photo under flat lighting, framing down to the shoulders and the rest of the bust.
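The capture arithmetic above can be sketched in a few lines of Python. This is purely illustrative, not ICT's capture software; the names and loop structure are assumptions based only on the counts stated in the article.

```python
# Illustrative sketch of the Light Stage capture schedule described above.
# 10 lighting conditions (gradients of intensity and polarization), each
# photographed simultaneously by 8 tightly framed face cameras.
GRADIENT_CONDITIONS = 10
FACE_CAMERAS = 8
AUX_CAMERAS = 6  # each takes one flat-lit photo framing the bust

# Every (condition, camera) pair corresponds to one photograph.
gradient_photos = [
    (condition, camera)
    for condition in range(GRADIENT_CONDITIONS)
    for camera in range(FACE_CAMERAS)
]
flat_lit_photos = [("flat", camera) for camera in range(AUX_CAMERAS)]

print(len(gradient_photos))                         # 80 face photographs
print(len(gradient_photos) + len(flat_lit_photos))  # 86 photographs total
```

The whole gradient sequence fits in about a second because all eight cameras fire together for each lighting condition, so only 10 exposures of time are needed.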
Computer algorithms identify corresponding points on the face across the different viewpoints, allowing a rough 3-D model to be triangulated from the correspondences. The differently polarized lighting conditions allow the shine off the skin's surface to be imaged separately from the light that scatters beneath the skin, and the play of light over the skin's shiny component records minute texture details, at the level of skin pores and fine creases, to a tenth-of-a-millimeter accuracy.
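The separation of surface shine from subsurface light rests on a standard trick of polarization-difference imaging: a cross-oriented polarizer blocks the mirror-like surface reflection, so subtracting the cross-polarized image from the parallel-polarized one isolates the shine. The sketch below shows the idea on tiny synthetic 2x2 "images"; the numbers are invented and this is not ICT's processing pipeline.

```python
import numpy as np

# Cross-polarized image: the camera's polarizer blocks mirror-like
# surface reflection, leaving only light that scattered beneath the skin.
cross = np.array([[0.30, 0.32],
                  [0.31, 0.29]])

# Parallel-polarized image: subsurface light plus the surface shine.
parallel = np.array([[0.55, 0.34],
                     [0.33, 0.80]])

diffuse = cross               # subsurface (skin-color) component
specular = parallel - cross   # surface-shine component

# The specular image is where pore-level surface detail shows up.
print(specular)
```

In practice each camera records both polarization states among the 10 lighting conditions, and the resulting specular channel drives the recovery of fine surface geometry.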
The final 3-D model, covering the face from ear to ear and forehead to tie, is a digital surface mesh of several million tiny triangles, with corresponding 30-megapixel digital images of the face's texture and shine.