LOS ANGELES — Cary, NC-based Epic Games hosted its first SIGGRAPH Unreal Engine User Group and party in Los Angeles on Monday evening. The User Group featured an exciting lineup of presentations – from Epic Games Founder and CEO Tim Sweeney and Enterprise GM Marc Petit to companies as diverse as global architectural firm HOK, self-driving car start-up Zoox, and animation production company Digital Dimension, among others. Customer presentations and third-party integrations underscored how Unreal Engine is transforming content creation across animation, feature film, augmented and virtual reality (AR/VR), architecture, and product visualization.
“These incredible customers are just a sample of those turning to Unreal Engine to create high-quality, interactive content – whether for large-scale design visualizations, children’s animated TV shows, AR, VR or artificial intelligence applications,” said Marc Petit, Enterprise GM, Epic Games. “Visual fidelity in real time is what we’re known for – now we’re adding workflow tools that simplify pipelines considerably, and we look forward to demonstrating them at SIGGRAPH!”
Epic will showcase real-time productions, workflow tools, and interactive experiences at several events throughout this year’s SIGGRAPH, from Real-Time Live! to the VR Village to the Computer Animation Festival, and more.
Epic Games CTO Kim Libreri delivered the keynote, “Real-time technology and the future of production,” at the DigiPro Symposium on July 29 at the Beaudry Theatre at Los Angeles Center Studios.
Throughout the SIGGRAPH conference, Epic Games will coach teams at the VR Film Jam as they convert animated shorts into interactive VR experiences.
In addition, this is the first year that real-time production is represented at the Computer Animation Festival, and the only two real-time films in the lineup, "A Hard Day's Nite" and "The Human Race," are both Unreal Engine projects.
Lastly, Meet Mike in VR is a state-of-the-art digital human performance in virtual reality, powered by Unreal Engine. Spearheaded by FX Guide’s Mike Seymour as part of his PhD research at the University of Sydney, this widely supported global research project pushes the limits of hardware and software to provide a glimpse into the future of photorealistic, character-driven real-time production.