We’re all used to the cartoon-like VR worlds that dominate the Metaverse, but photorealistic worlds aren’t far away now …


Love the Exponential Future? Join our XPotential Community, future proof yourself with courses from XPotential University, read about exponential tech and trends, connect, watch a keynote, or browse my blog.

If you’ve ever wanted to create and render virtual worlds that are literally indistinguishable from the real thing, then you’re in luck: Luma AI, a volumetric capture and immersive content creation company, has debuted a NeRF (Neural Radiance Field) plugin that lets developers run their captures in Epic’s Unreal Engine 5 as real-time renders.


The update allows Unreal developers to run Luma AI volumetric renderings locally, negating the need for streaming, geometry, or material alterations. Platform developers can import Luma AI volumetric captures into an Unreal project, immediately bringing the digitised object into their pipeline.


Developers creating immersive gaming content or virtual production assets can leverage Luma AI’s plugin to streamline their production pipelines. Moreover, brands and entertainment groups could use the technology stack to create bespoke content such as immersive media and experiences.

Luma AI’s “Video-to-3D” API allows XR developers to generate NeRF digital twins for a dollar a scene, taking only 30 minutes to create a virtual object. The pricing lets Luma AI democratise high-quality RT3D (real-time 3D) content creation for developers across a wide range of use cases. Luma AI states that its NeRF software suits use cases such as eCommerce inventory previews and VFX content production.
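As a rough illustration of how a video-to-3D workflow like this might be driven programmatically, the sketch below assembles an upload request for a capture job. Note that the endpoint URL, header, and field names are hypothetical placeholders, not Luma AI’s actual API.

```python
# Hypothetical sketch of submitting a source video to a video-to-3D
# capture service over HTTP. The endpoint, header, and field names are
# illustrative placeholders and NOT Luma AI's actual API surface.

API_URL = "https://api.example-capture-service.com/v1/captures"  # placeholder

def build_capture_request(video_path: str, api_key: str, title: str) -> dict:
    """Assemble the pieces of an upload request for a capture job."""
    return {
        "url": API_URL,
        "headers": {"Authorization": f"Bearer {api_key}"},
        "data": {"title": title},
        "files": {"video": video_path},  # opened and streamed at send time
    }

req = build_capture_request("walkthrough.mp4", "MY_KEY", "Showroom chair")
# The request could then be sent with an HTTP client such as requests;
# each accepted job would typically return an ID to poll until the
# reconstructed 3D asset is ready for download.
print(req["url"])
```

The point of the sketch is the shape of the workflow, upload once and poll for the finished asset, rather than any specific provider’s contract.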


The firm’s solution allows platform developers to embed NeRF objects directly into an RT3D engine, add textured objects, and import 360 scene captures.

Using Luma AI’s volumetric capture APIs, developers can create a digital twin of a real-life object and then import the file into Unreal 5 by installing the firm’s new plugin. From there, Unreal developers can simply drag and drop their NeRF file into a project, allowing for photorealistic assets in VR experiences.

Additionally, Luma AI provides four “Blueprints” to help Unreal developers import an asset into a virtual world so that it reflects its digital environment. Luma AI’s Baked Blueprint accurately simulates a captured object’s real-life lighting conditions. Dynamic Blueprints allow a NeRF object to change based on Unreal lighting and shadow conditions. Luma AI’s Cropped Blueprint provides an automatically cropped version of a NeRF object, where applicable. Finally, the Environment Blueprint simulates a suitable skybox for each NeRF capture.


Luma AI is also working on RT3D rendering solutions that reduce local computing requirements. For example, in February the firm introduced NeRF rendering tools that let developers preview RT3D content in web browsers.

In March, Luma AI secured roughly $20 million in a Series A funding round, accelerating Luma NeRF distribution for brands, studios, and smartphone users. In a statement regarding its successful Series A funding, Luma AI expressed that it is “aggressively expanding” its team to work on data and engineering systems that support large-scale renders and the “next generation of UX.”

The funding round introduced NVentures, NVIDIA’s venture capital arm, to Luma AI’s partners, which also include Matrix Partners, South Park Commons, and RFC’s Andreas Klinger.

Mohamed (Sid) Siddeek, the Corporate Vice President and Head of NVentures, said:

“Luma is one of the first to bring to market new 3D NeRF capture technology, enabling non-technical users to create higher quality 3D renders than has previously been possible. The team has consistently demonstrated the ability to rapidly productize new research, and their long-term vision of democratizing access to 3D workflow tools for consumers has the potential to drastically lower the skills barrier to create & edit in 3D.”


NVIDIA is also working on a NeRF service. Instant NeRF from NVIDIA leverages AI networks to create a complete volumetric capture of an object or environment with rich considerations of a subject’s lighting, shadows, and reflections.
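The core idea behind NeRF rendering can be sketched in a few lines: a neural network predicts a density and a colour at sample points along each camera ray, and those samples are alpha-composited into the final pixel, which is what makes the rich lighting, shadow, and reflection behaviour possible. A minimal NumPy sketch of that compositing step follows (the network itself is omitted, and the sample values are made up for illustration).

```python
import numpy as np

def composite_ray(densities, colors, deltas):
    """Alpha-composite radiance samples along a single camera ray.

    densities: (N,) volume density sigma at each sample point
    colors:    (N, 3) RGB emitted at each sample point
    deltas:    (N,) distance between consecutive samples
    Returns the final RGB value for the ray.
    """
    # Opacity of each segment: alpha_i = 1 - exp(-sigma_i * delta_i)
    alphas = 1.0 - np.exp(-densities * deltas)
    # Transmittance: probability the ray reaches sample i unblocked
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    weights = trans * alphas  # contribution of each sample to the pixel
    return (weights[:, None] * colors).sum(axis=0)

# Example: a ray crossing empty space, then a dense red region
sigma = np.array([0.0, 0.0, 50.0, 50.0])
rgb = np.array([[0, 0, 0], [0, 0, 0], [1, 0, 0], [1, 0, 0]], dtype=float)
dt = np.full(4, 0.1)
result = composite_ray(sigma, rgb, dt)
print(result)  # nearly pure red: the dense samples dominate the pixel
```

In a full NeRF pipeline this compositing runs for every pixel of every training view, and the network’s density and colour predictions are optimised until the rendered images match the captured photos.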


The service also includes tools to import volumetric content into NVIDIA’s Omniverse suite for creating enterprise-grade immersive solutions, such as a VR scene. NVIDIA developers can integrate their NeRF content across various use cases, such as property management, retail, and eCommerce.

Moreover, Epic Games provides a volumetric capture solution via RealityScan, an integrated photogrammetry service for Unreal 5. Using a sister smartphone application, the service allows users to create high-quality digital twins with a phone camera. The RealityScan smartphone app also employs AR visualisations and feedback to reduce output errors.


There are various NeRF services emerging from trusted RT3D solutions providers. XR Today listed a handful of volumetric capture applications and services to help lead developers into the new design territory.

NeRF and photogrammetry solutions prepare XR enterprise designers for an Industry 4.0 landscape that presents new digital transformation opportunities, and those opportunities will only grow as additional software, products, and updates debut and as RT3D engines become increasingly user-friendly and low-barrier.

About the author

Matthew Griffin

Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between 2020 and 2070, and is an award-winning futurist and author of the “Codex of the Future” series. Regularly featured in the global media, including AP, BBC, Bloomberg, CNBC, Discovery, RT, Viacom, and WIRED, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry and society, is unparalleled. Recognised for the past six years as one of the world’s foremost futurists, innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multi-nationals and regulators around the world envision, build and lead an inclusive, sustainable future. A rare talent, Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew’s clients include three Prime Ministers and several governments, including the G7, Accenture, Aon, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
