
Luma AI brings photorealistic virtual worlds closer to reality


WHY THIS MATTERS IN BRIEF

We’re all used to the cartoon-like VR worlds that dominate the Metaverse, but photorealistic worlds aren’t far away now …

 

Love the Exponential Future? Join our XPotential Community, future proof yourself with courses from XPotential University, read about exponential tech and trends, connect, watch a keynote, or browse my blog.

If you’ve ever wanted to create and render virtual worlds that are literally indistinguishable from the real thing then you’re in luck: Luma AI has debuted a NeRF-based plugin for Epic’s Unreal Engine 5, a volumetric capture and immersive content creation tool that lets developers run their captures as real-time renders.

 

RELATED
Revolutionary new SkinGun fires stem cells to heal burns without scarring

 

The update allows Unreal developers to run Luma AI volumetric renderings locally, negating the need for streaming, geometry, or material alterations. Platform developers can import Luma AI volumetric captures into an Unreal project and immediately see the digitised object in their pipeline.

 

The Future of the Metaverse, by keynote Matthew Griffin

 

Developers creating immersive gaming content or virtual production assets can leverage Luma AI’s plugin to streamline their production pipelines. Moreover, brands and entertainment groups could leverage the technology stack to create bespoke content such as immersive media and experiences.

Luma AI’s “Video-to-3D” API allows XR developers to generate NeRF digital twins for a dollar a scene, with each virtual object taking around 30 minutes to create. The pricing allows Luma AI to democratise high-quality RT3D content creation for developers across a wide range of use cases. Luma AI states that its NeRF software is applicable to use cases such as eCommerce inventory previews and VFX content production.
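
For developers who want a feel for how an API like this is typically consumed, the sketch below shows a generic submit-then-poll workflow in Python. The endpoint, field names, and response keys are hypothetical placeholders rather than Luma AI’s documented API; only the pricing and turnaround figures above come from the article.

```python
import time
import requests

# Hypothetical endpoint and field names -- Luma AI's real Video-to-3D API
# will differ; this only illustrates the general submit-then-poll pattern.
API_BASE = "https://api.example-luma.com/v1"
API_KEY = "YOUR_API_KEY"


def submit_capture(video_path: str) -> str:
    """Upload a walkaround video and return a capture/job id."""
    with open(video_path, "rb") as f:
        resp = requests.post(
            f"{API_BASE}/captures",
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"video": f},
            timeout=60,
        )
    resp.raise_for_status()
    return resp.json()["capture_id"]


def wait_for_nerf(capture_id: str, poll_seconds: int = 60) -> str:
    """Poll until processing finishes (the article quotes roughly 30 minutes)
    and return a download URL for the generated NeRF asset."""
    while True:
        resp = requests.get(
            f"{API_BASE}/captures/{capture_id}",
            headers={"Authorization": f"Bearer {API_KEY}"},
            timeout=30,
        )
        resp.raise_for_status()
        job = resp.json()
        if job["status"] == "complete":
            return job["asset_url"]
        if job["status"] == "failed":
            raise RuntimeError("NeRF generation failed")
        time.sleep(poll_seconds)


if __name__ == "__main__":
    capture_id = submit_capture("walkaround.mp4")
    print("NeRF ready at:", wait_for_nerf(capture_id))
```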

 

RELATED
Working in the Metaverse gets closer as Microsoft goes all in

 

The firm’s solution allows platform developers to embed NeRF objects directly into an RT3D engine, add textured objects, and import 360-degree scene captures.

Using Luma AI’s volumetric capture APIs, developers can create a digital twin of a real-life object and then import the file into Unreal 5 by installing the firm’s new plugin. From there, Unreal developers can simply drag and drop their NeRF file into a project, enabling photorealistic assets in VR experiences.
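
Dragging and dropping is the workflow described above; for teams that prefer scripted pipelines, the same placement can, in principle, be automated with Unreal’s built-in Editor Python API. The sketch below is a minimal, hedged example: the content path is hypothetical and simply stands in for wherever the Luma plugin puts an imported capture in the Content Browser.

```python
# Runs inside the Unreal Editor's Python console (Editor Scripting Utilities
# enabled). The asset path below is hypothetical -- it stands in for wherever
# the imported NeRF capture lands in your Content Browser.
import unreal

ASSET_PATH = "/Game/LumaCaptures/MyNerfCapture"  # hypothetical content path

asset = unreal.EditorAssetLibrary.load_asset(ASSET_PATH)
if asset is None:
    raise RuntimeError(f"Could not find an imported capture at {ASSET_PATH}")

# Place the capture in the open level, equivalent to dragging it from the
# Content Browser into the viewport.
location = unreal.Vector(0.0, 0.0, 100.0)
rotation = unreal.Rotator(0.0, 0.0, 0.0)
actor = unreal.EditorLevelLibrary.spawn_actor_from_object(asset, location, rotation)
unreal.log(f"Spawned {actor.get_name()} from {ASSET_PATH}")
```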

Additionally, Luma AI provides four “Blueprints” to help Unreal developers import an asset so that it sits naturally in its digital environment. Luma AI’s Baked Blueprint accurately simulates a captured object’s real-life lighting conditions. Dynamic Blueprints let a NeRF object respond to Unreal lighting and shadow conditions. Luma AI’s Cropped Blueprint provides an automatically cropped version of a NeRF object, where applicable. Finally, the Environment Blueprint simulates a suitable skybox for each NeRF capture.

 

RELATED
Scientists double the speed of quantum storage devices for quantum computers

 

Luma AI is also working on RT3D rendering solutions that reduce local computing requirements. For example, in February, the firm introduced NeRF rendering tools that enable developers to preview RT3D content in web browsers.

In March, Luma AI secured roughly $20 million in a Series A funding round, accelerating Luma NeRF distribution for brands, studios, and smartphone users. In a statement regarding its successful Series A funding, Luma AI expressed that it is “aggressively expanding” its team to work on data and engineering systems that support large-scale renders and the “next generation of UX.”

The funding round introduced NVentures, NVIDIA’s venture capital group, to Luma AI’s partners, which also include Matrix Partners, South Park Commons, and RFC’s Andreas Klinger.

Mohamed (Sid) Siddeek, the Corporate Vice President and Head of NVentures, said:

“Luma is one of the first to bring to market new 3D NeRF capture technology, enabling non-technical users to create higher quality 3D renders than has previously been possible. The team has consistently demonstrated the ability to rapidly productize new research, and their long-term vision of democratizing access to 3D workflow tools for consumers has the potential to drastically lower the skills barrier to create & edit in 3D.”

 

RELATED
The US Marines want to build a fully autonomous F-35

 

NVIDIA is also working on a NeRF service. Instant NeRF from NVIDIA leverages AI networks to create a complete volumetric capture of an object or environment, faithfully reproducing a subject’s lighting, shadows, and reflections.
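
Under the hood, any NeRF represents a scene as a neural network that maps a 3D position and viewing direction to a colour and a density, and pixels are rendered by compositing samples along each camera ray. The NumPy sketch below shows only that standard compositing step, as a conceptual illustration rather than Luma AI’s or NVIDIA’s implementation.

```python
import numpy as np


def composite_ray(colors, densities, deltas):
    """Classic NeRF volume-rendering step: blend per-sample colours along a
    ray using densities (sigma) and inter-sample distances (deltas).

    colors:    (N, 3) RGB predicted at each sample point
    densities: (N,)   sigma predicted at each sample point
    deltas:    (N,)   distance between consecutive samples
    """
    alphas = 1.0 - np.exp(-densities * deltas)          # opacity per sample
    # Transmittance: how much light survives up to each sample
    transmittance = np.cumprod(np.concatenate(([1.0], 1.0 - alphas[:-1])))
    weights = transmittance * alphas                    # contribution per sample
    return (weights[:, None] * colors).sum(axis=0)      # final pixel colour


# Toy example: four samples along one camera ray
colors = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]], dtype=float)
densities = np.array([0.1, 0.8, 0.3, 0.05])
deltas = np.full(4, 0.25)
print(composite_ray(colors, densities, deltas))
```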

 

Welcome to Luma AI

 

The service also includes tools to import volumetric content into NVIDIA’s Omniverse suite for creating enterprise-grade immersive solutions, such as a VR scene. NVIDIA developers can integrate their NeRF content across various use cases, such as property management, retail, and eCommerce.
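
Because Omniverse scenes are built on OpenUSD, a capture that has been exported to a USD-compatible asset can be referenced into a larger scene like any other file. The sketch below uses the open-source usd-core Python package to show that referencing step; the file names are hypothetical and the export itself would come from the tools described above.

```python
# Minimal OpenUSD sketch (pip install usd-core). The capture file referenced
# here is hypothetical -- it stands in for a NeRF capture exported to USD.
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.CreateNew("showroom.usda")
world = UsdGeom.Xform.Define(stage, "/World")
stage.SetDefaultPrim(world.GetPrim())

# Reference the exported capture under its own transform prim, then place it.
capture = stage.DefinePrim("/World/NerfCapture", "Xform")
capture.GetReferences().AddReference("./exports/nerf_capture.usd")
UsdGeom.XformCommonAPI(capture).SetTranslate(Gf.Vec3d(0.0, 0.0, 0.0))

stage.GetRootLayer().Save()
```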

Moreover, Epic Games provides its own volumetric capture solution via RealityScan, a capture service integrated with Unreal 5. Using a companion smartphone application, the service lets users create high-quality digital twins with a phone camera. The RealityScan smartphone app also employs AR visualisations and feedback to reduce output errors.

 

RELATED
Dubai plans on becoming the world's third metaverse city

 

There are various NeRF services emerging from trusted RT3D solutions providers, and XR Today has listed a handful of volumetric capture applications and services to help developers find their way into the new design territory.

NeRF and photogrammetry solutions prepare XR enterprise designers for an Industry 4.0 landscape full of new digital transformation opportunities, and those opportunities will only grow as additional software, products, and updates debut and as RT3D engines become increasingly user-friendly and low-barrier.
