
Prophesee and Qualcomm partner to bring neuromorphic computer vision to smartphones


WHY THIS MATTERS IN BRIEF

Neuromorphic computers are fundamentally different from traditional computers: they can learn. As a result, they have the potential to transform many industries.

 


If you want sharper photos, this latest innovation is for you: Prophesee, a first-of-its-kind neuromorphic machine vision company, has partnered with Qualcomm to bring a new ultra-fast image sensor to smartphone cameras.

 


 

The “event-based” Metavision technology will enable smartphone cameras to capture fast action that today’s image sensors cannot. The companies made the announcement at the Mobile World Congress event in Barcelona.

 


 

The idea is to bring the speed, efficiency, and quality of neuromorphic-computing-enabled vision to mobile devices. The technical and business collaboration will give mobile device developers a fast, efficient way to leverage the Paris-based company’s sensor and its ability to dramatically improve camera performance, particularly in fast-moving dynamic scenes, such as sports, and in low light, through its breakthrough event-based, continuous, and asynchronous pixel-sensing approach.

In contrast to conventional image sensors, Prophesee’s event-based sensors capture only what changes in a scene. That lets them skip the processing required by other kinds of sensors, which must handle every single pixel in an image; Prophesee captures only the changes, or events, that reflect motion in the scene.
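To make the change-detection idea concrete, here is a minimal sketch in Python. It is illustrative only, not Prophesee’s API: it diffs two frames and emits an event only for pixels whose brightness changed, which is why static regions cost nothing to process. A real event sensor fires per pixel asynchronously rather than from frame differences.

```python
import numpy as np

def frame_events(prev, curr, threshold=0.15):
    """Emit (y, x, polarity) events for pixels whose log intensity
    changed by more than `threshold` between two frames.
    Illustrative only: real event sensors fire per pixel,
    asynchronously, not from discrete frame differences."""
    eps = 1e-6  # avoid log(0)
    diff = np.log(curr + eps) - np.log(prev + eps)
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    return [(y, x, 1 if diff[y, x] > 0 else -1) for y, x in zip(ys, xs)]

# Two tiny 3x3 "frames": only one pixel brightens.
prev = np.full((3, 3), 0.5)
curr = prev.copy()
curr[1, 2] = 0.9
print(frame_events(prev, curr))  # only the changed pixel produces an event
```

Every unchanged pixel is silent, so the downstream pipeline touches one pixel instead of nine.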

 


 

Prophesee and Qualcomm have agreed to a multi-year collaboration to enable native compatibility between Prophesee’s Event-Based Metavision Sensors & Software and premium Snapdragon mobile platforms.

“The world is neither raster-based nor frame-based. Inspired by the human eye, Prophesee Event-Based sensors repair motion blur and other image-quality artifacts caused by conventional sensors, especially in highly dynamic scenes and low-light conditions, bringing photography and video closer to our true experience,” Prophesee said.

“Prophesee is a clear leader in applying neuromorphic techniques to address limitations of traditional cameras and improve the overall user experience. We believe this is game-changing technology for taking mobile photography to the next level and our collaboration on both the technical and business levels will help drive adoption by leading OEMs,” said Judd Heape, vice president of product management at Qualcomm Technologies, in a statement.

“Their pioneering achievements with event cameras’ shutter-free capability offer a significant enhancement to the quality of photography available in the next generation of mobile devices powered by Snapdragon, even in the most demanding environments, unlocking a range of new possibilities for Snapdragon customers.”

 


 

Prophesee’s neuromorphic Event-Based Metavision sensors and software will be available for premium Snapdragon mobile platforms. Development kits are expected to be available from Prophesee this year.

“We are excited to be working with the provider of one of the world’s most popular mobile platforms to incorporate event-based vision into the Snapdragon ecosystem. Through this collaboration, product developers will be able to dramatically enhance the user experience with cameras that deliver image quality and operational excellence not available using just traditional frame-based methods,” said Luca Verre, CEO of Prophesee, in a statement.

Prophesee’s breakthrough sensors add a new sensing dimension to mobile photography. They change the paradigm in traditional image capture by focusing only on changes in a scene, pixel by pixel, continuously, at extreme speeds, the companies said.

 


 

Each pixel in the Metavision sensor embeds a logic core, enabling it to act as a neuron. Each pixel activates itself intelligently and asynchronously depending on the number of photons it senses; a pixel activating itself is called an event. In essence, events are driven by the scene’s dynamics rather than by an arbitrary clock, so the acquisition speed always matches the actual scene dynamics.
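The per-pixel behaviour described above can be sketched as a toy model. The `PixelNeuron` class and its contrast threshold below are hypothetical names for illustration, not part of Prophesee’s Metavision SDK: each pixel tracks the last log intensity that triggered it and fires an event, with its own timestamp, only when the intensity drifts past the threshold.

```python
from dataclasses import dataclass

@dataclass
class Event:
    t_us: int      # microsecond timestamp, set by the scene, not a frame clock
    x: int
    y: int
    polarity: int  # +1 brighter, -1 darker

class PixelNeuron:
    """Toy model of one event-camera pixel: fires whenever the log
    intensity it observes drifts past a contrast threshold.
    All names are illustrative, not Prophesee's API."""
    def __init__(self, x, y, threshold=0.2):
        self.x, self.y = x, y
        self.threshold = threshold
        self.ref = None  # last log intensity that triggered an event

    def observe(self, t_us, log_intensity):
        if self.ref is None:
            self.ref = log_intensity  # first sample just sets the reference
            return None
        delta = log_intensity - self.ref
        if abs(delta) >= self.threshold:
            self.ref = log_intensity
            return Event(t_us, self.x, self.y, 1 if delta > 0 else -1)
        return None  # static scene -> no events, no bandwidth used

# A single pixel watching a brightening, then dimming, scene.
px = PixelNeuron(0, 0)
samples = [(0, 1.0), (100, 1.05), (250, 1.3), (400, 1.3), (900, 0.9)]
events = [e for t, v in samples if (e := px.observe(t, v))]
```

Note that the event timestamps (250 µs and 900 µs here) fall wherever the scene actually changed, not on a fixed frame grid.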

High-performance event-based deblurring is achieved by synchronizing a frame-based sensor with Prophesee’s event-based sensor. The system then fills the gaps between and within the frames with microsecond-resolution events to algorithmically extract pure motion information and repair motion blur.
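Under the same illustrative assumptions, the gap-filling step can be sketched as integrating events to roll a captured frame forward to an arbitrary microsecond timestamp; this is a simplified stand-in for the interpolation the companies describe, not their actual algorithm.

```python
import numpy as np

def intermediate_frame(frame, events, t_target_us, contrast=0.2):
    """Roll `frame` forward to t_target_us by integrating events.
    Each event multiplies its pixel's intensity by exp(+-contrast),
    a simplified model of event-based frame interpolation."""
    out = frame.astype(float).copy()
    for t_us, y, x, polarity in events:
        if t_us <= t_target_us:  # only apply events up to the target time
            out[y, x] *= np.exp(polarity * contrast)
    return out

# One-pixel frame plus two brightening events; reconstruct t = 200 us,
# which lies between them, so only the first event is applied.
frame = np.ones((1, 1))
events = [(100, 0, 0, +1), (300, 0, 0, +1)]  # (t_us, y, x, polarity)
mid = intermediate_frame(frame, events, t_target_us=200)
```

Because events arrive at microsecond resolution, a frame can be reconstructed at any instant between exposures, which is what makes the motion-repair step possible.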
