
WHY THIS MATTERS IN BRIEF

The vast majority of video imagery is low quality, which makes identifying criminals and events difficult, if not impossible – but now that is changing.

 

Love the Exponential Future? Join our XPotential Community, future-proof yourself with courses from XPotential University, read about exponential tech and trends, connect, watch a keynote, or browse my blog.

Machine Vision is being used in more and more applications, but in poor conditions – such as low light, high dynamic range (for example, a person silhouetted against a bright background), heavy rain, fog, fast motion, and sudden changes in intensity like flashing headlights – devices that rely on hardware-based image signal processors often produce images that are difficult to watch or analyse.
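
To make that failure mode concrete, the sketch below shows, in Python, the kind of fixed, one-size-fits-all processing a hardware pipeline might apply: a single global gain followed by a clip to the sensor's output range. The function, scene values, and gain are illustrative assumptions, not a description of any real ISP – they simply show how one setting that brightens a dark foreground also blows out a bright background and amplifies noise.

```python
import numpy as np

def fixed_isp_exposure(raw_frame: np.ndarray, gain: float = 8.0) -> np.ndarray:
    """Toy stand-in for a fixed, hardware-style exposure/gain stage.

    Applies one global gain to the whole frame and clips to an 8-bit range.
    In a low-light scene this amplifies sensor noise along with the signal;
    with a bright background it clips highlights - exactly the kind of
    footage that is hard to watch or analyse.
    """
    boosted = raw_frame.astype(np.float32) * gain
    return np.clip(boosted, 0, 255).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated dark scene with additive sensor noise and a bright background.
    scene = np.full((480, 640), 12.0)            # dim foreground
    scene[:, 400:] = 240.0                        # bright background (e.g. a window)
    noisy_raw = scene + rng.normal(0, 4.0, scene.shape)

    out = fixed_isp_exposure(noisy_raw, gain=8.0)
    print("fraction of clipped pixels:", np.mean(out == 255))  # a large part of the frame is blown out
```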

 

Now though, Visionary.ai, a Jerusalem, Israel-based startup, has come out of stealth to show off Artificial Intelligence (AI) enhanced cameras that produce superb image quality whatever the conditions – as you can see from the videos below.

 

 

The business also claims to have created the first-ever software-based Image Signal Processor (ISP), which uses AI at the edge of the network to enhance images in real time in challenging lighting, shadow, glare, and reflection conditions. The company adds that wherever there is a camera, Visionary.ai is “attaining market-leading image performance, giving consumer electronics, drones, robots, mobility, medical imaging, and other industries a competitive edge.”
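
Visionary.ai has not published the internals of its ISP, but to illustrate the general idea of an image pipeline implemented in software with a learned stage running at the edge, here is a minimal, hypothetical Python/PyTorch sketch. The pipeline, the tiny network, and every parameter are assumptions for illustration; a production edge model would be trained on paired noisy/clean frames and heavily optimised for latency.

```python
import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    """Placeholder learned denoiser (untrained) - not Visionary.ai's model."""
    def __init__(self, channels: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 3, 3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Predict the noise residual and subtract it from the input frame.
        return x - self.net(x)

def software_isp(frame: torch.Tensor, denoiser: nn.Module) -> torch.Tensor:
    """Toy software ISP: white balance -> learned denoise -> gamma tone map.
    Each stage is ordinary code, so it can be updated or swapped without
    touching the camera hardware."""
    frame = frame * torch.tensor([1.1, 1.0, 1.2]).view(1, 3, 1, 1)  # crude white balance
    with torch.no_grad():
        frame = denoiser(frame)                                     # AI stage running at the edge
    return frame.clamp(0, 1) ** (1 / 2.2)                           # simple gamma tone map

if __name__ == "__main__":
    noisy = torch.rand(1, 3, 240, 320)          # stand-in for a noisy low-light frame
    out = software_isp(noisy, TinyDenoiser())
    print(out.shape)                            # torch.Size([1, 3, 240, 320])
```

The design point the sketch is meant to capture is simply that a software ISP can be retrained and re-deployed as algorithms improve, whereas a fixed hardware ISP cannot.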

 

 

The company also claims it can reduce image noise and improve real-time performance for a range of industry-standard image sensors, extending what cameras and machine vision applications can do.
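
The company has not said how its noise reduction works, but for context, the classical real-time baseline it would be measured against is temporal noise reduction: recursively averaging successive frames so random sensor noise cancels while the scene remains. The sketch below, with illustrative parameter values, shows that baseline, not Visionary.ai's method.

```python
import numpy as np

def temporal_denoise(frames, alpha: float = 0.2):
    """Classical recursive temporal noise reduction (a baseline only):
    blend each new frame into a running average. Lower alpha means
    stronger denoising but more ghosting on moving objects."""
    accumulator = None
    for frame in frames:
        frame = frame.astype(np.float32)
        if accumulator is None:
            accumulator = frame
        else:
            accumulator = alpha * frame + (1 - alpha) * accumulator
        yield accumulator.clip(0, 255).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    clean = np.full((480, 640), 60.0)                       # static dim scene
    noisy_stream = (clean + rng.normal(0, 10, clean.shape) for _ in range(30))
    last = list(temporal_denoise(noisy_stream))[-1]
    print("noise std after filtering:", last.std())         # well below the input's ~10
```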

 

The company’s CEO and co-founder, Oren Debbi, said, “The market is experiencing dynamic growth for image sensors used for laptop cameras, medical endoscopy, safety reversing cameras, electronic doorbells, quality inspection on the assembly line, and hundreds of other applications. Our software-based ISP is continuously being updated with optimized algorithms to enable these sensors to create the highest possible quality images so people can stay connected, healthy, and safe.”

The company has raised $7 million to date, and the capital is being used for R&D and business development to further its aim of making cutting-edge technology available to all cameras.

About author

Matthew Griffin

Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between 2020 and 2070, and is an award-winning futurist and author of the “Codex of the Future” series. Regularly featured in the global media, including AP, BBC, Bloomberg, CNBC, Discovery, RT, Viacom, and WIRED, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry, and society is unparalleled. Recognised for the past six years as one of the world’s foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multi-nationals and regulators around the world envision, build and lead an inclusive, sustainable future. A rare talent, Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew’s clients include three Prime Ministers and several governments, including the G7, Accenture, Aon, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
