
WHY THIS MATTERS IN BRIEF

Google’s AI is moving from just being able to identify and categorise images to being able to judge them according to their aesthetics, and that’s big.

 

Today’s slew of photo apps can help you find objects in your pictures but they don’t tell you whether or not those images are actually worth sharing. For now, at least, that’s still up to you.

If Google has its way though, Artificial Intelligence (AI) may soon become an art critic. After more than a year in development, Google recently detailed its work on what it’s calling a Neural Image Assessment (NIMA) system, which uses deep learning to rate photos based on what it believes you, or critics out in the wider world, might like, both technically and aesthetically.

Google trained NIMA on sets of images paired with histograms of human ratings, such as those gathered from photo contests, which gave the system a sense of the overall “quality” or aesthetic of a picture across a range of criteria, not just a mean score or a simple high-low rating. Now that it’s trained, the system can use reference photos to compare one image with another if they’re available, but it can also fall back on its statistical model if it hasn’t seen anything like a given image before.
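Google’s own write-up goes into far more depth, but the core idea, predicting a full histogram of ratings rather than a single number and then collapsing it into a score, can be sketched in a few lines. The 10-point scale, the helper names and the example logits below are my own illustrative assumptions, not Google’s actual code:

```python
import numpy as np

# NIMA-style scoring sketch: rather than outputting one number, the network
# predicts a probability distribution over rating buckets (assumed here to be
# 1-10, as in typical photo-contest datasets). The headline "score" is the
# mean of that distribution, and the spread hints at how divisive a photo is.

RATING_BUCKETS = np.arange(1, 11)  # assumed 10-point rating scale

def softmax(logits):
    """Turn raw network outputs into a probability distribution."""
    e = np.exp(logits - logits.max())
    return e / e.sum()

def score_from_distribution(probs):
    """Collapse a predicted rating histogram into a mean score and spread."""
    mean = float((probs * RATING_BUCKETS).sum())
    std = float(np.sqrt((probs * (RATING_BUCKETS - mean) ** 2).sum()))
    return mean, std

# Hypothetical raw outputs from a trained model for one photo.
logits = np.array([0.1, 0.2, 0.4, 0.9, 1.5, 2.1, 2.4, 1.8, 0.9, 0.3])
mean, std = score_from_distribution(softmax(logits))
print(f"aesthetic score ~{mean:.2f}/10, spread {std:.2f}")
```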

The result is an AI that “closely” replicates the mean scores humans give when judging photos, and that, in turn, has all kinds of implications for photography apps and smartphones. After all, imagine a smartphone with a built-in photo critic that can help you take perfect shots time and time again… sounds fun, doesn’t it!?

To begin with, NIMA could help you quickly find the best photos in your albums while skipping poorly composed or blurry shots, although here Google also has another AI, called RAISR, which I wrote about a while ago, that can help un-blur even the blurriest photos. Google adds that NIMA would be helpful for editing too, because you could use it to fine-tune automatic editing tools. Your favourite editing app, for example, could tweak exposure, brightness and other details based on artistic appeal rather than arbitrary values, as sketched below.
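To make the editing idea concrete, here’s a minimal, hypothetical sketch of how an app might grid-search brightness and contrast settings and keep whichever version an aesthetic model rates highest. The dummy_aesthetic_score heuristic, the auto_tune helper and the factor grid are all stand-ins of my own; a real implementation would call a trained model like NIMA instead:

```python
from itertools import product

import numpy as np
from PIL import Image, ImageEnhance  # pip install Pillow

def dummy_aesthetic_score(image):
    """Stand-in for a trained aesthetic model: rewards mid-range brightness.
    A real app would run the image through a network like NIMA and take the
    mean of its predicted rating distribution, as in the earlier sketch."""
    pixels = np.asarray(image.convert("L"), dtype=np.float32) / 255.0
    return 10.0 - 10.0 * abs(float(pixels.mean()) - 0.5)

def auto_tune(path, factors=(0.8, 0.9, 1.0, 1.1, 1.2)):
    """Grid-search brightness and contrast, keeping the best-scoring version."""
    original = Image.open(path).convert("RGB")
    best_image, best_score = original, dummy_aesthetic_score(original)
    for brightness, contrast in product(factors, repeat=2):
        candidate = ImageEnhance.Brightness(original).enhance(brightness)
        candidate = ImageEnhance.Contrast(candidate).enhance(contrast)
        score = dummy_aesthetic_score(candidate)
        if score > best_score:
            best_image, best_score = candidate, score
    return best_image, best_score

# Example usage (the file name is illustrative):
# tuned, score = auto_tune("holiday_snap.jpg")
# tuned.save("holiday_snap_tuned.jpg")
```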

While there’s a lot of work still to be done, this hints at a day when your phone could have as discerning a taste in photos as you do, so don’t be surprised if in a year or so you see NIMA turn up in an Android update… I hope you enjoy taking awesome photos.
