Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the 311 Institute, a global futures and deep futures consultancy whose work spans 2020 to 2070, and is an award-winning futurist and author of “Codex of the Future.” Regularly featured in the global media, including AP, BBC, CNBC, Discovery, RT, and Viacom, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry and society is unparalleled. Recognised for the past six years as one of the world’s foremost futurists, innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multi-nationals and regulators around the world envision, build and lead an inclusive, sustainable future. A rare talent, Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew's clients include three Prime Ministers and several governments, including the G7, as well as Accenture, Bain & Co, BCG, BOA, Blackrock, Bentley, Credit Suisse, Dell EMC, Dentons, Deloitte, Du Pont, E&Y, GEMS, HPE, Huawei, JPMorgan Chase, KPMG, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, UBS, and many more.
WHY THIS MATTERS IN BRIEF
Google’s AI is moving from just being able to identify and categorise images to being able to identify them according to their aesthetics, and that’s big.
Today’s slew of photo apps can help you find objects in your pictures but they don’t tell you whether or not those images are actually worth sharing. For now, at least, that’s still up to you.
If Google has its way though, Artificial Intelligence (AI) may soon become an art critic. After more than a year in development, Google recently detailed its work on what it calls a Neural Image Assessment (NIMA) system, which uses deep learning to rate photos based on what it believes you, or the critics out in the wider world, might like, both technically and aesthetically.
Google trained NIMA on sets of images paired with histograms of ratings, such as those from photo contests, which gave the platform a sense of a picture’s overall “quality” or aesthetic across a range of different areas, not just a mean score or a simple high-low rating. Now that it has been trained, the system can use reference photos to compare one image with another, if they’re available, but it can also fall back on its statistical model if it hasn’t seen anything like the image before.
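To make the histogram idea concrete, here is a minimal sketch of what predicting a distribution of ratings buys you over a single number. It assumes, as Google describes, that the model outputs a probability distribution over discrete quality scores from 1 to 10; the two distributions below are made-up examples, not real model outputs.

```python
# NIMA-style scoring sketch: the model is assumed to output a probability
# distribution over quality scores 1..10 (one probability per bucket).

def mean_score(dist):
    """Mean of a predicted score distribution over buckets 1..10."""
    return sum(p * s for s, p in enumerate(dist, start=1))

def std_score(dist):
    """Standard deviation of the distribution -- a rough proxy for how
    'divisive' a photo is, which a single mean score would hide."""
    mu = mean_score(dist)
    return sum(p * (s - mu) ** 2 for s, p in enumerate(dist, start=1)) ** 0.5

# Hypothetical outputs for two photos: one broadly liked, one divisive.
well_liked = [0.0, 0.0, 0.0, 0.05, 0.1, 0.15, 0.3, 0.25, 0.1, 0.05]
divisive   = [0.2, 0.1, 0.05, 0.05, 0.05, 0.05, 0.05, 0.05, 0.1, 0.3]

print(round(mean_score(well_liked), 2))  # 7.1
print(round(mean_score(divisive), 2))    # 5.95
print(std_score(divisive) > std_score(well_liked))  # True: opinions split
```

The point of keeping the whole distribution is visible in the last line: the divisive photo has a respectable mean but a much wider spread, something a plain 1-to-10 rating would never reveal.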
The result is an AI that “closely” replicates the mean scores of humans when judging photos, and that, in turn, has all kinds of implications for photography apps and smartphones. After all, imagine a smartphone with a built-in photo critic that could help you take perfect shots time and time again… sounds fun, doesn’t it!?
To begin with, NIMA could help you quickly find the best photos in your albums while avoiding poorly composed or blurry shots, although here Google has another AI, called RAISR, which I wrote about a while ago, that can help un-blur even the blurriest photos. Google adds that NIMA would be helpful for editing too, because it could be used to fine-tune automatic editing tools. For example, your favourite editing app could tweak exposure, brightness and other details based on artistic appeal rather than arbitrary values.
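The auto-editing idea above can be sketched in a few lines: nudge an editing parameter for as long as the aesthetic score keeps improving. Everything here is a toy assumption, not Google’s implementation: `aesthetic_score` is a hypothetical stand-in for a NIMA-like model, reduced to a function of a single brightness value that happens to peak in the mid-range.

```python
# Toy stand-in for a NIMA-like scorer: a made-up function of one editing
# parameter (brightness in 0..1) that peaks at 0.55. A real app would run
# the edited image through the model instead.
def aesthetic_score(brightness):
    return 10 - 40 * (brightness - 0.55) ** 2

def tune_brightness(start=0.2, step=0.01):
    """Greedy hill-climb: keep nudging brightness while the predicted
    aesthetic score improves, then stop."""
    b = start
    while aesthetic_score(b + step) > aesthetic_score(b):
        b += step
    return round(b, 2)

print(tune_brightness())  # settles near the scorer's peak, ~0.55
```

A real editor would search several parameters at once (exposure, contrast, saturation) rather than one, but the loop is the same: let artistic appeal, not a fixed target value, decide when to stop adjusting.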
While there’s a lot of work still to be done, this hints at a day when your phone could have as discerning a taste in photos as you do, so don’t be surprised if in a year or so you see NIMA turn up in an Android update… I hope you enjoy taking awesome photos.