Matthew Griffin, award-winning Futurist and Founder of the 311 Institute, a global futures think tank working between the dates of 2020 and 2070, is described as "The Adviser behind the Advisers." Regularly featured on AP, CNBC, Discovery and RT, his ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry and society is unparalleled. Recognised for the past five years as one of the world's foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multi-nationals and regulators around the world envision, build and lead an inclusive future. A rare talent, Matthew sits on the Technology and Innovation Committee (TIAC) for Centrica, Europe's largest utility company, and his recent work includes mentoring XPrize teams, building the first generation of biocomputers, re-inventing global education, and helping the world's largest manufacturers envision, design and build the next 20 years of devices, smartphones and intelligent machines. Matthew's clients are the who's who of industry and include Accenture, Bain & Co, BCG, BOA, BlackRock, Bentley, Credit Suisse, Dell EMC, Dentons, Deloitte, DuPont, E&Y, HPE, Huawei, JPMorgan Chase, KPMG, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, UBS, the USAF and many others.
WHY THIS MATTERS IN BRIEF
The next battle front in computing will be dominated by graphics – whether it’s 3D, AR, VR or simply even higher resolution displays, and Apple wants to control its destiny.
For nearly a decade a British company called Imagination Technologies was responsible for designing and supplying Apple with the processors that powered the rich, high definition colours you see on your iPhone and iMac's Retina displays. But that relationship ended abruptly last month when Apple announced it was splitting with its long-term partner to strike out on its own and design its own GPUs, because, as far as Apple, and many others in the technology space from Google to Facebook, are concerned, GPUs are far too important a technology to entrust to a "mere" third party.
However, as many industry observers have commented since, the split seemed inevitable, because graphics are the future – and they're right. Look at the major developments in user experience and they're awash with new graphics-based emerging technologies and innovations that rely on the powerful capabilities of today's and tomorrow's GPUs – from 3D and 8K displays and Augmented Reality (AR) to photorealistic rendering and Virtual Reality (VR) – so the GPU is going to play an increasingly important part in our world.
Add to that the fact that Artificial Intelligence (AI), machine learning and deep learning systems, as well as good old traditional gaming systems, are also fuelled by GPUs and you have a slam dunk. It's a good time to be a GPU manufacturer – just ask Nvidia, the world leader, which is now strutting around Wall Street like the only peacock in town.
The GPU's secret sauce lies in the way it processes calculations. While traditional CPUs process information sequentially, a GPU can crunch a huge number of calculations in parallel – they're the multi-taskers of the processor world, and in today's world, full of calculation-hungry use cases, that's increasingly making them the workhorses of tomorrow.
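The distinction can be sketched in a few lines of Python. This is only an illustrative toy, not how any real GPU driver or Apple framework works: the `kernel` function and the thread pool here merely stand in for a GPU shader and the thousands of hardware threads that would execute it simultaneously.

```python
from concurrent.futures import ThreadPoolExecutor

def kernel(x):
    # The same small calculation applied independently to every element --
    # exactly the shape of work a GPU excels at.
    return x * x + 1

data = list(range(8))

# CPU-style: process the elements one after another, sequentially.
cpu_result = [kernel(x) for x in data]

# GPU-style: map the same kernel over all elements "at once".
# (A thread pool only illustrates the pattern; a real GPU runs
# thousands of lightweight hardware threads in parallel.)
with ThreadPoolExecutor() as pool:
    gpu_style_result = list(pool.map(kernel, data))

# Both approaches compute the same answer; the difference is throughput.
assert cpu_result == gpu_style_result
```

Because each element is processed independently of the others, the work can be split across as many execution units as the hardware offers – which is why "embarrassingly parallel" workloads like graphics and machine learning map so naturally onto GPUs.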
"The GPU is being leaned on more heavily than it ever has before," says Patrick Moorhead, founder of Moor Insights & Strategy. "With the right algorithm, you can get 10 times the performance per watt [a key measure of computational efficiency] with a GPU on machine learning than you can with a CPU."
While it's fair to say that Apple has been a quiet player in AI, it relies on machine learning algorithms to power Siri, and increasingly they're the connective tissue that helps your iPhone anticipate your every need, from the next app you're likely to open through to power management. Then, looking further into the future in the AR and VR space – another area where Apple lags behind its Seattle and Silicon Valley counterparts – it's increasingly clear that Tim Cook sees all of these emerging fields as the next battlefield for customer dollars.
"Augmented reality is a big idea, like the smartphone," Cook told a room of analysts earlier this year, "with potentially iPhone-like impact."
As a result the race for high quality, stutter-free graphics is on. Combine all of this with Apple's aspirations to dominate the living room with Apple TV – yes, something they're still harping on about – and you have the perfect business case for making your own GPUs. Furthermore, experts have argued that, given how far behind Apple is in many of these races, it had no option but to take GPU development in-house – and it's not the first time the company has taken its chip destiny into its own hands.
Apple has, of course, been making its own processors for years. Its Ax series SoC garners the most attention, since it's the brain behind the iPhone, but in recent years the line-up has branched out to include the Sx series that powers the Apple Watch and the W1, which enables the Bluetooth magic that keeps AirPods connected. Then, last year, the T1 popped up in the MacBook Pro, providing an extra security layer for the Touch Bar and its fingerprint-reading Touch ID feature.
“In general, Apple likes to own as much of the underlying technology for its products as possible, and it already has a deep investment in chips,” says Jan Dawson, founder of Jackdaw Research.
If anything Apple's move into graphics is simply the continuation of a trend it helped start. Rather than rely on outside partners for critical components, an increasing number of tech giants – from Google, which is developing its own custom AI chips, to Microsoft, which is toying with its own FPGA chips to turn its Azure platform into the world's largest supercomputer – are designing their own silicon. And as for the advantages? There are plenty: you can design new chips to work specifically with your own devices and software, find new ways to differentiate your company in an increasingly crowded marketplace, and avoid being dragged down if someone else, such as Intel perhaps, skews their product development roadmap in the "wrong" direction.
Apple’s also good at making its own chips – the AX series, for example, improves in performance by 25% each year, a pace that veteran analysts say they’ve “never seen before.” But that’s not to say that Cupertino will automatically replicate its success in the graphics realm.
"I liken GPUs to black magic," says Moorhead. "It's really, really hard to get right. And there are fewer people who know how to do it."
In fact, that confluence of factors – difficulty of execution and scarcity of talent – could be a problem for Apple, and there's a big question about whether or not it can design an effective graphics processor without getting sued – a lot. Something that already seems to be on Imagination Technologies' mind.
"Apple has not presented any evidence to substantiate its assertion that it will no longer require Imagination's technology, without violating Imagination's patents, intellectual property, and confidential information," the company said in a recent statement, and it's not going to be the only rival keeping an eye on what Apple's developing.
"There is a big question about where Apple will get the patent licenses and so on that it needs," says Dawson, but both he and Moorhead also suggest that it's a highly solvable problem. Companies like ARM willingly license out technology, which could provide some legal coverage, or Apple could spend some of its $250 billion cash pile to snap up a patent-loaded chipmaker – ironically, one like Imagination, whose share price crashed after the news. Building up IP quickly takes two things Apple happens to have in spades – ingenuity and money.
Get ready for your eyeballs to be overwhelmed – eye candy is about to go to the next level. Hopefully.