Matthew Griffin, award-winning futurist and founder of the 311 Institute, is described as "The Adviser behind the Advisers." Recognised for the past five years as one of the world's foremost futurism, innovation and strategy experts, Matthew is an author, entrepreneur and international speaker who helps investors, multi-nationals, regulators and sovereign governments around the world envision, build and lead the future. Today, aside from being a member of Centrica's prestigious Technology and Innovation Committee and mentoring XPrize teams, Matthew's accomplishments include playing the lead role in helping the world's largest smartphone manufacturers ideate the next five generations of mobile devices, and what comes beyond, and helping the world's largest high tech semiconductor manufacturers envision the next twenty years of intelligent machines. Matthew's clients include Accenture, Bain & Co, Bank of America, Blackrock, Bloomberg, Booz Allen Hamilton, Boston Consulting Group, Dell EMC, Dentons, Deloitte, Deutsche Bank, Du Pont, E&Y, Fidelity, Goldman Sachs, HPE, Huawei, JP Morgan Chase, KPMG, Lloyds Banking Group, McKinsey & Co, Monsanto, PWC, Qualcomm, Rolls Royce, SAP, Samsung, Schroeder's, Sequoia Capital, Sopra Steria, UBS, the UK's HM Treasury, the USAF and many others.
WHY THIS MATTERS IN BRIEF
- Quantum computers look set to become the dominant computing platform of the future, but today’s storage technology will need a major update to keep up
We’ve all heard the hype around quantum computing, and we can all feel the revolution in the wind, with the first universal commercial systems expected to be available as early as 2020. But for all their promise – whether it’s their presumed ability to help us create new forms of energy, or even cheat death itself – there’s a huge, gaping hole in our plans to harness them to their full potential. The very science behind these new ultra-fast systems means you can’t duplicate or save information on a quantum computer, and all that massive computing power is useless if you can’t store its results or back them up.
While you can convert quantum data and put it onto a traditional storage device, like a solid state drive, the speed at which quantum computers can process an almost infinite volume of data means that, on the one hand, trying to store all that information at a reasonable speed is nigh on impossible, and on the other, more crucially, all the data they generate would take up an unfathomable amount of space. As a result, even with the most advanced filtering and storage algorithms known to man and woman, the hyperscale storage systems needed to store it all would quickly become incalculably huge – and that’s before we’ve even started talking about the amount of energy they’d use, and so on and so on…
One solution to the problem though could come from a very old friend, a four billion year old friend to be precise – DNA. Scientists and researchers have been playing around with DNA, as both a computing and a storage device, for decades now. In September last year, for example, Microsoft spent millions of dollars buying custom fabricated DNA to tinker around with, and recently the University of Manchester pushed out the first feasible design for a self-replicating DNA supercomputer that could make quantum computers look as advanced as a rock.
The secret behind quantum computers’ raw, awesome power is how they process data. A classical computer reads, stores and manipulates bits in the form of 1s and 0s. A quantum computer, on the other hand, uses “qubits,” tiny quantum objects that can exist in two states – 1 and 0 – at the same time, something called superposition. And it’s this phenomenon that gives them their phenomenal computational power and lets them perform vast numbers of tasks in parallel, something that last year saw Google’s D-Wave “so called” quantum computer outperform its older brethren by a factor of one hundred million.
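To make the bit-versus-qubit difference concrete, here’s a minimal sketch (my own illustration, using NumPy – not from the article) of a single qubit: its state is a vector of two complex amplitudes, and a standard Hadamard gate mixes |0⟩ and |1⟩ into an equal superposition.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # the |0> state, like a classical 0
ket1 = np.array([0, 1], dtype=complex)   # the |1> state, like a classical 1

# The Hadamard gate puts a qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
plus = H @ ket0

# Measurement probabilities are the squared amplitudes: a 50/50 split here,
# i.e. the qubit genuinely carries both values until it is measured.
probs = np.abs(plus) ** 2
print(probs)  # [0.5 0.5]
```

With n such qubits the state vector has 2ⁿ amplitudes, which is where the “tasks in parallel” intuition – and, as we’ll see, the storage headache – comes from.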
This quantum weirdness has its drawbacks though. On the one hand it enables superposition – the very thing that scientists and researchers crave to create insanely powerful computing systems – but on the other it’s also this weirdness that prevents the cloning of quantum particles.
“It’s called the ‘no-cloning theorem,’” says physicist Stephanie Simmons of Simon Fraser University in Canada. “Say that a quantum computer programs an atom to be in a specific quantum state that represents a set of numbers. It is physically impossible for the computer to program another atom to be in the exact same quantum state.”
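You can see the no-cloning theorem in action with a short NumPy sketch (my own illustration, not Simmons’ example): a CNOT gate happily “photocopies” the plain 0 and 1 states into a blank qubit, but feed it a superposition and you get an entangled pair rather than two independent copies.

```python
import numpy as np

# CNOT gate on two qubits: flips the second qubit iff the first is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Copying basis states works fine: |0>|0> -> |0>|0>, |1>|0> -> |1>|1>.
assert np.allclose(CNOT @ np.kron(ket0, ket0), np.kron(ket0, ket0))
assert np.allclose(CNOT @ np.kron(ket1, ket0), np.kron(ket1, ket1))

# But "copy" a superposition |+> = (|0> + |1>)/sqrt(2) and the output is
# an entangled Bell state, NOT two independent copies |+>|+>.
plus = (ket0 + ket1) / np.sqrt(2)
cloned_attempt = CNOT @ np.kron(plus, ket0)
true_copy = np.kron(plus, plus)
print(np.allclose(cloned_attempt, true_copy))  # False
```

That failure isn’t a flaw of the CNOT gate in particular – the theorem says no quantum operation at all can duplicate an arbitrary unknown state, which is exactly why quantum data can’t simply be backed up in place.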
Consequently, Simmons has now proposed a new, roundabout way of storing quantum data.
“First, you’ll need to convert it into binary data – translating the numbers that describe quantum superposition into simple 1s and 0s. Then, you’ll need to store that converted data in a classical storage format. In other words – hard drives – super compact ones, because the size of each quantum data file from a 49-qubit computer will be on the scale of 40,000 videos,” she says, “and realistically, in order to store that volume of information we’re going to need new storage technologies.”
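A quick back-of-envelope estimate (my own, not Simmons’ figure) shows why a 49-qubit snapshot is so heavy: describing the state classically takes one complex amplitude per basis state, and there are 2⁴⁹ of them.

```python
# Rough size of one classical snapshot of a 49-qubit quantum state,
# assuming one double-precision complex amplitude per basis state.
n_qubits = 49
amplitudes = 2 ** n_qubits        # 562,949,953,421,312 basis states
bytes_per_amplitude = 16          # complex128: two 8-byte floats
total_bytes = amplitudes * bytes_per_amplitude
print(total_bytes / 1e15)         # ~9.0, i.e. roughly 9 petabytes per snapshot
```

Roughly nine petabytes for a single saved state, before any compression – and every added qubit doubles it, which is why Simmons says conventional drives won’t cut it.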
Today, a single quantum file would occupy a stamp sized area on a solid state hard drive, and this is where DNA comes in. Late last week scientists managed to demonstrate a new DNA storage method that could store 215 petabytes, or 215 million gigabytes, on a single gram of DNA, and at that density, you could fit all the information on the internet onto just one gram. Furthermore, as the University of Manchester scientists posited – albeit with a DNA computer – a DNA storage system could self-replicate, scaling almost instantly to meet whatever demands a quantum computer can throw at it. Plus, it lasts a long time, a very, very long time without degrading.
“Think about your CDs from the 1990s,” says computer scientist Yaniv Erlich of Columbia University. “They’re probably a bit scratched, and you can’t read the data accurately. But DNA can store information for a very long time. We can read DNA from skeletons that are hundreds of thousands of years old to very high accuracy.”
Another super-compact technology that recently made an appearance, courtesy of IBM, is atom scale storage. Again, last week – it must have been a week for storage announcements – researchers from Big Blue demonstrated how they’d managed to store a single bit of data on a single atom and successfully read it back.
To do this, they embedded holmium atoms on a chip and used electronics to control the direction of the inherent magnetic field produced by each atom. They found that they could control the atoms independently when they were spaced just a nanometer apart. So basically, it’s possible to encode one bit per atom – and you can’t get more dense than that, says physicist Chris Lutz of IBM. To put that in context, the best commercial hard drives on the market today need at least 100,000 atoms to store a single bit of information, and even a DNA base pair is made of around thirty atoms.
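Lining the figures above up as “atoms needed per bit” makes the gap vivid. One assumption is mine: DNA’s four-letter alphabet stores roughly two bits per base pair, so a ~30-atom base pair works out at about 15 atoms per bit; the other two numbers come straight from the article.

```python
# Approximate atoms required to store one bit, per technology.
hard_drive = 100_000    # best commercial drives: >= 100,000 atoms per bit
dna = 30 / 2            # ~30 atoms per base pair, ~2 bits per base pair (assumed)
ibm_atom = 1            # IBM's holmium demonstration: one atom per bit

for name, atoms in [("hard drive", hard_drive),
                    ("DNA", dna),
                    ("single atom", ibm_atom)]:
    print(f"{name}: ~{atoms:g} atoms per bit")
```

On this rough accounting DNA already beats a hard drive by several thousandfold, and IBM’s single-atom scheme is the hard physical floor.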
Obviously both of these methods, like quantum computers themselves, are years from being commercialised, and DNA, for now at least, is expensive to synthesise and slow to read back. And on the atomic side of the fence, in order to store data on single atoms you have to keep them extremely cold, because otherwise they’ll interfere with each other and overwrite each other’s data.
Add that to the complexity of creating quantum algorithms that can efficiently convert and compress quantum data into binary, and the architecture needed to support and run it all, and you have a challenge that will no doubt keep researchers and their lab AIs busy for years, probably decades.
“I see huge challenges coming our way,” says Simmons, “because if quantum computers don’t back up their data, auto save won’t be coming to the rescue.”
However, all that said, one day we could see the data generated by our quantum computers backed up onto atoms and DNA – and that alone should make your head spin.
Have a great day and enjoy using your stone age computing device – as for me, well I’m still stuck using my photonic supercomputer. Lousy slow computer.