Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between 2020 and 2070, and is an award-winning futurist and author of the “Codex of the Future” series. Regularly featured in the global media, including AP, BBC, Bloomberg, CNBC, Discovery, RT, Viacom, and WIRED, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry, and society is unparalleled. Recognised for the past six years as one of the world’s foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multi-nationals, and regulators around the world envision, build, and lead an inclusive, sustainable future. A rare talent, Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew’s clients include three Prime Ministers and several governments, as well as the G7, Accenture, Aon, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
WHY THIS MATTERS IN BRIEF
We think of computing as being “silicon-based” and “solid,” but the future of computing breaks all these rules and more.
Love the Exponential Future? Join our XPotential Community, future proof yourself with courses from XPotential University, read about exponential tech and trends, connect, watch a keynote, or browse my blog.
Traditional microprocessors in smartphones, computers, and data centers process information by manipulating electrons through solid semiconductors, but our brains don’t – they have an entirely different “computing” method. Instead, they rely on the manipulation of ions in liquid to process information and, inspired by the human brain, researchers have long been seeking to develop aqueous Ionic Computers – a new class of liquid computers. And over the past few years I’ve seen the emergence of everything from liquid computer storage technologies through to liquid CPUs, so the revolution is starting. And that’s before I talk about the emergence of chemical computers, DNA computers, and a whole bunch of other stuff.
While ions in water move more slowly than electrons in semiconductors, scientists think the diversity of ionic species with different physical and chemical properties could be harnessed for richer and more diverse information processing.
Ionic computing, however, is still in its early days. To date, labs have only developed individual ionic devices such as ionic diodes and transistors, but no one has put many such devices together into a more complex circuit for computing – until now.
A team of researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), in collaboration with DNA Script, a biotech startup, have developed an ionic circuit comprising hundreds of ionic transistors and performed a core process of neural net computing.
The research is published in Advanced Materials.
The researchers began by building a new type of ionic transistor using a technique they recently pioneered. The transistor consists of an aqueous solution of quinone molecules, interfaced with two concentric ring electrodes and a center disk electrode, like a bullseye. The two ring electrodes electrochemically lower and tune the local pH around the center disk by producing and trapping hydrogen ions. A voltage applied to the center disk causes an electrochemical reaction that generates an ionic current from the disk into the water. The reaction rate can be sped up or slowed down, increasing or decreasing the ionic current, by tuning the local pH. In other words, the pH controls, or gates, the disk’s ionic current in the aqueous solution, creating an ionic counterpart of the electronic transistor.
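To make the gating idea concrete, here is a minimal sketch of the behaviour described above, assuming the disk’s ionic current scales roughly with the applied voltage and a pH-dependent gating factor. The function names, reference pH, and sensitivity constant are all illustrative assumptions, not values from the paper.

```python
def gating_factor(local_ph, ph_ref=7.0, sensitivity=0.5):
    """Illustrative gating term: a lower local pH (more hydrogen ions)
    speeds the electrochemical reaction, increasing the ionic current.
    ph_ref and sensitivity are made-up constants for the sketch."""
    return max(0.0, 1.0 + sensitivity * (ph_ref - local_ph))

def disk_current(disk_voltage, local_ph):
    """Ionic current from the disk electrode into the water: roughly
    proportional to the disk voltage, gated by the local pH."""
    return disk_voltage * gating_factor(local_ph)
```

In this toy model, dropping the local pH from 7.0 to 6.0 boosts the current for the same applied voltage, which is the sense in which pH "gates" the device.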
They then engineered the pH-gated ionic transistor in such a way that the disk current is an arithmetic multiplication of the disk voltage and a “weight” parameter representing the local pH gating the transistor. They organized these transistors into a 16 × 16 array, expanding the analog arithmetic multiplication of individual transistors into an analog matrix multiplication, with the array of local pH values serving as the kind of weight matrix encountered in neural networks.
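The array’s operation can be sketched numerically: each transistor multiplies its input voltage by a local weight (set via pH), and summing the per-transistor currents along each row of a 16 × 16 array yields an analog matrix-vector product. The sizes, random values, and variable names below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.uniform(0.0, 1.0, size=(16, 16))   # local pH-set "weights"
voltages = rng.uniform(0.0, 1.0, size=16)        # input disk voltages

# Each transistor contributes current = weight * voltage;
# summing a row's currents gives one entry of the matrix-vector product.
currents = weights * voltages        # per-transistor analog products
output = currents.sum(axis=1)        # summed row currents

# The summed currents equal the conventional digital result W @ v.
assert np.allclose(output, weights @ voltages)
```

This is exactly why the summed-current trick matters: matrix-vector products like this dominate neural-network inference, so performing them in the analog, electrochemical domain sidesteps the digital multiply-accumulate loop entirely.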
“Matrix multiplication is the most prevalent calculation in neural networks for artificial intelligence,” said Woo-Bin Jung, a postdoctoral fellow at SEAS and the first author of the paper. “Our ionic circuit performs the matrix multiplication in water in an analog manner that is based fully on electrochemical machinery.”
“Microprocessors manipulate electrons in a digital fashion to perform matrix multiplication,” said Donhee Ham, the Gordon McKay Professor of Electrical Engineering and Applied Physics at SEAS and the senior author of the paper. “While our ionic circuit cannot be as fast or accurate as the digital microprocessors, the electrochemical matrix multiplication in water is charming in its own right, and has a potential to be energy efficient.”
Now, the team looks to enrich the chemical complexity of the system.
“So far, we have used only 3 to 4 ionic species, such as hydrogen and quinone ions, to enable the gating and ionic transport in the aqueous ionic transistor,” said Jung. “It will be very interesting to employ more diverse ionic species and to see how we can exploit them to make rich the contents of information to be processed.”
The research was supported in part by the Office of the Director of National Intelligence (ODNI) and the Intelligence Advanced Research Projects Activity (IARPA) under grant 2019-19081900002. IARPA, among other projects, has been trying to cram all the power of a giant hyperscale datacenter into a package the size of an office desk by developing its own molecular computing platforms.