Matthew Griffin, described as "The Adviser behind the Advisers" and a "Young Kurzweil," is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between 2020 and 2070, and is an award-winning futurist and author of the "Codex of the Future" series. Regularly featured in the global media, including AP, BBC, Bloomberg, CNBC, Discovery, RT, Viacom, and WIRED, Matthew's ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry and society is unparalleled. Recognised for the past six years as one of the world's foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multi-nationals and regulators around the world envision, build and lead an inclusive, sustainable future. A rare talent, Matthew's recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world's largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew's clients include three Prime Ministers and several governments, including the G7, Accenture, Aon, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
WHY THIS MATTERS IN BRIEF
Once you digitise a physical object you can analyse it, model it, and simulate scenarios at exponential speeds to discover more and design better solutions.
Celebrities have them, wind turbines have them, even cancerous tumours and the Covid-19 virus have them. I am, of course, talking about Digital Twins, which let people represent physical things in digital form, let engineers talk to wind turbines, let doctors fly through tumours and examine viruses in new ways, and, increasingly, let celebrities clone themselves so they can be anywhere and everywhere. Now, though, not to be left out, another entity is about to get its own digital twin after scientists in Europe announced they're building a "highly accurate digital twin" of Earth.
“The new Earth system model will represent virtually all processes on the Earth’s surface as realistically as possible, including the influence of humans on water, food and energy management, and the processes in the physical Earth system,” ETH Zurich says in a statement. This is in addition to extensive climate data, creating one unified model that also brings together computer science and climate studies.
Like the Pentagon's growing use of digital engineering in military aircraft, the "Earth twin" aims to save money by catching costly design errors, and projects unlikely to succeed, before anything is built. And with climate change there's a second reason to use a digital twin: we're running out of time. A process or climate mitigation strategy that can be tested and tuned on the digital Earth twin can save crucial time and energy in the fight against worsening climate events.
Over the next 10 years, programmers and climate scientists will work together on the wildly detailed Earth twin. The scientists hope the resulting model will help everyone run simulations to make better, more reasoned plans for approaching extreme climate events and new challenges.
This project is a collaboration between the European Centre for Medium-Range Weather Forecasts (ECMWF), the European Space Agency (ESA), and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). The actual programming and computing is happening at ETH Zurich and the Swiss National Supercomputing Centre (CSCS). The work follows Europe’s commitment to carbon neutrality by 2050.
Using data about the climate as well as human activities, this souped-up, super-powered version of something like Google Earth will help experts trace the consequences of both weather events and human interventions: whether a programme to buoy sinking parts of Venice will withstand more rapidly rising waters, for example, or whether a levee will hold during a severe storm.
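To make the idea of "testing a levee on a digital twin" concrete, here is a purely illustrative Monte Carlo sketch. Nothing in it comes from the real Destination Earth model: the surge distribution, the base-tide figure, and the function and parameter names are all assumptions invented for this toy example, which only shows the general pattern of running many simulated scenarios against a digital stand-in for a physical structure.

```python
import random

def simulate_storm_surge(levee_height_m, n_trials=10_000, seed=42):
    """Toy sketch: estimate how often a levee of a given height is
    overtopped by randomly sampled storm surges.

    The surge model below (a 2 m base tide plus an exponentially
    distributed storm component with a 0.8 m mean) is an illustrative
    assumption, not real hydrology or any part of the Earth twin."""
    rng = random.Random(seed)
    overtopped = 0
    for _ in range(n_trials):
        surge_m = 2.0 + rng.expovariate(1 / 0.8)  # assumed toy distribution
        if surge_m > levee_height_m:
            overtopped += 1
    return overtopped / n_trials

# In the toy model, a higher levee should be overtopped less often.
low_levee = simulate_storm_surge(levee_height_m=3.0)
high_levee = simulate_storm_surge(levee_height_m=5.0)
```

The real system would replace the one-line surge model with physics-based simulation driven by satellite and sensor data, but the decision-making pattern is the same: run thousands of scenarios cheaply in software before committing money and time to the physical world.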
Climate computer models have stagnated because of the way they were developed and maintained, ETH Zurich explains. For years, improving these models was simply a matter of adding more powerful processors: more detailed models could crunch more data at higher speeds, and for a long time this seemed like a never-ending avenue for improvement.
But now, much more sophisticated models involving complex algorithms can be leveraged with the massive amount of data that computers today can crunch. This is why the Earth twin will take a full decade to code and put into action. The coders and designers will be making hardware changes while building out the algorithms they need, with a goal to use both sides to their best effect.
There's one last snag in the process though: just where do you put a computer system that will require an estimated 20,000 CPUs? And where are you supposed to put that system if it's to be carbon-neutral itself? The answer could be a remote location in a colder region, where natural cooling and renewable energy would both be major benefits. We'll find out soon.