Matthew Griffin, described as "The Adviser behind the Advisers" and a "Young Kurzweil," is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between the dates of 2020 and 2070, and is an award-winning futurist and author of the "Codex of the Future" series. Regularly featured in the global media, including AP, BBC, CNBC, Discovery, RT, and Viacom, Matthew's ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry and society is unparalleled. Recognised for the past six years as one of the world's foremost futurists, innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multi-nationals and regulators around the world envision, build and lead an inclusive, sustainable future. A rare talent, Matthew's recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world's largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew's clients include three Prime Ministers and several governments, including the G7, as well as Accenture, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
WHY THIS MATTERS IN BRIEF
Teams of humans, AIs and drones take center stage in the future of warfare – until, that is, AIs replace humans.
In the wake of a recent announcement that the US will stand up its first fully autonomous drone squadron later this year, US Air Force Chief Scientist Gregory Zacharias announced earlier this week that several US fighter jets will soon combine human and artificial intelligence (AI) traits to control swarms of nearby drones capable of carrying weapons, including laser systems, testing enemy air defenses, or performing intelligence, surveillance and reconnaissance (ISR) missions in high-risk areas.
Zacharias said that much higher degrees of autonomy and Manned-Unmanned teaming are expected to emerge in the near future from work at the Air Force Research Lab.
"This involves an attempt to have another platform fly alongside a human, perhaps serving as a weapons truck," said Zacharias.
He added that F-35 pilots will be able to control swarms of nearby drones from the cockpit while in the air, including drones released in mid-flight, and that the existing F-35 computer system already uses early applications of AI to help computers make assessments and some decisions by themselves – without human intervention.
“We’re working on making platforms more autonomous with multi-infusion systems and data from across different intel streams,” explained Zacharias.
While computers can in many cases complete checklists and various procedures faster than humans, he maintains that humans are better at perceiving how information, or a battlespace, might be changing.
“A computer might have to go through a big long checklist, whereas a pilot might immediately know that the engines are out without going through a checklist. He is able to make a quicker decision about where to land,” he said.
The F-35's AI-based "sensor fusion" capability – which aggregates and analyses all of the information coming from clusters of sensors – already uses computer algorithms to acquire, distill, organize and present otherwise disparate pieces of intelligence as a single picture for the pilot.
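The core idea of sensor fusion – merging overlapping reports from different sensors into a single track picture – can be illustrated with a toy sketch. Everything below is illustrative: the sensor names, the `Detection` record, and the greedy gating logic are assumptions for demonstration, not the F-35's actual algorithms.

```python
# Toy sensor-fusion sketch: merge detections from several hypothetical
# onboard sensors into one track list by clustering reports that fall
# within a gating distance of an existing track's centroid.
from dataclasses import dataclass
from math import hypot

@dataclass
class Detection:
    sensor: str   # which (hypothetical) sensor produced the report
    x: float      # position east of ownship, km
    y: float      # position north of ownship, km

def fuse(detections, gate_km=1.0):
    """Greedy clustering: a detection joins the first track whose
    centroid is within gate_km, otherwise it starts a new track."""
    tracks = []  # each track is a list of detections
    for det in detections:
        for track in tracks:
            cx = sum(d.x for d in track) / len(track)
            cy = sum(d.y for d in track) / len(track)
            if hypot(det.x - cx, det.y - cy) <= gate_km:
                track.append(det)
                break
        else:
            tracks.append([det])
    # One fused position per track -- the "single picture" for the pilot
    return [(sum(d.x for d in t) / len(t), sum(d.y for d in t) / len(t))
            for t in tracks]

reports = [
    Detection("radar", 10.2, 5.1),
    Detection("eo_ir", 10.4, 5.0),   # same target, slightly offset
    Detection("radar", -3.0, 8.7),   # a second, separate target
]
fused = fuse(reports)
print(fused)  # two fused tracks, not three raw detections
```

Real fusion systems use statistical filters and track management rather than a fixed gate, but the principle – many disparate reports in, one coherent picture out – is the same.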
“Right now we are using lots of bandwidth to send our real-time video, for example. One of the things that we have now is a smarter on board processor. These systems can learn over time and be a force multiplier. There’s plenty of opportunity to go beyond the code base of an original designer and work on a greater ability to sense your environment or sense what your teammate might be telling you as a human,” he said, “for example, with advances in computer technology, autonomy and AI, drones will be able to stay above a certain area and identify relevant objects or targets at certain times, without needing a human operator.”
This is particularly relevant because the large amount of ISR video demands organizing algorithms and technology to help process and sift through the vast volumes of gathered footage – in order to pinpoint and communicate what is tactically relevant. “With image processing and pattern recognition, you could just send a signal instead of using up all this bandwidth,” he explained. This development could greatly enhance mission scope, flexibility and effectiveness by enabling a fighter jet to conduct a mission with more weapons, sensors, targeting technology and cargo.
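Zacharias's bandwidth point can be made concrete with a small sketch: instead of streaming raw frames to the ground, an onboard processor sends a short message only when a detector flags something of interest. The frame format, detector, and message layout here are all hypothetical stand-ins.

```python
# Illustrative sketch: onboard pattern recognition turns a heavy raw
# video downlink into a trickle of compact detection messages.
import json

FRAME_BYTES = 640 * 480 * 3  # one uncompressed VGA frame, ~0.9 MB

def detect_objects(frame):
    """Stand-in for a real pattern-recognition model; here a 'frame'
    is just a dict with any detections pre-labelled."""
    return frame.get("objects", [])

def downlink(frames):
    """Compare raw-video bytes against signal-only bytes."""
    raw_bytes = len(frames) * FRAME_BYTES
    messages = []
    for i, frame in enumerate(frames):
        for obj in detect_objects(frame):
            messages.append(json.dumps({"frame": i, "object": obj}))
    sent_bytes = sum(len(m.encode()) for m in messages)
    return raw_bytes, sent_bytes

# 100 frames of footage, only one of which contains anything relevant
frames = [{"objects": []}] * 99 + [{"objects": ["vehicle"]}]
raw, sent = downlink(frames)
print(f"raw video: {raw:,} B -> signals only: {sent} B")
```

The savings scale with how rarely anything tactically relevant appears in the footage, which is exactly the ISR case described above.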
For instance, real time video feeds from the electro-optical and infrared sensors on board an Air Force Predator, Reaper or Global Hawk could go directly into an F-35 cockpit, without needing to go to a ground control station. This could speed up targeting and tactical input from drones on reconnaissance missions in the vicinity of where a fighter pilot might want to attack.
In fast moving combat circumstances involving both Air-to-Air and Air-to-Ground threats, increased speed could make a large difference. Additionally, drones could be programmed to fly into heavily defended or high risk areas ahead of manned fighter jets in order to assess enemy air defenses and reduce risk to pilots.
Unlike ground robotics, where autonomy algorithms have to contend with moving quickly around unanticipated developments and other moving objects, simple autonomous flight guidance is much more manageable and easier to accomplish: there are often fewer obstacles in the air than on the ground, so drones can be programmed more easily to fly toward certain pre-determined locations.
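In an obstacle-free sky, flying toward pre-determined locations reduces to steering along the bearing to the next waypoint until an arrival radius is reached. The sketch below shows that idea for a point-mass "drone"; the speeds, radii, and route are made-up values for illustration.

```python
# Toy waypoint-following sketch: step a point-mass drone through a list
# of pre-determined waypoints, moving along the bearing to each one.
from math import hypot

def fly_route(start, waypoints, speed=0.5, arrive=0.1, max_steps=10_000):
    """Advance toward each waypoint in order; a waypoint counts as
    reached once the drone is within the arrival radius."""
    x, y = start
    for wx, wy in waypoints:
        for _ in range(max_steps):
            dx, dy = wx - x, wy - y
            dist = hypot(dx, dy)
            if dist <= arrive:
                break  # waypoint reached, head for the next one
            step = min(speed, dist)  # don't overshoot the waypoint
            x += step * dx / dist
            y += step * dy / dist
    return x, y

final = fly_route((0.0, 0.0), [(3.0, 4.0), (6.0, 0.0)])
print(final)  # ends at (or within 0.1 of) the last waypoint
```

Contrast this with a ground robot, which would need obstacle detection and replanning inside that inner loop – precisely the extra difficulty the paragraph above describes.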
At the same time, unanticipated movements, objects or combat circumstances can easily occur in the skies as well.
“The question is what happens when you have to react more to your environment and a threat is coming after you,” he said.
As a result, scientists are now working on advancing autonomy to the point where a drone can, for example, be programmed to spoof a radar system, see where threats are, and more quickly identify targets independently.