Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between 2020 and 2070, and is an award-winning futurist and author of the “Codex of the Future” series. Regularly featured in the global media, including AP, BBC, Bloomberg, CNBC, Discovery, RT, Viacom, and WIRED, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry, and society is unparalleled. Recognised for the past six years as one of the world’s foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multi-nationals, and regulators around the world envision, build, and lead an inclusive, sustainable future. A rare talent, Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew's clients include three Prime Ministers and several governments, as well as the G7, Accenture, Aon, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
WHY THIS MATTERS IN BRIEF
If you had the world’s most sophisticated drone wouldn’t full hunter-killer automation be too tempting to ignore?
Recently the United Nations announced that it believed it had seen the first example of an autonomous Turkish Hunter-Killer (HK) drone hunting down and killing soldiers on the battlefield in Libya. That news caused ripples all over the world as countries everywhere eye up a future that could ultimately be dominated by all manner of fully autonomous war machines and weapons – with few to no humans in the loop.
AI has been beating top fighter pilots for some years now – convincingly – and recently the USAF made an AI an official crew member on board a prototype U-2 spy plane, so it’s no surprise that the Pentagon’s Joint Artificial Intelligence Center has announced it has awarded a $93.3 million contract to General Atomics, makers of the famous MQ-9 Reaper, to equip the drone with new AI technology – a move that inevitably puts it on a path to becoming the US military’s first fully autonomous capable HK drone.
Initially though the aim is for the Reaper to be able to carry out autonomous flight, decide where to direct its battery of sensors, and to recognize objects on the ground. The contract, announced at the end of last month, builds on a successful test earlier this year.
In some ways this is not a major development, more of an incremental step using existing technology. What makes it significant though is the drone that is being equipped, and what it will be able to do afterwards.
Military drones are notoriously backward when it comes to on-board intelligence, even compared to their tiny cousins in the consumer world. You can buy a drone like the Skydio 2 which can carry out a complete flight on its own, taking off and locking on to the owner to autonomously shoot video of them while they surf, ski or skateboard, then landing automatically afterwards. By contrast, military drones need a remote pilot to take off and land, and a payload operator to point the cameras and other sensors at the target, not to mention launch missiles.
The biggest drone manpower requirement is PED – Processing, Exploitation and Dissemination – the teams of analysts who look through hours and hours of high-resolution video, trying to determine whether people on the ground are mending a pothole or planting an IED, whether someone is carrying a mortar tube, an RPG or just a length of pipe, and similar challenges. This is something AI, especially machine learning, could help with.
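To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of machine-learning triage that could cut the PED workload: a detector scores each video frame, and only frames above a confidence threshold are queued for a human analyst. The `Frame` structure, the scores, and the threshold are all invented for this example – this is not any real military or SRC system.

```python
# Illustrative only: ML-assisted triage of a surveillance video feed.
# The detector scores are a hypothetical stand-in for a trained model's output.
from dataclasses import dataclass


@dataclass
class Frame:
    timestamp: float      # seconds into the video feed
    object_score: float   # assumed detector confidence (0-1) that the frame
                          # shows an object of interest


def triage(frames, threshold=0.8):
    """Return only the frames an analyst actually needs to review."""
    return [f for f in frames if f.object_score >= threshold]


# Ten hours of video, one scored frame per second; an "interesting"
# detection appears once an hour in this made-up feed.
feed = [Frame(float(t), 0.95 if t % 3600 == 0 else 0.1) for t in range(36000)]
flagged = triage(feed)
print(f"{len(flagged)} of {len(feed)} frames flagged for review")  # -> 10 of 36000
```

The point is the ratio: instead of watching ten hours of footage, the analyst reviews a handful of machine-flagged moments.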
Agile Condor, which has been in development by the Air Force Research Laboratory for some years, is effectively a flying supercomputer – “high-performance embedded computing” – optimized for AI applications. Built by SRC Inc., it packs the maximum computing capacity into the minimum space, with the lowest possible power requirements. Its modular architecture is built around machine learning, suggesting a lot of GPUs or other processors optimized for parallel processing, and the makers anticipate upgrades to neuromorphic computing hardware which mimics the human brain.
One of the key aims of Agile Condor is speeding up the PED process. At present, the mass of data is beamed back to an operations center, as far as the bandwidth allows, and then pored over to extract information. Agile Condor’s AI should be able to do all that instantly at the edge of the network without needing to send the data anywhere.
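A back-of-the-envelope comparison shows why processing at the edge matters so much for a drone on a constrained satellite link. All of the numbers below are illustrative assumptions, not published Reaper or Agile Condor figures.

```python
# Illustrative arithmetic: backhaul needed with and without edge processing.
# All constants are assumptions made up for this sketch.

RAW_VIDEO_MBPS = 10.0    # assumed high-resolution sensor feed, megabits/sec
ALERT_BYTES = 1_000      # assumed size of one on-board detection report
ALERTS_PER_HOUR = 20     # assumed rate of anomalies worth reporting


def backhaul_mb_per_hour(edge_processing: bool) -> float:
    """Megabytes that must cross the data link per hour of flight."""
    if edge_processing:
        # Only small detection reports are transmitted.
        return ALERTS_PER_HOUR * ALERT_BYTES / 1e6
    # The entire raw feed is streamed back: bits/sec -> megabytes/hour.
    return RAW_VIDEO_MBPS * 3600 / 8


print(f"raw feed : {backhaul_mb_per_hour(False):,.0f} MB/hour")  # -> 4,500 MB/hour
print(f"edge AI  : {backhaul_mb_per_hour(True):.2f} MB/hour")    # -> 0.02 MB/hour
```

Under these assumed numbers the difference is five orders of magnitude, which is why SRC can talk about decisions in near real-time rather than hours or days.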
“Instead of taking hours, sometimes days or even weeks – decisions can now be made in near real-time. If the system detects an anomaly on the ground, war fighters are alerted within minutes, allowing them to investigate and act while it’s still relevant,” according to SRC’s page on Agile Condor.
Obviously the Pentagon’s announcement also opens up the possibility of the Reaper operating on its own. An Air Force slide of the Agile Condor concept of operations shows the drone losing both its communications links – which is what happened with the Turkish HK drone – and GPS navigation at the start of its mission. An existing Reaper would circle in place or fly back to try to re-establish communications; the AI-boosted version navigates using landmarks to find the target area – as well as spotting threats on the ground and changing its flight path to avoid them.
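The landmark-navigation fallback can be sketched in a few lines. In reality the hard part is matching camera imagery against terrain databases; here that step is skipped, and the landmark map, the sensor offsets, and the averaging of position fixes are all invented for illustration.

```python
# Minimal sketch of landmark-based position fixing when GPS is lost.
# Landmarks and measured offsets are hypothetical; a real system would
# recognise landmarks in imagery before any of this arithmetic applies.
from statistics import mean

# Known landmark positions on a pre-loaded map (x, y in km).
LANDMARKS = {"bridge": (10.0, 5.0), "tower": (12.0, 9.0)}

# Offsets from the drone to each recognised landmark, as measured by sensors.
observed = {"bridge": (2.0, 1.0), "tower": (4.0, 5.0)}


def estimate_position(landmarks, observations):
    """Each sighting implies drone = landmark - offset; average the fixes."""
    fixes = [(landmarks[name][0] - dx, landmarks[name][1] - dy)
             for name, (dx, dy) in observations.items()]
    return (mean(x for x, _ in fixes), mean(y for _, y in fixes))


print(estimate_position(LANDMARKS, observed))  # -> (8.0, 4.0)
```

Averaging multiple fixes is the simplest way to dampen sensor noise; with consistent measurements, as here, every landmark implies the same position.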
The Agile Condor test was evidently a success, and the new contract suggests that the Air Force wants more of this.
“This will bring a tremendous increase in unmanned systems capabilities for applications across the full-range of military operations,” said GA-ASI Vice President of Strategic Development J.R. Reid in a press release.
Of course, when most people think about Reaper operations they think about drone strikes on terrorists and insurgents. When it comes to autonomous weapons, the Pentagon’s official stated policy is that a human operator will always make the firing decision, but needless to say this policy has some flexibility: it simply demands “appropriate levels of human judgment,” whatever that means. And policies change … especially when retaining your military advantage is on the cards, which is why very few countries are signing up to the United Nations’ initiative to ban the development and use of autonomous weapons.
Nobody is suggesting that the AI-equipped Reapers will be carrying out autonomous strike missions when communication is impossible, just yet, but this might become a tempting possibility as the technology evolves. And, as others have previously noted, the US has changed its stance on Reaper exports, leading to a slew of recent deals including with the UAE and even Morocco. If an AI upgrade is available, there are likely to be plenty of takers. The capability would certainly keep the Reaper ahead of the growing competition from Chinese, Turkish and Israeli drone makers, but the US might have little say in how it was used.