Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between 2020 and 2070, and an award-winning futurist and author of the “Codex of the Future” series. Regularly featured in the global media, including AP, BBC, CNBC, Discovery, RT, and Viacom, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry, and society is unparalleled. Recognised for the past six years as one of the world’s foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multi-nationals, and regulators around the world envision, build, and lead an inclusive, sustainable future. A rare talent, Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew’s clients include three Prime Ministers and several governments, including the G7, as well as Accenture, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
WHY THIS MATTERS IN BRIEF
- Jurors often have to rely on reconstructed scenes to help them visualise how a crime unfolded; now, using virtual reality, they can see and move around the actual scene days, months, or even years after the crime took place
A physical crime scene may be the main topic under investigation in a criminal trial, but despite its importance, jurors are often reliant on less-than-satisfactory second- and third-hand information about exactly what the scene looked like and, most importantly, what took place there.
“The problem with today’s crime scene reconstruction practices is that they usually involve still photography, hand drawn sketches and, in rare cases, videography,” says Mehzeb Chowdhury, a PhD researcher in forensic science and criminal investigations at Durham University. “Experts will then bring 3D rendered crime scene animations, which are created and rendered using a combination of the still images and sketches, to court. This is an approximation of reality, not reality itself. Juries are bamboozled by conflicting crime scene recreations, as each side presents its own version of the crime scene, and where the evidence was found.”
So what’s Chowdhury’s solution? A robot that lets jurors explore crime scenes for themselves using virtual reality. His MABMAT robotic imaging system records 360-degree HD video using a NASA-inspired rover unit that can autonomously roam a crime scene while it’s being investigated, capturing every salient detail.
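The article doesn’t describe how MABMAT actually navigates, but low-cost autonomous rovers of this kind commonly run a simple sense-decide-act loop driven by a range sensor, with the microcontroller handling motors while a companion board handles the camera. A hypothetical sketch of the decision step, in Python for illustration (the threshold and action names are assumptions, not details of Chowdhury’s design):

```python
import random

def next_move(front_cm, safe_cm=30):
    """One step of a naive roaming policy: drive forward while the path
    ahead is clear, otherwise turn toward a random side so the rover
    keeps exploring rather than getting stuck against an obstacle."""
    if front_cm > safe_cm:
        return "forward"
    return random.choice(["turn_left", "turn_right"])

# With a clear path the rover keeps moving; when blocked it turns.
print(next_move(120))  # → forward
print(next_move(12))   # → turn_left or turn_right
```

On real hardware the range reading would come from something like an ultrasonic sensor and the returned action would be translated into motor commands; this sketch only shows the control logic.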
The rover is built on a combination of two low-cost boards, an Arduino micro-controller and a Raspberry Pi, and runs on open source software. The VR footage it records can then be viewed at a later date using anything from a dedicated headset down to a low-end smartphone in a Google Cardboard viewer.
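The article doesn’t specify how the 360-degree footage is stored, but 360 video is most commonly encoded as an equirectangular projection, and a Cardboard-style viewer works by sampling a window of each frame around the wearer’s gaze direction. A minimal sketch of that gaze-to-pixel mapping (function name and frame size are illustrative, not taken from MABMAT):

```python
import math

def gaze_to_pixel(yaw, pitch, width, height):
    """Map a viewing direction to a pixel in an equirectangular frame.

    yaw:   horizontal angle in radians, -pi (left) to +pi (right), 0 = centre
    pitch: vertical angle in radians, -pi/2 (down) to +pi/2 (up)
    Returns integer (x, y) pixel coordinates in the frame.
    """
    x = int((yaw / (2 * math.pi) + 0.5) * (width - 1))
    y = int((0.5 - pitch / math.pi) * (height - 1))
    return x, y

# Looking straight ahead lands in the centre of a 4K 360 frame.
print(gaze_to_pixel(0.0, 0.0, 3840, 1920))  # → (1919, 959)
```

Because the mapping is just arithmetic per pixel, even a low-end smartphone GPU can warp the window in real time, which is what makes the Cardboard route viable for jurors.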
This isn’t the first time VR has been considered as a way to let juries examine crime scenes. Previous attempts have explored everything from lasers and video game engines, used to model individual crime scenes after the fact, to Hollywood-style green screens, but there are two main differences with Chowdhury’s concept.
The first is the price point. Even taking the camera and robot into account, the entire system costs less than $400, and the cost of creating the VR scenes the jurors explore is similarly low.
“If we had gone for the traditional means of VR content creation, such as 3D scanning a crime scene, texturing it, and then using a gaming engine to render the world, it would have been too expensive,” Chowdhury said. “Today’s VR-ready computers require massive amounts of processing power, expensive graphics cards, and headsets with eye movement and head tracking capabilities.”
The second, perhaps more crucial, point is that Chowdhury’s entire mission was to take subjective guesswork out of the courtroom. Had he relied on recreating scenes from eyewitness reports or CCTV, there would have been the possibility of bias being introduced.
“Unlike 3D recreations, my system would provide jurors with the real representations of how things were, rather than a user created propaganda video to sway the jury,” he continued. “The most problematic aspect of crime scene visits is that, with time, every characteristic of the scene changes in some way or the other. This is called scene degradation. Years could pass between a crime being committed and a jury scene visit, with very little remaining the same. A contemporaneous snapshot of the entire crime scene would preserve the necessary details for investigation and trial.”
As for how long it will be before tools like this can be deployed, Chowdhury suggested it might be sooner than you think.
“Realistically the system is a few months away from field testing,” he said. “The plan is to work with police departments in the UK and US, and so far around 50 police forces from these two countries have already participated. The ideal scenario would be to collaborate with them full time. Unlike other projects looking at similar technology, this one is self-funded, with months spent in my own garage. The system is on track for testing in the next few months, but further development will depend on the support it gets.”