WHY THIS MATTERS IN BRIEF
If we had to hand-code every behaviour a robot needs, the task would never end. But if robots could learn the way humans do, without needing to be explicitly programmed, it would be a game changer.
Alphabet’s X moonshot division, formerly known as Google X, have unveiled what they’re calling the Everyday Robot project, whose goal is to develop so-called “General Purpose Robots” – in short, robots that use cameras, sensors, and complex Artificial Intelligence (AI) machine learning algorithms to learn how to do things the way we humans do, by experiencing and interacting with the world around them, without needing to be explicitly programmed. And when you consider that elsewhere we’re creating robots that can share and combine code with one another, much as animals do with DNA, in order to evolve, that we’ve already created the first self-evolving and self-manufacturing robots, and that we have a path to creating advanced shape shifting and polymorphic robots that can change shape using a variety of sci-fi like technologies and materials, I think we can firmly say that the future of robotics is going to be anything but boring.
At a high level Alphabet X’s plan sounds very familiar because it’s the same goal OpenAI are chasing – they recently put a robot hand through thousands of years’ worth of synthetic training in a simulated environment to create the world’s most dexterous robot hand, one that can even solve a Rubik’s cube one handed. And it’s interesting because it means that one day you might be able to unbox a new robot at home and it will simply learn what needs to be done, without having to be programmed or continually updated. Furthermore, when that robot is plugged into a robot hive mind, like the one Google created in 2017, it could teach all of its newfound skills to every other connected robot on the planet – all of which is, needless to say, a game changer.
In X’s case the team are first testing robots that can help out in workplace environments, although right now these early robots are focused on learning how to sort trash, much like Wall-E in the movie.
Courtesy: Everyday Robot
Here’s a GIF of one of the robots moving a recyclable can from a compost pile to the recycling pile – watch how it grasps the can…
Courtesy: Everyday Robot
The concept of grasping something comes pretty easily to most humans, but it’s a very challenging thing to teach a robot, which is why Everyday Robot’s machines get their practice in both the physical world and the virtual one.
In a tour of X’s offices, Wired described a “playpen” where nearly 30 of the robots, supervised by humans, spend their daytime hours sorting trash into trays for compost, landfill, and recycling. At night, Everyday Robot has virtual versions of the robots practise grabbing things in simulated buildings. That simulated data is then combined with the real world data and pushed to the robots in a system update every week or two.
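To make that day/night training loop a little more concrete, here’s a minimal, purely illustrative Python sketch of how real playpen episodes and overnight simulated episodes might be blended into a single training batch before a fleet-wide update. Every name here – the `Episode` record, the collection functions, the real-versus-sim weighting – is a hypothetical stand-in for illustration, not X’s actual pipeline.

```python
import random
from dataclasses import dataclass

@dataclass
class Episode:
    """One recorded grasp attempt: observation features plus a success label."""
    features: list[float]   # toy stand-in for a camera/sensor embedding
    success: bool
    source: str             # "real" or "sim"

def collect_real_episodes(n: int) -> list[Episode]:
    """Stand-in for a day of human-supervised trash sorting in the playpen."""
    return [Episode([random.random() for _ in range(4)], random.random() > 0.4, "real")
            for _ in range(n)]

def collect_sim_episodes(n: int) -> list[Episode]:
    """Stand-in for an overnight batch of grasps in simulated buildings."""
    return [Episode([random.random() for _ in range(4)], random.random() > 0.3, "sim")
            for _ in range(n)]

def build_training_batch(real: list[Episode], sim: list[Episode],
                         real_weight: float = 0.5) -> list[Episode]:
    """Blend real and simulated experience into one shuffled batch.

    Simulation is cheap, so there is far more of it; up-sampling the scarce
    real episodes keeps the learned policy grounded in physical-world data.
    """
    k = int(len(sim) * real_weight / (1 - real_weight))
    upsampled_real = random.choices(real, k=max(k, len(real)))
    batch = upsampled_real + sim
    random.shuffle(batch)
    return batch

if __name__ == "__main__":
    real = collect_real_episodes(200)      # a day in the playpen
    sim = collect_sim_episodes(10_000)     # a night in simulation
    batch = build_training_batch(real, sim)
    print(f"training batch: {len(batch)} episodes "
          f"({sum(e.source == 'real' for e in batch)} real)")
    # In the real pipeline this batch would retrain the grasping policy, and
    # the updated weights would be pushed to the fleet every week or two.
```

The design idea the sketch tries to capture is simply that simulated experience is plentiful while real-world experience is scarce, so the real episodes are up-sampled to stop the simulator from drowning out what the robots learned with their actual hands.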
With all that practice, X says the robots are getting pretty good at sorting, putting less than 5 percent of trash in the wrong place. X’s human staff, by comparison, put 20 percent of trash in the wrong pile, so it’s a real step up.
That doesn’t mean they’re anywhere near ready to replace human janitors, though. Wired observed one robot grasping thin air instead of the bowl in front of it and then attempting to put the non-existent “bowl” down, while another lost one of its “fingers” during the demo. Engineers also told Wired that, at one point, some robots refused to move through a building because certain types of light caused their sensors to hallucinate holes in the floor. Trippy!
Everyday Robot lead Hans Peter Brondmo told Wired that he hopes to one day build a robot that can assist the elderly, but he also acknowledged that something like that might be a few years out, so for now it seems the robots will keep getting better at sorting trash.