Matthew Griffin, described as "The Adviser behind the Advisers" and a "Young Kurzweil," is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between 2020 and 2070, and is an award-winning futurist and author of the "Codex of the Future" series. Regularly featured in the global media, including AP, BBC, CNBC, Discovery, RT, and Viacom, Matthew's ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry, and society is unparalleled. Recognised for the past six years as one of the world's foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multi-nationals, and regulators around the world envision, build, and lead an inclusive, sustainable future. A rare talent, Matthew's recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world's largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew's clients include three Prime Ministers and several governments, including the G7, as well as Accenture, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
WHY THIS MATTERS
Teaching robots to dance helps researchers fine-tune robots' motor skills so they can become more agile and dexterous.
At IROS in Madrid, Marc Raibert from Boston Dynamics, one of the world's most advanced robotics companies, showed a few videos during his keynote presentation. One was of Atlas, the world's most advanced humanoid robot, running through parks and doing parkour, which I wrote about a while ago, and the other was a brief clip of SpotMini, Boston Dynamics' robot dog sensation, dancing.
Now, though, the company has posted a new video of SpotMini, which they're increasingly referring to as simply "Spot," dancing to Uptown Funk and frankly displaying more talent than the original human performance. The twerking is cute, but gets a little weird when you realize that SpotMini's got some eyeballs back there as well.
Dog’s got moves
While we don't know exactly what's going on in this video, as with many of Boston Dynamics' videos, my guess would be that these are a series of discrete, scripted behaviours that are played in sequence. They're likely interchangeable and adaptable to different beats, but, again, as with many of their videos, it's not clear to what extent these dancing behaviours are autonomous and off the cuff, or how Spot would react to a different song.
Oddly, though, the guys and gals at ETH Zurich thought this would be an ideal time to write a paper on "Real-Time Dance Generation to Music for a Legged Robot." Why write a paper on this, you ask… well, apparently it's because people like dancing, and the researchers figure that should automatically mean people like dancing robots too. We'll see.
"Dance, as a performance art form, has been part of human social interaction for multiple millennia. It helps us express emotion, communicate feelings and is often used as a form of entertainment. Consequently, this form of interaction has often been attempted to be imitated by robots," says the paper, before adding, "The researchers' goal with [this work] is to bridge the gap between the ability to react to external stimuli, in this case music, and the execution of dance motions that are both synchronized to the beat, and visually pleasing and varied."
The first step to creating a dancing robot is a beat tracker, which requires the robot to listen to music for between 1 second and 8 seconds, depending on how dominant the beat is. Once the beat is detected, a dance choreographer program builds dance moves by choosing dance motion “primitives” from a library. These primitives include parameters like repeatability, duration, trajectory amplitude, and tempo range, to help the program keep the overall choreography interesting. The overall idea is to emulate improvisation, in the same way that human dancers adaptively string together a series of different moves with transitions in between.
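The choreographer step described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the ETH Zurich implementation: the primitive names, parameter values, and the greedy selection strategy are all assumptions, chosen only to show how parameters like repeatability, duration, and tempo range could drive an "improvised" sequence.

```python
import random

# Hypothetical primitive library; the parameters mirror those the paper
# mentions (repeatability, duration, amplitude, tempo range), but the
# names and numbers here are purely illustrative.
PRIMITIVES = [
    {"name": "head_bob",  "repeatable": True,  "beats": 2, "amplitude": 0.2, "tempo": (60, 180)},
    {"name": "side_step", "repeatable": True,  "beats": 4, "amplitude": 0.5, "tempo": (80, 140)},
    {"name": "twerk",     "repeatable": False, "beats": 4, "amplitude": 0.8, "tempo": (90, 130)},
]

def choreograph(bpm, total_beats, rng=random):
    """Greedily string together primitives whose tempo range fits the
    detected BPM, skipping immediate repeats of non-repeatable moves."""
    sequence, used, last = [], 0, None
    while used < total_beats:
        options = [p for p in PRIMITIVES
                   if p["tempo"][0] <= bpm <= p["tempo"][1]
                   and (p["repeatable"] or p is not last)]
        if not options:
            break  # no primitive fits this tempo
        move = rng.choice(options)
        sequence.append(move["name"])
        used += move["beats"]
        last = move
    return sequence
```

Calling `choreograph(120, 16)` would return a random-looking but tempo-appropriate sequence of move names, which is roughly the kind of variation-with-constraints that the paper frames as improvisation.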
Once the robot has an idea for the dance moves that it wants to perform, the really tricky bit is performing those moves such that they properly land on the beat. For example, depending on how the robot’s limbs are positioned, the time that it takes for it to take a step in a particular direction can change, meaning that the robot has to predict how long each dance move will take to execute and plan its timing accordingly to compensate for the delay caused by actuators and motion and physics and all that stuff. Spot, for one, seems talented enough that it’s able to keep the beat even when it’s being shoved around, or when the surface that it’s standing on changes abruptly.
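The timing-compensation idea can be shown with a minimal sketch: to make a move land on a beat, the controller starts it early by the move's predicted execution latency. The fixed per-move latency table below is a stand-in assumption; a real controller would predict latency from the robot's current limb configuration, as the paragraph above notes.

```python
# Minimal sketch of beat-aligned scheduling, assuming a known BPM and a
# (hypothetical) per-move latency estimate in seconds.

def beat_times(bpm, n_beats, t0=0.0):
    """Timestamps (in seconds) of the next n_beats beats, starting at t0."""
    period = 60.0 / bpm
    return [t0 + i * period for i in range(n_beats)]

def schedule_moves(moves, predicted_latency, bpm, t0=0.0):
    """Return (move, start_time) pairs so each move *lands* on its beat:
    subtract the move's predicted actuation delay from the beat time."""
    schedule = []
    for beat, move in zip(beat_times(bpm, len(moves), t0), moves):
        schedule.append((move, beat - predicted_latency[move]))
    return schedule
```

At 120 BPM the beats fall every 0.5 s, so a move with a 0.25 s predicted delay aimed at the second beat would be triggered at 0.25 s. Negative start times simply mean the move would need to begin before the reference beat.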
"We have successfully built the foundation for a fully autonomous, improvising and synchronized dancing robot. Nevertheless, the robot is still limited in its ability to entertain for a long duration and appears robot-like, due to its still limited variation in dance choreography. Future work can be done to achieve our ultimate goal of building a system, which outputs natural human-like dance, with great variations in movement and seamless transitions from one song to another," the researchers conclude.
So, as the competition to create realistic dancing robots that jig to the beat all by themselves heats up, I'm sure this won't be the last video we see of a robot dancing to some odd tune. Stay tuned. If you dare.