In the blockbuster sci-fi movie Elysium, the main character, Max (Matt Damon), is fitted with an exoskeleton, a sort of body armor that augments his strength.
“Things like the Elysium exoskeleton are closer than we think,” says Aaron D. Ames, PhD, Bren Professor of Mechanical Engineering at the California Institute of Technology (Caltech) in Pasadena, California, just outside Los Angeles.
“Robotic assistive devices that help you walk,” says Ames, Caltech’s resident robotics authority, “or give you a workout, or help you do your job better might be something in the future.”
Ames is part of the team at Caltech’s new Center for Autonomous Systems and Technologies (CAST). Launched in October 2017, CAST will unite more than a dozen engineers and scientists from many disciplines to advance research on robotics, drones, driverless cars, and machine learning.
Ames also heads the Advanced Mechanical Bipedal Experimental Robotics (AMBER) Lab at Caltech, housed in the Department of Computing and Mathematical Sciences. He has spent twenty years designing algorithms and efficiency equations to teach robots to generate their own walking gait.
Ames specializes in research on humanoid robots, robots that walk on two legs the way a human does. He developed DURUS, an ultra-efficient walking humanoid robot. The SRI DURUS prototype walks on a treadmill with a heel-toe strike much like a human’s, and even wears Adidas sneakers. The AMBER Lab currently works with the custom-built robot AMBER 3M and with Cassie, built by Agility Robotics.
Professor Ames’s credentials, including a PhD in electrical engineering and computer sciences from UC Berkeley and a string of postdoctoral awards, take up a lengthy paragraph on his Caltech bio page.
But in person he doesn’t exactly fit the mold of the shy mathematician. He speaks to his students in a booming voice that hardly needs a microphone to project from the stage. And standing at over six feet, with dark brown hair, a beard, and well-muscled biceps, he could pass for someone you might run into at the gym.
He is a theoretical mathematician who lectures on Lyapunov functions and unified control frameworks. But Ames is also a science fiction fan at heart. So, speaking to a small group of students and alumni at Caltech recently, he referenced popular sci-fi movies to frame what the future might hold.
“The idea behind I, Robot, starring Will Smith, that we’ll all be living with assistive robots that take out the garbage or have feelings,” says Ames, “is not something I see near term.”
Instead, according to Ames, the most immediate changes we can expect from robots and artificial intelligence in the next twenty years will appear in how we drive, how we work, and even how we recover from injury.
Robots and artificial intelligence are both growing more complex. So while Ames’s research focuses on the robot platform, he says the real potential lies in understanding the connection between robots and AI.
Industry research seems to agree. The Boston Consulting Group (BCG), a leading research firm, conservatively projects that the robotics industry will reach $87 billion by 2025.
The sci-fi movie Minority Report, according to Ames, is Hollywood’s closer depiction of reality. The film portrays Washington, D.C. as a city of the future, with autonomous self-driving vehicles whizzing around its streets. In the film’s jaw-dropping chase scene, Tom Cruise’s self-driving vehicle constantly negotiates a pathway to its target destination while avoiding traffic jams.
The swarm technology and collision avoidance depicted in Minority Report are in fact things that Caltech’s new CAST center will be working on for years to come.
“In terms of the Google car,” he says, “we’re not quite there yet.”
According to Ames, for the immediate future there is still much work to do in building algorithms, designing efficient hardware, and integrating AI into existing robots and drones.
Today at CAST, Professor Ames will be working on this problem alongside artificial intelligence and machine learning experts.
To illustrate the current obstacles facing autonomous robots and cars, he cues up a YouTube clip of Boston Dynamics’s Atlas, a roughly six-foot-tall, 320-pound humanoid robot, striding through snow and using its upper limbs to negotiate the rough outdoor terrain.
“What the YouTube clip didn’t show,” says Ames, “was the person in the background using a joystick to drive the robot.”
Ames points to a spot in the video where the joystick operator’s shadow falls across the snow, drawing a few chuckles from the audience.
“Is that robot learning?” he asks.
Ames reminds his students that the Atlas robot is being fed commands by the joystick operator and is simply executing dynamics and control functions. The robot certainly looks cool, but the clip is not a display of artificial intelligence, machine learning, or autonomy on a robot platform.
The hope is that CAST will create an environment for the needed interdisciplinary collaboration, bringing together the state of the art in robotics, machine learning, and artificial intelligence.
The new CAST center will focus on mechanical engineering, AI, and machine learning as applied to different platforms: exoskeletons, robots, and self-driving cars.
Ames says it has been rare for systems designers like himself to interact with AI designers.
Could the next-generation humanoid robot developed at CAST one day hold the key to self-driving cars?
That just might be the million-dollar question.