Aniket Bera has been fascinated by robots since he began studying the Roomba vacuum cleaner as a doctoral student 10 years ago.
The machine, famous for its ability to turn corners and work around obstacles without human intervention, was an amazing invention, Bera thought. But instead of building a consumer appliance, he wanted to build humanoid robots that could walk, interact with people and deal with all kinds of environments, from city streets to sandy beaches.
Those robots could one day help people with household tasks, such as cooking meals and painting rooms. But that meant the robots would need to learn to co-exist with humans and understand their world.
“We were like, ‘OK, tomorrow there will be robots that will walk among us,’” said Bera, now an associate professor of computer science at Purdue University. “How would they look at humans? How would they predict human behaviors? How would they navigate in that space?”
It’s an area Bera, 35, has researched extensively, writing dozens of papers that have been cited nearly 2,000 times.
Along the way, he has also “taught” robots how to walk, dance, navigate, learn spatial surroundings, understand queries and acknowledge commands.
It’s a daunting challenge, trying to marry the hardware of robots with the software of artificial intelligence. It requires building endless mathematical models to help the robot see and move, then translating the math into software coding.
After that comes the engineering: assembling the robot with computer sensors to perceive its surroundings, a computer to process the information, and actuators to move it. Today, Bera oversees a whole suite of projects on human-robot interaction and collaboration.
In addition to teaching, Bera is also director of Purdue Computer Science’s IDEAS Lab (Intelligent Design for Empathetic and Augmented Systems).
“My lab typically works on the brains,” he said. “But a lot of times, it’s not just the brains. You need to have physical things to make sure you’re able to accomplish what you want. So we do engineering. … We bring the hardware together. We fuse different sensors to it.”
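Sensor fusion can take many forms; one classic, minimal example is a complementary filter that blends a gyroscope (responsive but prone to drift) with an accelerometer (noisy but drift-free) into a single tilt estimate. The sketch below is a generic textbook illustration, not the specific fusion the IDEAS Lab uses, and the sensor streams are synthetic.

```python
import numpy as np

def fuse_pitch(gyro_rate_dps, accel_pitch_deg, dt=0.01, alpha=0.98):
    """Blend integrated gyro rate (short-term) with accelerometer tilt (long-term)."""
    pitch = accel_pitch_deg[0]
    estimates = []
    for rate, acc in zip(gyro_rate_dps, accel_pitch_deg):
        # Trust the integrated gyro for quick motion, the accelerometer over time.
        pitch = alpha * (pitch + rate * dt) + (1 - alpha) * acc
        estimates.append(pitch)
    return np.array(estimates)

# Synthetic sensor streams: a robot torso rocking back and forth for two seconds.
t = np.arange(0, 2, 0.01)
true_pitch = 10 * np.sin(t)                                         # degrees
gyro = np.gradient(true_pitch, 0.01) + 0.5                          # rate sensor with bias
accel = true_pitch + np.random.default_rng(1).normal(0, 2, t.size)  # noisy tilt reading
print(fuse_pitch(gyro, accel)[-1], "vs true", true_pitch[-1])
```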
And the potential applications are dizzying, from helping elderly and disabled people with household tasks to supporting soldiers in battle.
It might sound a bit futuristic, but some of the world’s largest companies are racing to make it happen.
On the bandwagon
Earlier this year, tech giant Nvidia unveiled computer software that is teaching humanoid robots how to pour coffee, do calisthenics, play the drums and walk over uneven ground—all on their own, without remote operators.
Agility Robotics, a private company based in Salem, Oregon, recently opened a factory to scale up production of its humanoid robot called Digit.
The company plans to build 10,000 robots a year that can take commands from humans and make decisions about industrial production.
“Inside the world’s first humanoid factory, where robots could eventually build themselves,” said the headline from CNBC.com on Oct. 9.
Automotive company Tesla is working on a line of humanoid robots called Optimus and is trying to teach them independent tasks.
At a publicity event this month, Tesla had Optimus robots mingle with attendees. But several publications later reported that humans were remotely operating the robots, indicating that the technology was still in its early stages.
“The use of human input raises questions over the capabilities and market-readiness of the bot, which [CEO Elon] Musk said last week he expects to be ‘the biggest product ever of any kind,’” the Los Angeles Times reported.
Musk told the crowd that Optimus robots could handle many household tasks and could eventually be available to consumers for $20,000 to $30,000 each.
If this proves true, it will be a big leap in robotics.
Today, most robots are industrial helping hands, working in factories to assemble cars, package food, and make plastic and chemical products. For the most part, they cannot make decisions or react to the world around them.
That’s where researchers like Bera come in. He earned his doctorate from the University of North Carolina a decade ago and joined Purdue two years ago after a faculty stint at the University of Maryland.
Some robots, when given enough training, can even guess how you’re feeling by the way you walk. When he was at the University of Maryland, Bera helped develop an algorithm called ProxEmo, which gives robots the power to analyze your gait in real time and act accordingly.
“If somebody’s feeling sad or confused, the robot can go up to the person and say, ‘Oh you’re looking sad today; do you need help?’” Bera told Wired magazine in 2020.
The ability to judge extreme emotions could be helpful for a humanoid robot, he added.
“Like if an angry person were to walk towards it, it would give more space, as opposed to a sad person or even a happy person,” Bera said.
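The details of ProxEmo's model aren't spelled out here, but the general idea—read a person's gait, guess an emotion, and widen or narrow the robot's personal-space buffer accordingly—can be sketched with a toy heuristic. The features, thresholds and comfort radii below are illustrative assumptions, not values from Bera's work.

```python
import numpy as np

# Toy illustration of gait-conditioned personal space, not the ProxEmo model itself.
# Assumes a tracked sequence of 2D hip positions (meters) and a head-droop angle per frame.

COMFORT_RADIUS_M = {"angry": 2.0, "sad": 1.2, "happy": 1.0, "neutral": 1.0}  # assumed values

def classify_emotion(hip_xy, head_droop_deg, fps=30):
    """Very rough heuristic stand-in for a learned gait-to-emotion classifier."""
    steps = np.diff(hip_xy, axis=0)
    speed = np.linalg.norm(steps, axis=1).mean() * fps  # walking speed, m/s
    droop = float(np.mean(head_droop_deg))               # how slumped the head is, degrees
    if speed > 1.6:
        return "angry"                                    # fast, forceful gait
    if droop > 20 and speed < 0.9:
        return "sad"                                      # slow, slumped gait
    if speed > 1.2:
        return "happy"
    return "neutral"

def personal_space(hip_xy, head_droop_deg):
    """Distance the robot should keep when planning a path around this person."""
    return COMFORT_RADIUS_M[classify_emotion(hip_xy, head_droop_deg)]

# Example: a brisk, upright walker reads as "angry", so the planner keeps 2 m away.
track = np.cumsum(np.tile([0.06, 0.0], (60, 1)), axis=0)  # ~1.8 m/s at 30 fps
print(personal_space(track, head_droop_deg=np.zeros(60)))
```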
Teaching methods
His main task, as he sees it, is to help robots understand the physical world around them so they can learn how to interact in the human world and in nature.
Bera is even teaching robots dance movements to help them with coordination, cognition and perception, according to a Sept. 17 Purdue press release.
“How does a robot learn to define dancing? By watching dancers,” the release said. “To that end, Bera and his team have filmed dozens of people performing a diverse range of traditional cultural dance forms, including Latin, ballroom and folk dances.”
The next step is to use the data to train a machine-learning model to teach a robot to dance by itself or with others.
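As a rough illustration of what that training step might look like, the sketch below fits a small sequence model to predict a dancer's next pose from a short window of past poses, assuming the filmed dances have already been converted to skeleton keypoints. The joint count, window length and model architecture are assumptions for the example, not details of the lab's actual pipeline.

```python
import torch
import torch.nn as nn

class PosePredictor(nn.Module):
    """Predicts a dancer's next pose from a short window of past poses."""
    def __init__(self, n_joints=17, hidden=128):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_joints * 3, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_joints * 3)

    def forward(self, pose_window):            # (batch, frames, joints * 3)
        out, _ = self.lstm(pose_window)
        return self.head(out[:, -1])           # predicted next pose

model = PosePredictor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-ins for motion-capture windows extracted from the dance videos:
# 64 clips of 30 frames, each frame a flattened (x, y, z) skeleton.
windows = torch.randn(64, 30, 17 * 3)
next_poses = torch.randn(64, 17 * 3)

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(windows), next_poses)
    loss.backward()
    optimizer.step()
```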
Bera estimates that 50% of what he does is pure math, building the foundational principles of an activity. He spends another 20% translating the math into software code, and the remaining 30% assembling the robot so the software can run on real hardware.
Teaching a robot to do a simple task, such as cupping its hand to catch water from a stream, can also involve physics and higher-level math.
“Now as a human, I understand how fluids work,” Bera said. “I don’t necessarily understand fluid dynamics, but in looking at the world for a long time, I know how water works.”
There are two ways to teach a robot the same thing. One is to write out the equations of fluid dynamics and show the robot how to recognize water and how it behaves.
The other, Bera said, is a data-driven approach, using machine learning, where the robot gradually learns by watching.
“So we look at lots and lots of videos, and we build a mental model of what the mathematics behind that is,” he said.
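Here is a toy contrast between the two approaches, on a far simpler problem than real fluid simulation: predicting where poured water lands. The first function writes down the physics; the second never sees the equation and instead fits a model to synthetic "observations" standing in for annotated video. All numbers and the feature choice are illustrative assumptions.

```python
import numpy as np

G = 9.81  # m/s^2

# Approach 1: write down the physics. Water leaving a spout at height h (m)
# with horizontal speed v (m/s) lands this far away -- plain projectile motion.
def landing_distance_physics(h, v):
    return v * np.sqrt(2.0 * h / G)

# Approach 2: never write the equation. Fit a model to observed examples;
# here, noisy synthetic measurements stand in for annotated video frames.
rng = np.random.default_rng(0)
heights = rng.uniform(0.2, 1.5, 200)
speeds = rng.uniform(0.1, 1.0, 200)
observed = landing_distance_physics(heights, speeds) + rng.normal(0, 0.01, 200)

features = np.column_stack([speeds * np.sqrt(heights), np.ones(200)])
weights, *_ = np.linalg.lstsq(features, observed, rcond=None)

def landing_distance_learned(h, v):
    return weights[0] * v * np.sqrt(h) + weights[1]

# The learned model recovers roughly the same answer without being told the law.
print(landing_distance_physics(1.0, 0.5), landing_distance_learned(1.0, 0.5))
```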
And perhaps the next challenge is teaching the robot how to distinguish between water and other fluids, such as motor oil, apple juice, milk and alcohol.
The relatively young field of computer science is evolving so rapidly that Bera and his laboratory are racing to stay near the front.
“It’s to the point that if I did not stay on top of things for even a few weeks, I would get outdated, and it would be very hard to catch up,” he said.
The same hard work goes into teaching a robot how to walk on sand or mud and up and down uneven steps without human intervention.
And like the Roomba, the robot will sense when to turn a corner and how to react to an obstacle.
“Our human minds are very good at adapting to things from one domain to another,” Bera said. “Robots are not like that. So we are building these models so that the robots can adapt and learn.”•