BYU's newest robot can lift extremely heavy and unwieldy objects
In 2025, humanoid robots aren’t anything new — just check out some of the droids that came out of the Consumer Electronics Show last week. But most of those robots getting headlines are still limited in how they interact with people.
For BYU graduate student Curtis Johnson and his mentors in the BYU Robotics and Dynamics Lab, it’s time to make robots more helpful.
“What if the robot could push with its shoulder or push things out of the way with its whole body? That really expands its capabilities,” said Johnson, a doctoral student in mechanical engineering.
To that end, Johnson, under the tutelage of mechanical engineering professor and robotics expert Marc Killpack, has built Baloo, a robot that can interact with the world using its whole body. Baloo can lift large, unwieldy objects such as ladders, kayaks, car tires, chairs, and heavy boxes, and it does so safely because its whole structure is flexible.
If soft robots like Baloo can be taught to do helpful tasks, they could become safer working companions. With a robot made exclusively of hard components, a collision can “either break the robot or break you, whichever one is stronger,” Johnson said.
In addition to designing intelligent heavy lifters like Baloo, the Robotics and Dynamics Lab is working on collaborative techniques as part of a National Robotics Initiative project funded by the National Science Foundation. The project is a collaboration between Killpack and John Salmon, a fellow professor in the mechanical engineering department. They recently published a study providing insights into how pairs of people physically co-manipulate bulky or heavy objects, work that is foundational to their current human-robot studies.
Study coauthor Shaden Moss recently completed a master’s thesis in which he helped program a robot to follow an object that it co-manipulates with a person wearing a VR headset. Moss analyzed how the stiffness of the robot’s arm affected the way people completed cooperative tasks, such as moving a table in various directions.
“We used math to describe the tasks in virtual reality so the person can see it on the VR headset,” Moss said. “They’re the same every time, but the robot never knows that. All the robot knows is that it can feel the person start to pull on the table and so the robot will move in that direction.”
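Moss’s description amounts to a simple force-following loop: sense the person’s pull on the shared object and move with it. The sketch below is a minimal illustration of that idea, assuming a force sensor at the robot’s wrist and a velocity-controlled arm; the gains, deadband, and function names are illustrative assumptions, not the lab’s actual code.

```python
import numpy as np

# Minimal sketch of a force-following loop: the robot senses the human
# pulling on the shared object and moves in that direction. Gains,
# thresholds, and interfaces are illustrative assumptions, not the
# lab's actual implementation.

ADMITTANCE_GAIN = 0.02   # m/s of motion per newton of sensed force (assumed)
DEADBAND_N = 2.0         # ignore small forces due to sensor noise (assumed)


def follow_force(measured_force_n: np.ndarray) -> np.ndarray:
    """Map a sensed interaction force (N, x/y/z) to an end-effector
    velocity command (m/s) that moves the robot along the pull."""
    magnitude = np.linalg.norm(measured_force_n)
    if magnitude < DEADBAND_N:
        return np.zeros(3)               # treat small forces as no intent
    return ADMITTANCE_GAIN * measured_force_n


# Example: a 10 N pull along +x yields a 0.2 m/s command along +x.
print(follow_force(np.array([10.0, 0.0, 0.0])))
```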
Dallin Cordon also recently finished his master’s thesis in robotics. Using the same platform as Moss, Cordon developed a control system that mapped the displacement of the flexible robotic arm to the motion of a mobile base, enabling human-robot manipulation of a rigid object.
“No one has ever really done collaborative manipulation with a soft robot, so this is a novel approach,” Cordon said. “I think the first moment when it worked, even if it didn’t work well, was exciting because it finally did what we wanted it to do. That was electrifying.”
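Cordon’s controller, as described, turns the compliant arm’s deflection into motion of the base: when the human partner pushes or pulls the shared object, the arm bends, and the base drives to follow. The sketch below illustrates that mapping under the assumption that the arm tip’s planar displacement from a neutral pose can be measured and the base accepts velocity commands; the gains, limits, and names are hypothetical, not the published controller.

```python
import numpy as np

# Minimal sketch of a displacement-to-base-motion mapping: the flexible
# arm's deflection from its neutral pose is converted into a planar
# velocity command for the mobile base. Interfaces and gains here are
# illustrative assumptions.

BASE_GAIN = 1.5          # (1/s) base velocity per meter of arm deflection (assumed)
MAX_BASE_SPEED = 0.5     # m/s safety cap on the commanded speed (assumed)


def base_velocity_from_deflection(neutral_tip_xy: np.ndarray,
                                  measured_tip_xy: np.ndarray) -> np.ndarray:
    """Convert the arm tip's planar displacement from its neutral pose
    into a clipped planar velocity command for the mobile base."""
    deflection = measured_tip_xy - neutral_tip_xy
    command = BASE_GAIN * deflection
    speed = np.linalg.norm(command)
    if speed > MAX_BASE_SPEED:
        command = command * (MAX_BASE_SPEED / speed)   # scale down, keep direction
    return command


# Example: a 10 cm deflection toward +x drives the base at 0.15 m/s along +x.
print(base_velocity_from_deflection(np.array([0.6, 0.0]),
                                    np.array([0.7, 0.0])))
```

In practice such a mapping would likely be filtered and tuned so the base does not react to small deflections, but the core idea is a proportional, speed-limited velocity command driven by how far the compliant arm has been displaced.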