A small team of University of Tulsa researchers will soon purchase a Barrett WAM® robotic arm thanks to a $137,446 grant from the National Science Foundation. Joining Associate Professor of Mechanical Engineering Joshua Schultz, the group’s principal investigator, are Professor of Mechanical Engineering Steve Tipton and Associate Professor of Anthropology Danielle Macdonald. The title of their project is Major Research Instrumentation: Acquisition of a Lightweight 7-Axis Robotic Manipulator with Force Sensing for Archaeological and Engineering Research and Education.
The Barrett WAM® robotic arm is one of the most sophisticated robot manipulators in existence, used by many of the world’s leading robotics research groups. While TU has had a robotic research group for nearly a decade, the lack of a robot arm has limited the types of experiments that could be performed and required extensive time and effort to build fixtures and jigs. “This arm will be a creative tool that allows TU researchers to conduct novel experiments that advance our understanding of robots and autonomous systems,” noted Schultz.
The new arm is capable of moving a tool attached to its end to any position it can reach, at any angle. “We will use this arm to do experiments that move something over and over thousands of times, possibly changing it a little bit each time,” explained Tipton. One example he points to involves bending metal tubes back and forth to determine how many times they can bend before they break, knowledge that is critical for determining when, for example, tubing systems should be removed from service.
The arm will also be used to position TU’s robotic hand so that researchers can study how to pick up and move objects with that hand. It will also be deployed to bump and push soft robots to see how they behave when they collide with other objects.
Outside of its applicability for engineering research and teaching, the Barrett WAM® arm will be a boon for researchers in archaeology. “In this regard,” commented Macdonald, “we can program the robot to use stone and bone tools the way prehistoric people did. The wear traces on experimental archaeological tools used by the robotic arm will be compared to wear traces on artifacts, which will allow us to gain deeper insight into the lives of ancient people.”
Mechanical engineering doctoral student Caroline Schell and postdoctoral associate Peter Bui take us inside the emerging field of soft robotics in this experTU video. Members of Biological Robotics at Tulsa, these scholars understand what it takes to make their soft robot, fittingly named “Squishy,” respond to its environment, fragile objects and touch.
When most people think of robots, they likely envision rather hard-surfaced (often metallic), fairly durable machines. Researchers at The University of Tulsa, however, have designed and are conducting experiments with Squishy – a rather more delicate fabric-reinforced inflatable soft robot.
Postdoctoral Associate Peter (Phuc D. H.) Bui and Associate Professor of Mechanical Engineering Joshua Schultz recently published their latest soft robotics findings in Frontiers in Robotics and AI. In this article, the duo report on the development of a semilinear parameter-varying (SPV) observer (state estimator) they tested using Squishy.
What’s a soft robot?
As the name implies, a soft robot has a body made of soft materials, such as silicone. It is designed to do some tasks that a rigid robot cannot do, such as handling or touching fragile objects without damaging them or itself.
“Because of its soft and inflatable body, a soft robot is capable of safely and compliantly interacting with its surrounding environment,” explained Bui. “It can even survive a strong collision or falling from a great height.” With Squishy, a particular advantage is that it can be used to push an object using any part of its body, whereas a rigid robot typically interacts with its surroundings using some sort of tool at its tip.
Behavioral evaluation for robots
The SPV model Bui and Schultz discuss in their article advances the current state of knowledge and practice in the field of soft robotics because it accounts for Squishy’s hysteresis (i.e., the fact that the robot follows two different trajectories through the same pressures: one while inflating and one while deflating). Other researchers have not accounted for this behavior in their models.
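The effect can be illustrated with a toy model (this is not the published SPV model, and all gains below are invented for illustration): if deflection relaxes toward a target whose effective stiffness depends on whether pressure is rising or falling, an inflate-then-deflate sweep traces a loop rather than retracing a single curve.

```python
# Toy hysteresis sketch: the same chamber pressure maps to two different
# deflections depending on whether the robot is inflating or deflating.
# Gains k_inflate, k_deflate and alpha are made-up illustrative numbers.

def hysteretic_deflection(pressures, k_inflate=1.2, k_deflate=0.8, alpha=0.3):
    """Return deflections for a pressure sequence, using a
    direction-dependent stiffness to mimic hysteresis."""
    x, prev, traj = 0.0, pressures[0], []
    for p in pressures:
        k = k_inflate if p >= prev else k_deflate
        x += alpha * (k * p - x)  # relax toward a direction-dependent target
        traj.append(x)
        prev = p
    return traj

up = list(range(0, 11))          # inflate: pressure 0 -> 10
down = list(range(10, -1, -1))   # deflate: pressure 10 -> 0
traj = hysteretic_deflection(up + down)
# traj[5] (pressure 5, inflating) differs from traj[16] (pressure 5, deflating)
```

A model that ignores this direction dependence would predict a single deflection per pressure and so mispredict the pose on one of the two branches, which is the gap the SPV model addresses.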
Robotics researchers such as Bui and Schultz often use the word “perception” when discussing the measurement of a robot’s “pose” (i.e., its shape and position in a 3D space).
But perceiving the pose of a soft robot in real time is challenging because its body is a continuum, unlike the body of a rigid robot, which is composed of well-defined links and joints. The normal way of measuring a soft robot’s pose is to employ a 3D motion-tracking system. But such a system is noisy, complicated and difficult to use in real time. To address this challenge, the TU researchers devised a state estimator for Squishy based on the output from a pressure sensor.
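One simple way to sidestep the motion tracker at run time, sketched below with invented calibration numbers (this is not the authors' published method, which uses a full state estimator), is to calibrate a pressure-to-pose map once against the tracker and then evaluate that map from the onboard pressure sensor alone:

```python
import numpy as np

# Hypothetical calibration data: pressure (kPa) and tip angle (deg)
# recorded once with a motion-capture system. All numbers are invented
# for illustration.
pressures = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
tip_angles = np.array([0.0, 7.5, 16.0, 25.0, 35.5, 47.0])

# Least-squares quadratic fit of the pressure-to-pose map.
coeffs = np.polyfit(pressures, tip_angles, deg=2)

def pose_from_pressure(p):
    """Estimate the tip angle from the onboard pressure sensor alone,
    so the noisy motion tracker is not needed at run time."""
    return np.polyval(coeffs, p)
```

A static map like this cannot capture hysteresis or dynamics, which is why a proper observer that tracks the robot's state over time is needed.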
Bui and Schultz’s research has been assisted and broadened by graduate students Caroline Schell and Garrett Williamson, who have been instrumental in helping to build Squishy. The project’s co-principal investigator, Associate Professor of Mechanical Engineering Michael Keller, has contributed to creating its “smart material.”
The TU group is partnered with a team at Brigham Young University. There, co-principal investigator Marc Killpack has performed calculations and generated a Python simulation program to validate the SPV modeling approach. In addition, several Brigham Young undergraduate students have been involved in the research.
“Measuring a soft robot’s pose, velocity and acceleration is almost impossible using exclusively onboard-sensing devices,” explained Schultz. “Therefore, our work, which we discuss in the Frontiers in Robotics and AI article, entailed designing and testing an observer algorithm, the sliding mode-based SPV observer, that can estimate all those states. Our model provides the information necessary to understand Squishy’s working status and supports the control process to generate correct control actions.”
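For readers unfamiliar with the technique, a generic sliding mode observer can be sketched in a few lines. This toy version (not the published SPV observer, which also handles the robot's hysteresis) estimates position and velocity of a second-order system from position measurements by injecting the sign of the measurement error, with illustrative guesses for the gains:

```python
import numpy as np

def sliding_mode_observer(y_meas, dt=0.001, l1=5.0, l2=50.0):
    """Minimal sliding mode observer sketch: estimate position (x1_hat)
    and velocity (x2_hat) from position measurements y_meas.
    Gains l1, l2 are illustrative, not tuned values."""
    x1_hat, x2_hat, est = 0.0, 0.0, []
    for y in y_meas:
        e = y - x1_hat                      # measurement error
        x1_hat += dt * (x2_hat + l1 * np.sign(e))
        x2_hat += dt * (l2 * np.sign(e))    # velocity estimate driven by switching
        est.append((x1_hat, x2_hat))
    return est

# demo: "measured" position of a unit oscillator
t = np.arange(0.0, 5.0, 0.001)
est = sliding_mode_observer(np.sin(t))
# after a short transient, x1_hat tracks sin(t) and x2_hat approximates cos(t)
```

The appeal of the sliding mode approach is that once the position error is driven to (and held near) zero, the rapidly switching correction term carries enough information to recover the unmeasured velocity state, even though no velocity sensor exists.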
Next frontier: Machine learning
As they look to the future, Bui and Schultz are developing an embedded sensing system that can be used with Squishy in any environment. In particular, they are focused on a system that detects when something in the robot’s environment pushes it off its natural free-space path and that can feed a machine-learning algorithm to localize where that push is coming from.
“We also envision that the new sensing approach and observer designed in this research will be combined to serve the controller. The result of this will be increased reliability and the ability for the robot to perform intelligent behavior,” remarked Bui.
The research reported in this story is supported by NSF grant No. 1935312 EFRI C3 SoRo: Between a Soft Robot and a Hard Place: Estimation and Control Algorithms that Exploit Soft Robots’ Unique Abilities.