Diaries from the robot uprising
Not-so-human touch: UTA engineers teach robots how to feel through sensory-laden skin
Engineers at UT Arlington’s Research Institute (UTARI) recently won a $1.35 million grant from the National Science Foundation that will fund a project to help robots feel pressure, pain, temperature and other data through skin. Robot skin.
We’re on our way to replicants here. UTARI engineers hope to develop skin filled with sensors that would enable robots to better interact with humans as well as improve prosthetics.
Dan Popa, a UT Arlington associate professor of electrical engineering, will lead the project.
“Our goal is to make robots and robotic technology more human-like and more human-friendly,” Popa said in a statement. “Robotic devices need to be safe and better able to detect human intent.”
So far there is no known way to understand robot intent, as their cold, shiny shells fail to reveal anything known as “remorse” or “empathy.”
Popa expects the research to yield breakthroughs in how prosthetics are currently developed, creating a more realistic approach to artificial limbs.
“When someone is wearing a prosthetic, we want that prosthetic to be able to determine when a baseball is being thrown at it, then catch the ball,” Popa said.
The technology has the potential to yield exciting results. Those who have lost limbs or appendages could regain nearly all, if not all, of the subtleties that come from being able to sense things like pressure and temperature.
At the same time, the project aims to create a future where robots “will share their living spaces with humans, and, like people, will wear sensor skins and clothing that must be interconnected, fitted, cleaned, repaired and placed.”
If this future holds true, the first thing to be replaced will be humans.