People with profound motor deficits can improve their quality of life using robotic body surrogates

The robotic body surrogate (Willow Garage PR2). (A) The PR2 robot. (B) One of the robot’s 7-DoF arms, including the tactile-sensing fabric skin (gray) and foam padding (black) on the metallic gripper. (C) The base of the robot, including tactile-sensing fabric skin (blue), placed atop foam padding.

An interface system that uses augmented reality technology could help individuals with profound motor impairments operate a humanoid robot to feed themselves and perform routine personal care tasks such as scratching an itch and applying skin lotion.

The web-based interface displays a “robot’s eye view” of surroundings to help users interact with the world through the machine.

The system, described March 15 in the journal PLOS ONE, could help make sophisticated robots more useful to people who do not have experience operating complex robotic systems.

Study participants interacted with the robot interface using standard assistive computer access technologies – such as eye trackers and head trackers – that they were already using to control their personal computers.

The paper reported on two studies showing how such “robotic body surrogates” – which can perform tasks similar to those done by humans – could improve the quality of life for users.


Enabling system operation through single-button mouse-type input simplifies design and provides broad accessibility. Individuals with diverse disease or injury conditions likely have diverse and possibly changing levels of impairment. These individuals may choose to use a variety of commercially available, off-the-shelf input devices that enable single-button mouse-type input, which can be used to operate our robotic body surrogate. The many possible combinations of disease/injury, impairment, and usable computer interface are connected here by gray lines. These devices make our system accessible across a range of sources of impairment and personal preferences. Also, system developers only need to support a single mode of interaction, reducing development and support effort. Examples: (Blue line) An individual with ALS may have limited hand function and choose to use a head-tracking mouse; (Orange line) An individual with spinal muscular atrophy (SMA) may experience upper-extremity weakness and prefer the use of a voice-controlled mouse; (Green line) An individual with a spinal cord injury (SCI) may retain only voluntary eye movement and use an eye-gaze-based mouse. All three of these individuals can operate our system without modification, making it accessible across types and sources of motor impairment.

The work could provide a foundation for developing faster and more capable assistive robots.

“Our results suggest that people with profound motor deficits can improve their quality of life using robotic body surrogates,” said Phillip Grice, a recent Georgia Institute of Technology Ph.D. graduate who is first author of the paper.

“We have taken the first step toward making it possible for someone to purchase an appropriate type of robot, have it in their home and derive real benefit from it.”


The end-effector position control ring augmented reality interface with virtual preview (yellow) and goal (green) gripper displays. (A) The control ring’s rotation remains aligned with the robot’s body. (B) The control ring appears parallel to the floor to convey vertical height. (C) A yellow virtual gripper ‘previews’ commands by displaying the pose the gripper will attempt to reach if commanded. (D) A green virtual gripper displays the gripper’s current goal, and disappears once it reaches this goal.

Grice and Professor Charlie Kemp from the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University used a PR2 mobile manipulator manufactured by Willow Garage for the two studies.

The wheeled robot has 20 degrees of freedom, with two arms and a “head,” giving it the ability to manipulate objects such as water bottles, washcloths, hairbrushes and even an electric shaver.

“Our goal is to give people with limited use of their own bodies access to robotic bodies so they can interact with the world in new ways,” said Kemp.

Showing its capabilities as a body surrogate, a PR2 controlled remotely by an individual with profound motor deficits picks up a cup in a research laboratory at the Georgia Institute of Technology. Credit: Phillip Grice, Georgia Tech

In their first study, Grice and Kemp made the PR2 available across the internet to a group of 15 participants with severe motor impairments.

The participants learned to control the robot remotely, using their own assistive equipment to operate a mouse cursor to perform a personal care task.

Eighty percent of the participants were able to manipulate the robot to pick up a water bottle and bring it to the mouth of a mannequin.


Contact displays overlaid on the video interface based on data from the fabric-based tactile sensors. (A) Contact on the forearm against the table edge. (B) Contact between the robot’s base and the wheelchair. (C) Contact with the robot’s base behind the current field of view.

“Compared to able-bodied persons, the capabilities of the robot are limited,” Grice said.

“But the participants were able to perform tasks effectively and showed improvement on a clinical evaluation that measured their ability to manipulate objects compared to what they would have been able to do without the robot.”

In the second study, the researchers provided the PR2 and interface system to Henry Evans, a California man who has been helping Georgia Tech researchers study and improve assistive robotic systems since 2011.

Evans, who has very limited control of his body, tested the robot in his home for seven days and not only completed tasks, but also devised novel uses that combined the operation of both robot arms at the same time – using one arm to hold a washcloth and the other to use a brush.


The interface used to operate the robotic body surrogate. (A) ‘Looking’ mode. (B) ‘Spine’ mode. (C) ‘Driving’ mode. (D) ‘Hand position’ mode. (E) ‘Hand rotation’ mode. (F) ‘3D Peek’ depth display.

“The system was very liberating to me, in that it enabled me to independently manipulate my environment for the first time since my stroke,” said Evans.

“With respect to other people, I was thrilled to see Phil get overwhelmingly positive results when he objectively tested the system with 15 other people.”

The researchers were pleased that Evans developed new uses for the robot, combining motion of the two arms in ways they had not expected.

“When we gave Henry free access to the robot for a week, he found new opportunities for using it that we had not anticipated,” said Grice.

“This is important because a lot of the assistive technology available today is designed for very specific purposes.

What Henry has shown is that this system is powerful in providing assistance and empowering users.

The opportunities for this are potentially very broad.”

The view through the PR2’s cameras showing the environment around the robot. Clicking the yellow disc allows users to control the arm. Credit: Phillip Grice, Georgia Tech

The interface allowed Evans to care for himself in bed over an extended period of time.

“The most helpful aspect of the interface system was that I could operate the robot completely independently, with only small head movements using an extremely intuitive graphical user interface,” Evans said.

The web-based interface shows users what the world looks like from cameras located in the robot’s head.

Clickable controls overlaid on the view allow users to move the robot around in a home or other environment and to control the robot’s hands and arms.

When users move the robot’s head, for instance, the screen displays the mouse cursor as a pair of eyeballs to show where the robot will look when the user clicks.
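As a rough illustration of how a click on that video view might become a head-motion command, the sketch below (not the authors’ published code) forwards normalized click coordinates over a WebSocket connection; the endpoint URL, topic name, and message format are illustrative assumptions.

```typescript
// Hedged sketch: translating a single click on the robot's-eye-view video into a
// "look here" command. The WebSocket endpoint, topic name, and message shape are
// assumptions for illustration, not the published interface code.
const socket = new WebSocket("ws://robot.local:9090"); // hypothetical robot endpoint

const view = document.getElementById("robot-eye-view") as HTMLElement;

view.addEventListener("click", (event: MouseEvent) => {
  // Convert the click position into normalized image coordinates (0..1),
  // which the robot side could map onto a camera ray to point the head.
  const rect = view.getBoundingClientRect();
  const u = (event.clientX - rect.left) / rect.width;
  const v = (event.clientY - rect.top) / rect.height;

  socket.send(JSON.stringify({
    op: "publish",                   // rosbridge-style envelope (assumed)
    topic: "/web_interface/look_at", // illustrative topic name
    msg: { u, v },
  }));
});
```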


Fifteen participants with profound motor deficits operated the robotic body surrogate over long distances to simulate getting themselves a drink. (A) The layout of the task room at the beginning of the tasks. The bottle (left) is placed on a shelf, approximately two meters in front of the robot, and the mannequin in a wheelchair is placed nearby. The observing researcher sits in the back of the room. (B) A participant remotely retrieving the water bottle. (C) A participant reaching and rotating the grasped bottle toward the mannequin’s mouth. (D) The straw in the bottle at the center of the mannequin’s mouth, showing the small screw adhered to the magnet behind the mannequin’s mouth, indicating successful completion of the task.

Clicking on a disc surrounding the robotic hands allows users to select a motion.

When the user drives the robot around a room, lines following the cursor on the interface indicate the direction it will travel.

Building the interface around the actions of a simple single-button mouse allows people with a range of disabilities to use it without lengthy training sessions.
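To make that design idea concrete, the sketch below (an illustrative assumption, not the study’s implementation) shows how every control can be wired to nothing more than a plain click handler, so any assistive device that emulates a single-button mouse – a head tracker, eye-gaze tracker, or voice-controlled cursor – drives the interface unchanged; the mode names and sendCommand helper are hypothetical.

```typescript
// Hedged sketch: every on-screen control listens only for plain 'click' events,
// so head trackers, eye-gaze trackers, and voice-controlled cursors that emulate
// a single-button mouse all work without any device-specific code.
// The Mode names and sendCommand helper are hypothetical stand-ins.

type Mode = "looking" | "driving" | "hand-position" | "hand-rotation";

function sendCommand(mode: Mode, detail: { x: number; y: number }): void {
  // Placeholder for the network call that would forward the command to the robot.
  console.log("command:", mode, detail);
}

// No dragging, scrolling, hovering, or multi-button input is required of the user.
document.querySelectorAll<HTMLElement>("[data-mode]").forEach((control) => {
  control.addEventListener("click", (event: MouseEvent) => {
    const mode = control.dataset.mode as Mode;
    sendCommand(mode, { x: event.offsetX, y: event.offsetY });
  });
});
```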

“Having an interface that individuals with a wide range of physical impairments can operate means we can provide access to a broad range of people, a form of universal design,” Grice noted.

“Because of its capability, this is a very complex system, so the challenge we had to overcome was to make it accessible to individuals who have very limited control of their own bodies.”

While the results of the study demonstrated what the researchers had set out to do, Kemp acknowledges that improvements can be made.

The existing system is slow, and mistakes made by users can create significant setbacks.

Still, he said, “People could use this technology today and really benefit from it.”

The cost and size of the PR2 would need to be significantly reduced for the system to be commercially viable, Evans suggested.

Kemp said these studies point the way to a new type of assistive technology.

“It seems plausible to me based on this study that robotic body surrogates could provide significant benefits to users,” Kemp added.


More information: Phillip M. Grice et al., In-home and remote use of robotic body surrogates by people with profound motor deficits, PLOS ONE (2019). DOI: 10.1371/journal.pone.0212904

Provided by Georgia Institute of Technology
