
Robots that Care: Some beneficial effects of human-robot interaction in healthcare

  • Posted On: 5th June 2014

By Jeroen Arendsen & Agali Mert

Robots are a type of technology that interacts with people in a special way. They can be put to good use in many domains, of which healthcare is currently among the most important. Here we give an overview of current developments in the field of human-robot interaction and of some potential beneficial effects in healthcare. This field of scientific study – sometimes referred to as “social robotics” – is very young. Published controlled and randomized studies are still sparse and, importantly, they represent only a small fraction of a fast-moving field. This overview therefore also includes hypothesized and speculative effects.

Features

Naturally, it is essential for a robot to be able to move on its own (or be steered remotely). The “mere” feature of being able to move alongside us allows robots to capture most people’s attention far longer and more strongly than computer programs on a screen, especially if the robot walks and moves like us (with feet and hands), as do Asimo and its smaller brother Nao. These robots can navigate our man-made spaces better and engage in shared activities, such as imitation games or dances. In many robots, such as Care-O-bot, vision is supplemented with extra-human sensing capabilities (e.g. Lidar, GPS, beacons) to move around quickly and safely. Being and moving together with people is also the feature that drives “telepresence” robots, like RP-7, which can relay the conversation and attention of a distant doctor to a patient’s bedside. Robots like Telenoid are designed to enable dutiful Japanese sons to (virtually) visit their distant fathers more often without the hassle of travel.
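
To make this concrete, below is a minimal sketch of the kind of sensor-based motion guard such navigation requires: scale the robot’s forward speed by the nearest reading in a Lidar-style range scan. This is an illustration only; the function name, thresholds, and readings are hypothetical and not taken from any of the robots mentioned.

```python
# Minimal sketch of a sensor-based motion guard: stop or slow down when
# range readings (e.g. from a Lidar scan) get too close. All names and
# thresholds here are illustrative, not from any specific robot.

def safe_velocity(ranges_m: list[float], cruise_speed: float = 0.5,
                  stop_dist: float = 0.4, slow_dist: float = 1.0) -> float:
    """Scale forward speed by the nearest obstacle in the scan."""
    nearest = min(ranges_m)
    if nearest <= stop_dist:          # obstacle dangerously close: halt
        return 0.0
    if nearest < slow_dist:           # obstacle nearby: slow proportionally
        return cruise_speed * (nearest - stop_dist) / (slow_dist - stop_dist)
    return cruise_speed               # clear path: cruise

# e.g. safe_velocity([2.1, 0.8, 1.5]) returns a reduced speed
# because of the 0.8 m reading
```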

One of the oldest available means of interacting with robot animals and interactive dolls is by handling them. Having robots respond to being handled can engage people’s attention, stimulate motor activity in, for example, inactive elderly people, or generate a pleasant, soothing somatosensory experience. Paro was “designed for handling,” with a weight reminiscent of that of a human baby. Thanks to tactile sensors under its surface, it also senses being stroked on its body (which it “likes”) and touched on its whiskers (which it “doesn’t like”), and responds with sounds.
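
The touch-to-sound behavior described above can be pictured as a simple mapping from tactile events to responses. The sketch below is a toy illustration; the sensor regions, event kinds, and sound names are invented for the example, not Paro’s actual firmware.

```python
# Toy sketch of Paro-style touch-to-sound behavior: stroking the body is
# rewarded with pleased sounds, touching the whiskers with displeased ones.
# The regions, event kinds, and sound names are illustrative assumptions.

RESPONSES = {
    ("body", "stroke"): "happy_chirp",
    ("body", "poke"): "neutral_murmur",
    ("whiskers", "touch"): "annoyed_cry",
}

def respond_to_touch(region: str, kind: str) -> str:
    """Map a tactile event to a sound-bite name (default: neutral)."""
    return RESPONSES.get((region, kind), "neutral_murmur")

print(respond_to_touch("body", "stroke"))     # happy_chirp
print(respond_to_touch("whiskers", "touch"))  # annoyed_cry
```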

Vision is a very important feature for robots. Fitting one or two cameras, preferably as eyes, into a robot’s head is easy, but making sense of the video input is often challenging. Seeing where people are enables robots, like Paro and Opto-Isolator, to look at us and make eye contact. Robots can perform nonverbal actions to create a very lifelike impression, capture our attention, and set the stage for prolonged personal and emotional interaction. For example, blinking and looking away now and then, instead of staring people down, is an important mechanism for reducing tension in social interactions. Seeing also enables robots, like Bandit, to imitate and understand our actions.
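
As a rough illustration of such gaze behavior, the sketch below alternates eye contact with occasional blinks and glances away. The callbacks (person_visible, look_at, blink, look_away) and the timings are hypothetical placeholders for robot-specific routines.

```python
# Sketch of the gaze behavior described above: hold eye contact, but blink
# and glance away at intervals so the robot does not stare people down.
# Timings are illustrative; a real controller would drive eye/head motors.

import random, time

def gaze_loop(person_visible, look_at, blink, look_away,
              hold_range=(2.0, 5.0)):
    """person_visible/look_at/blink/look_away are robot-specific callbacks."""
    while True:
        if person_visible():
            look_at()                                # make eye contact
            time.sleep(random.uniform(*hold_range))  # hold gaze briefly
            blink()
            if random.random() < 0.3:                # occasionally break gaze
                look_away()
                time.sleep(random.uniform(0.5, 1.5))
        else:
            time.sleep(0.2)                          # idle until someone appears
```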

Gesturing, defined as acting with an intention to communicate, requires robots to move their body parts to address people and to understand what an action means or communicates. Automatic Gesture Recognition (AGR) is an area of development that is still immature and, at the moment, largely independent of robotics research. Some robots, like Asimo and Nao, use AGR technology to see and understand gestures. Perhaps most powerfully, gestures can be combined with speech to create a convincing talking robot. The MDS (Mobile Dexterous Social) Robot, for example, uses gestures and facial expressions to appear lively and capable of expressing emotions.
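
One classic AGR idea is template matching: compare an observed movement trajectory against stored gesture templates and pick the closest. The sketch below is a bare-bones illustration of that idea under assumed toy data; real systems use richer features and statistical or neural models.

```python
# Bare-bones illustration of template matching for gesture recognition:
# compare an observed pose trajectory to stored templates, pick the closest.

import math

def distance(traj_a, traj_b):
    """Sum of point-wise distances between two equal-length 2-D trajectories."""
    return sum(math.dist(p, q) for p, q in zip(traj_a, traj_b))

def recognize(observed, templates):
    """Return the name of the template trajectory closest to the observation."""
    return min(templates, key=lambda name: distance(observed, templates[name]))

templates = {  # toy hand-position templates (illustrative)
    "wave": [(0, 0), (1, 1), (0, 2), (1, 3)],
    "point": [(0, 0), (1, 0), (2, 0), (3, 0)],
}
print(recognize([(0, 0), (1, 1), (0, 2), (1, 2)], templates))  # wave
```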

Most robots can listen and respond to speech by using automatic speech recognition (ASR) technology, which has been developed over the last fifty years (but is still awkward in many respects). Users can control robots like i-Sobot, Robosapien, Robopet and many others with their voice, which stimulates interaction. ASR technology also includes identification of people’s voices, which Paro uses to know who is handling it and to adapt its behavior accordingly.
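
Speaker identification of the kind Paro uses can be sketched as comparing a voice embedding against enrolled profiles. Everything in the example below (the embeddings, names, and threshold) is an invented stand-in for a real speaker-ID model.

```python
# Sketch of speaker identification driving adaptive behavior: identify the
# speaker from a voice embedding, then greet familiar handlers differently.
# The 2-D embeddings below are toy stand-ins for real voice features.

import math

KNOWN_VOICES = {
    "mrs_jansen": (0.9, 0.1),   # enrolled voice embeddings (illustrative)
    "nurse_kim": (0.2, 0.8),
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v) or 1.0)

def identify(embedding, threshold=0.9):
    """Return the closest enrolled speaker, or None if nobody matches well."""
    name, score = max(((n, cosine(embedding, e)) for n, e in KNOWN_VOICES.items()),
                      key=lambda t: t[1])
    return name if score >= threshold else None

speaker = identify((0.88, 0.15))
greeting = f"Hello again, {speaker}!" if speaker else "Hello, who are you?"
print(greeting)  # Hello again, mrs_jansen!
```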

Talking is an art mastered by only a few robots. Most robots can play sound bites or use Text-To-Speech (TTS) engines to read out text, but do so without lip or mouth movements, which can appear somewhat strange and disrupt the illusion of artificial life. Robots like iCat, Kismet and MDS, however, speak with their mouths and combine this with facial expressions (MDS also integrates head, arm and hand gestures). This makes robots more equal interactive partners and gives them greater power to stimulate desirable behaviour. The iCat, for instance, has been used to encourage elderly users to participate in fitness exercises.
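
Lip-synchronized speech is often driven by mapping phonemes to mouth shapes (visemes) and playing both streams together. The sketch below assumes the TTS engine supplies phoneme timings; the mapping table and the set_mouth callback are illustrative, not any particular robot’s API.

```python
# Sketch of the lip-sync idea: map each phoneme in the TTS output to a mouth
# shape (viseme) and hold it for the phoneme's duration. The mapping below
# is a simplified illustration, not a complete phoneme inventory.

import time

PHONEME_TO_VISEME = {
    "AA": "open", "IY": "wide", "UW": "round",
    "M": "closed", "B": "closed", "F": "teeth_on_lip",
}

def lip_sync(phonemes, set_mouth, default="rest"):
    """phonemes: [(symbol, duration_s)]; set_mouth: robot-specific callback."""
    for symbol, duration in phonemes:
        set_mouth(PHONEME_TO_VISEME.get(symbol, default))
        time.sleep(duration)          # hold the shape for the phoneme's length
    set_mouth(default)                # return to a resting mouth

# e.g. lip_sync([("M", 0.08), ("AA", 0.15)], set_mouth=print)
```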

Caring

As discussed above, users can obviously handle robots, but robots can handle humans as well. The robot nurses RIBA and RI-MAN lift patients from their beds and carry them, helping to relieve the heavy physical demands placed on nurses. Likewise, nurses can strap on exoskeletons such as HAL 3 or Sarcos. Disabled people can also use (partial) exoskeletons to regain mobility or rehabilitate better (e.g. the NESS L300 Foot Drop System). Other nursing activities are also being marked out as jobs for robots. Having a robot wash patients or help them bathe (e.g. Avant Santelubain 999) can be useful because these activities can evoke feelings of shame or embarrassment in both patients and nurses (especially in societies with strict social conventions). This is even more true of assistance with sexual activities, for which robots such as Roxxxy are also available.

Robots can also help people in their Activities of Daily Living (ADL). Robot feeders, such as MySpoon, Meal Buddy, or the Mealtime Partner Dining System, help disabled people eat independently. This makes eating more satisfying and lifts a time-consuming burden from nurses. Robot arms, like Focal’s Jaco or Bridget, can help people with limited arm function to grab things, open doors, and so on. This improves their independence and self-esteem, and makes robots suitable for training ADL functioning.

Robots equipped with the many interactive features described above are also outfitted with ever more sophisticated cognitive models. Their perception of us is interpreted on pragmatic, semantic, intentional, or emotional levels. In turn, their responses, speech, gestures, and other expressive actions are shaped according to rules of empathic and social interaction. Does that mean they actually care? They act like they care and sound like they care, and so the human brain, under the right circumstances, creates meaning from this interaction.
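
A toy version of such rules might map a perceived user state to a multi-channel response, as in the hypothetical sketch below; the categories and rules are invented for illustration and do not reflect any published cognitive model.

```python
# Toy sketch of rule-based empathic response selection: classify the user's
# state, then shape speech and nonverbal channels accordingly.
# All categories, rules, and phrasings are illustrative assumptions.

def choose_response(emotion: str, intent: str) -> dict:
    """Return speech plus nonverbal channels for a perceived user state."""
    if emotion == "distressed":
        return {"speech": "That sounds hard. I'm here with you.",
                "gaze": "soft", "posture": "lean_in", "pace": "slow"}
    if intent == "request_help":
        return {"speech": "Of course, let me help you with that.",
                "gaze": "attentive", "posture": "ready", "pace": "normal"}
    return {"speech": "Tell me more.", "gaze": "neutral",
            "posture": "open", "pace": "normal"}

print(choose_response("distressed", "chat")["speech"])
```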

The Road Ahead

There are robots, like Asimo or Probo, which seem to integrate nearly all of the available features. Asimo was originally presented as a robot that could help out in a house or a hospital by, for example, bringing people coffee. After some 15 years of development, it is one of the most sophisticated and expensive robots on the market; it now serves mostly as a showcase and entertains people in Disneyland. Probo is designed as a “research platform to study cognitive human-robot interaction (cHRI) with a special focus on children” and is loaded with features that it may or may not need for actual healthcare applications. For example, children with autism, one of the stated target research groups, may prefer not to have to interpret difficult emotional expressions or the meaning of a raised trunk.

For the future, we may do well to let successful designs like Paro, Keepon, the NESS L300 Foot Drop System, or Care-O-bot inspire us. These robots share one quality: they were designed without compromise to fulfil a specific goal, and they have only the features they really need. It is just as important to identify the projects that were less successful, since they too give insight into the key elements of successful human-robot interaction. Above all, understanding the circumstances under which the human brain is willing to suspend its disbelief and accept interaction with a robot as genuine is, at this stage of robot development, a practical focus for research and development.

Jeroen Arendsen, Ph.D.
TNO Human Factors
jeroenarendsen@gmail.com

Agali Mert, M.D.
National Military Rehabilitation Center Aardenburg
The Netherlands
a.mert@mrcdoorn.nl
http://robotsthatcare.com
