Strained Collaborations: Can social robots prevent dementia?
Jeon, C., Shin, H., Kim, S., & Jeong, H. (2020). Talking over the robot: A field study of strained collaboration in a dementia-prevention robot class. Interaction Studies, 21(1), 85-110.
In a world with an increasingly aged population, how can we provide care for older adults? This is a particularly pertinent question in countries where the numbers of healthcare staff are plummeting. Governments are turning to technology as one solution to such issues. Among popular technological innovations are social robots such as Silbot, which was designed in South Korea. Silbot is branded as a “dementia-prevention robot” and is the main feature of weekly classes for aging adults. The efficacy of this “robot teacher” in preventing dementia has been measured through controlled trials that scanned the students’ brains before and after the classes.
But Chihyung Jeon and colleagues, who conducted ethnographic research on these classes, argue that there is more to their efficacy than can be shown by brain scans: the human instructor who facilitates the interactions between Silbot and the students plays a crucial role. Surprisingly, however, he remains completely invisible both in class advertisements and in the efficacy research.
(Photo from Jeon et al. 2020)
A robot cannot be social in and of itself, but is made social only through interactions with people,
as Jeon and colleagues note. Here's the catch, though: robots are often viewed only in relation to their end-users. However, things are not so simple. There is usually also a mediator, a person who sets the robot up, maintains it, responds to any technological breakdowns, and ensures that interactions with the end-users run smoothly.
In the case of Silbot, the instructor and the robot engaged in what Jeon and colleagues call strained collaboration. It's collaboration, because the human and the robot actively work together to engage the students in the class activities. It's strained, because the authority of the teacher is not evenly distributed between them. Although the advertisements for the class feature only Silbot, it is actually the human instructor who leads the class in varying degrees of collaboration with the robot, trying to overcome its social awkwardness and functional limitations.
When observing the classes, the ethnographers discovered that before the robot was even activated, the instructor gave the students a mini-lecture on what dementia was and how to avoid it. This was key to boosting the students’ motivation. Only then was Silbot invited to proceed with various cognitive exercises. But problems arose immediately: the instructions for the exercises were often unclear, and the students struggled to use the tablets to record their answers. So the instructor offered additional explanations and also helped the students with memorization strategies. Additionally, when Silbot spoke, the students regularly struggled to understand its mechanical voice. The instructor promptly solved the issue by speaking over the robot in a louder, clearer voice. In other exercises, such as a sing-along, Silbot was able to be more involved, as the students simply had to follow its movement and music. Here, the collaboration was less strained and the division of labor between the instructor and the robot smoother.
Who was the teacher, then, and who was the teacher's assistant? As Jeon and colleagues note, the tension in the collaboration between the two teachers stems from Silbot's ambiguous identity, as the robot was neither fully autonomous nor a mere teaching instrument. As they write:
“Success or failure of human-robot interactions largely depends on the mediator’s actions, both planned and improvised, based on their knowledge and experience of the other participants' physical, psychological, or technical characteristics.”
All this goes to show that robots can work well provided that a human mediator intervenes and compensates for their technical and social limitations when needed. Despite playing a key role in the efficacy of the robot, the human instructor in this case remained an “invisible technician” (Shapin 1989) – or, in other words, an “absent presence” (Law 2002) – an element that is crucial to the functioning of the technology, but whose work is rendered invisible in representations of that technology.
Other sources cited:
Law, J. (2002). Aircraft Stories: Decentering the Object in Technoscience. Duke University Press.
Shapin, S. (1989). The invisible technician. American Scientist, 77(6), 554–563.