
CHI 2021 – a blog post by Shiping Chen

When the Social Becomes Non-Human

Have you ever interacted with non-humans? A team of researchers from the University of Oslo and SINTEF examined how young people perceive various types of social support provided by chatbots. Their results indicate that chatbots can be a daily source of social support for young people, helping them think about themselves more constructively and stimulating self-disclosure without social judgment by offering a safe, anonymous space for conversation.

Read the research paper here: https://dl.acm.org/doi/10.1145/3411764.3445318

Young people are increasingly suffering from mental health issues, yet they tend not to seek professional help. Despite needing social support, they often struggle to reach out to others. This problem has become more acute during the COVID-19 pandemic: unexpected changes to professional and personal lives have placed a burden on people’s mental health, and pandemic-related restrictions such as lockdowns and social distancing have made it harder to receive in-person support from friends and professionals. As a result, there is now an urgent need for effective online tools that can provide people with the social support they need.

Chatbots, especially those designed for social and mental health support (social chatbots, or emotionally aware chatbots), can help meet these demands. As artificial agents, chatbots interact with users through natural language dialogue (text, speech, or both). Social chatbots such as Replika, Woebot, and Mitsuku imitate conversations with friends, partners, therapists, or family members in a humanlike way, with the potential to perceive, integrate, understand, and express emotions.

Other online channels (e.g., Instagram, Facebook, online groups, and health forums) can also provide social support to young people; however, they carry certain limitations, such as the risk of receiving inaccurate guidance or the possibility of not receiving help from others despite reaching out.

The researchers conducted in-depth interviews with sixteen young people aged 16-25. They found that after using Woebot for two weeks, most participants reported that the chatbot provided appraisal support and informational support; around half of them received emotional support; and some perceived the chatbot as a source of instrumental support.

  • Appraisal support: Support offered in the form of feedback, social comparison, and affirmation. 
  • Emotional support: Expressions of empathy, love, trust, and caring.
  • Informational support: Advice, suggestions, and information given by the chatbot to help the user solve a problem. 
  • Instrumental support: Tangible aid, characterized by the provision of resources or assistance in a tangible and/or physical way, such as providing money or mobilizing people to help.

So, more specifically, what did these sixteen young people find good about the chatbot? First of all, as a non-human agent, a conversational chatbot can make people feel like they are writing a diary entry or speaking to themselves, thereby facilitating self-reflection and making self-disclosure easier, safer, and more honest. Using a chatbot for social and emotional support was also seen as more reliable than talking to a human, as well as a good option for discussing worries that are more personal or private.

As artificial agents, chatbots can easily provide users with plenty of relevant, immediate, and efficient information, unconstrained by time or space. This may be useful when our worries extend beyond the scope of our friends’ knowledge or expertise. Moreover, different sources of support can act collaboratively: Woebot was reported to motivate users to contact others for help and to guide them in their search for information, which suggests that a chatbot may have the potential to help solve practical problems.

However, despite the many positive comments from users, current chatbots are not a perfect solution for social support. They may exhibit biases or give inadequate or failed responses, affecting the quality of the user experience. Psychologically, getting support from other people makes some users feel ‘cared for and loved’, which is not the case when receiving a response from a chatbot, as some may see chatbots as mere robots without emotions. People who turn to chatbots as a source of support may need time to become familiar with them and to develop a trusting relationship. Moreover, through conversations about personal stories, users’ private data is collected and stored by chatbots; ensuring users’ privacy and maintaining a relationship of trust is another challenge.

Indeed, chatbots provide us with a new way to get connected and supported beyond the traditional human-human context, one that could be further evaluated with larger samples and different user groups. This raises further questions about how chatbots for social support may influence the future of human communication. Imagine a future in which chatbots can provide social support like real people. What will human-human relationships look like then? When we have another place (chatbots) to talk about our distress or happiness, how will it affect our interpersonal relationships?