Jusletter IT

Ethical Implications of AI-Powered Chatbots

Considerations for the AI Act: A case study of Tessa

  • Authors: Dawn Branley-Bell / Johannes Feiner / Sabine Prossnegg
  • Category of articles: AI & Law
  • Region: EU
  • Field of law: AI & Law
  • Collection: Conference proceedings IRIS 2024
  • DOI: 10.38023/f8b4776d-12bd-4661-aadd-cf6e148ce73b
  • Citation: Dawn Branley-Bell / Johannes Feiner / Sabine Prossnegg, Ethical Implications of AI-Powered Chatbots, in: Jusletter IT 15 February 2024
The use of Artificial Intelligence (AI), including chatbots, is now common and is expected to become even more prevalent. This paper delves into the creation and deployment of a chatbot named Tessa. Tessa was intended to aid users’ self-assessment of symptoms indicative of eating disorders and to guide them towards relevant support services. The chatbot was designed to ease the strain on overburdened healthcare staff and to offer support to individuals who may face significant delays in accessing an in-person medical consultation. Unfortunately, despite a promising start, a recent incident demonstrated how chatbots can go wrong. This paper analyses the incident from technical, psychological, and legal viewpoints, with a specific focus on key considerations around responsibility and safeguarding of chatbots within the health domain and under the AI Act. The paper contributes to the ongoing discourse on the implications of AI-driven healthcare interventions, fostering a critical dialogue for future developments in this evolving landscape. We support regular assessment of AI interventions, improved regulation, and more stringent consideration of ethical and safeguarding issues.

Table of contents

  • 1. Introduction
  • 2. Technical Perspective – What may have gone wrong?
  • 3. Impact on human trust in AI
  • 4. Legal Perspective – Tessa and the AI Act
    • 4.1. The AI Act
    • 4.2. Chatbot Tessa within the AI Act
  • 5. Recommendations for the AI Act
  • 6. Conclusions
  • 7. References
