How do you solve a problem like Alexa?
Emotional Artificial Intelligence (EAI) is emerging as a mainstream technology. As EAI spreads across sectors, it raises novel legal and ethical concerns. This paper focuses on a recommendation-system application for entertainment and creative content delivery, namely Amazon Alexa’s use of Neural Text-to-Speech (NTTS) technology to capture and respond to users’ perceived emotions based on their voice. This technology also enables Alexa to play music based on, among other elements, its perception of users’ emotions. Given the recognised impact ‘music’ has on emotions and Alexa’s increasing involvement with major music-sector actors, we are particularly concerned about how this domain can enable the manipulation of users. We use this example to highlight problems with the approach to AI regulation taken in the proposed EU Artificial Intelligence Act (AIA).
Table of contents
- 1. Introduction
- 2. EAI, Emotions, and the “Music” factor: What’s at stake?
- 2.1. Alexa and EAI: Speech recognition systems and emotions
- 2.2. The powerful impact of music
- 3. The AIA risk-based approach & Alexa’s EAI: Legal analysis
- 3.1. Alexa as Prohibited AI?
- 3.2. Alexa as High Risk AI?
- 4. Conclusion
- Bibliography