Faculty

Arts and Humanities

Supervisor Name

Luke Stark

Keywords

artificial intelligence, emotion, judgement, education, harm

Description

Emotion AI systems are a form of artificial intelligence (AI) technology designed to measure, diagnose, and manipulate human emotion. These technologies claim to detect the "true" emotional states of their users via facial scanning, voice recordings, and physical movement tracking (Greene, 2020), and are increasingly being integrated into education with the aim of tracking student attention and engagement during lessons. However, the theories of emotion on which these technologies are built have long been disputed by scholars and psychologists (Leys, 2017). This infographic illustrates the harmful implications of using these problematic theories in the development and deployment of emotion AI technologies in the education context, ultimately arguing that such systems are disempowering to children.

Acknowledgements

I'd like to thank the USRI program for making this research opportunity possible. I'd also like to thank my supervisor, Dr. Luke Stark, for his guidance and support on this project.

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.

Document Type

Poster

Problematic Paradigms: Harmful Implications of AI Technology in Education
