Electronic Thesis and Dissertation Repository

Thesis Format

Monograph

Degree

Doctor of Philosophy

Program

Library & Information Science

Supervisor

Burkell, Jacquelyn

2nd Supervisor

Rothbauer, Paulette

3rd Supervisor

Taylor, Graham

Affiliation

University of Guelph

Abstract

This study uses folk theories to enhance human-centered “explainable AI” (HCXAI). The complexity and opacity of machine learning have made explainability a pressing need. Consumer services like Amazon, Facebook, TikTok, and Spotify have made machine learning ubiquitous in the everyday lives of the non-expert, lay public. The following research questions inform this study: What folk theories do users hold that explain how a recommender system works? Is there a relationship between users’ folk theories and the principles of HCXAI that could facilitate the development of more transparent and explainable recommender systems? Using the Spotify music recommendation system as an example, 19 Spotify users were surveyed and interviewed to elicit their folk theories of how personalized recommendations work in a machine learning system. Seven folk theories emerged: complies, dialogues, decides, surveils, withholds and conceals, empathizes, and exploits. These folk theories support, challenge, and augment the principles of HCXAI. Taken collectively, the folk theories encourage HCXAI to take a broader view of XAI. The objective of HCXAI is to move toward a more user-centered, less technically focused XAI. The elicited folk theories indicate that this will require adopting principles that address policy implications, consumer protection issues, and concerns about intention and the possibility of manipulation. As a window into the complex user beliefs that inform interactions with Spotify, the folk theories offer insights into how HCXAI systems can more effectively provide machine learning explainability to the non-expert, lay public.

Summary for Lay Audience

This study uses folk theories to enhance how artificial intelligence (AI) systems explain their behaviours. Folk theories are the beliefs people hold about how a system works. This study gathered users’ folk theories of Spotify, an advanced AI system that provides personalized music recommendations. Machine learning systems are powerful, complex, consequential, and opaque. They have difficulty explaining their actions, and as a result users have difficulty fully trusting them. The field of “explainable AI” (XAI) is about enabling machine learning systems to tell us what they did, why they did it, and why they didn’t do something else instead. The principles of human-centered explainable AI (HCXAI) place the non-expert, lay public at the center of the AI explainability challenge. The folk theories of Spotify users describe beliefs about agency, power, process, intent, and relationships. These folk theories support, challenge, and augment the principles of HCXAI. Achieving the HCXAI objective of a more user-centered, less technically focused XAI means adopting principles that address policy implications, consumer protection issues, and concerns about intention and the possibility of manipulation. As a window into the complex user beliefs that inform interactions with Spotify, the folk theories offer insights into how HCXAI systems can more effectively provide machine learning explainability to the non-expert, lay public.

Creative Commons License

This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.
