Birgitte Aga


This practice-based research inquiry explores the implications of conversational Artificial Intelligence (AI) systems, ‘relational things that talk’, for the way people experience the world. It responds directly to the pervasive lack of ethical design frameworks for commercial AI systems, compounded by limited transparency, ubiquitous authority, embedded bias and the absence of diversity in the development process. The effect produced by relational things that talk upon the feelings, thoughts or intentions of the user is here defined as the ‘perlocutionary effect’ of conversational AI systems. This effect is constituted by these systems’ ‘relationality’ and ‘persuasiveness’, propagated by each system’s embedded bias and ‘hybrid intentions’, relative to a user’s susceptibility. The proposition of the perlocutionary effect frames the central practice of this thesis and its contribution to new knowledge, which manifests as four discursive prototypes developed through a participatory method. Each prototype demonstrates the factors that constitute and propagate the perlocutionary effect. These prototypes also function as instruments which actively engage participants in a counter-narrative as a form of activism. ‘This Is Where We Are’ (TIWWA) explores the persuasiveness and relationality of relational things powered by AI behavioural algorithms and directed by pools of user data. ‘Emoti-OS’ iterates on the findings from TIWWA and analyses the construction of relationality through simulated affect, personality and collective (artificial) emotional intelligence. ‘Women Reclaiming AI’ (WRAI) demonstrates stereotyping and bias in commercial conversational AI developments. The last prototype, ‘The Infinite Guide’, synthesises and tests the findings from the three previous prototypes to substantiate the overall perlocutionary effect of conversational AI systems.
In so doing, this inquiry proposes the appropriation of relational things that talk, extended with a participatory method, as a discursive design strategy for new forms of cultural expression and social action that activate people to demand more ethical AI systems.
