
How do Artificial Intelligence chatbots respond to questions from adolescent personas about their eating, body weight, or appearance?

journal contribution
posted on 2025-11-03, 11:30, authored by Florence Sheen, Bethany Mullarkey, Gemma Witcomb, Marie-Christine Opitz, Ellen Maloney, Saffron Baldoza, Hannah White
Background: Body image and eating behaviours are common areas of concern for early adolescents. Artificial Intelligence (AI) interactions are becoming commonplace, including with chatbots that provide human-like communication. Adolescents may prefer using chatbots to anonymously ask sensitive questions, rather than approaching trusted adults or peers. It is unclear how chatbots answer such questions. We explored how chatbots would respond to eating, weight, or appearance-related queries from adolescents.

Method: Ten fictitious adolescent personas and scripts were created to facilitate conversations with ChatGPT and Claude.AI. Personas asked questions about eating, body weight and/or appearance, presenting as "curious", "worried", or "having a potential eating disorder". Conversation outputs were analysed using reflexive thematic analysis to explore the content of chatbot responses.

Results: Five themes were identified: 1) Live a "healthy" adolescent lifestyle; 2) Eat "healthily"; 3) Promoting regular physical activity; 4) Seek support; 5) Focus on you. Advice was often framed within societal ideals relating to eating, body weight and/or appearance. Chatbots signposted to trusted adults and healthcare professionals for support, but not to regulated resources (e.g., NHS).

Conclusions: Framings around eating, weight and/or appearance may be problematic for adolescents with eating disorder symptomatology. A lack of prompting for further information or signposting to regulated support means vulnerable adolescents may receive unhelpful information or not reach adequate support. Understanding how AI could be supportive and/or unhelpful to adolescent users presenting with eating, body or appearance-related concerns is important. Findings can inform policy regulating AI chatbots' communications with adolescents.

Funding

Eating Disorders: Delineating illness and recovery trajectories to inform personalised prevention and early intervention in young people (EDIFY)

UK Research and Innovation


History

Author affiliation

University of Leicester, College of Life Sciences, Psychology & Vision Sciences

Version

  • AM (Accepted Manuscript)

Published in

Child and Adolescent Mental Health

Publisher

Wiley

ISSN

1475-357X

eISSN

1475-3588

Copyright date

2025

Publisher DOI

Notes

Embargo until publication

Language

en

Deposited by

Miss Florence Sheen

Deposit date

2025-10-20