Abstract
Image-based sexual abuse (IBSA) is an umbrella term covering the nonconsensual taking, creating, or sharing of nude or sexual images or videos. Organizations supporting IBSA victim-survivors have begun integrating AI chatbots into their services. The AI Act emphasizes transparency, requiring AI systems to disclose their use of AI to users. However, experimental research on transparency in chatbot design (such as disclosing bot identity, functionality, and limitations) has yielded mixed findings, and existing studies have not explored users’ in-depth experiences.

To address this gap, we conducted semi-structured interviews with young adults (N=26, ages 18–29) to answer two research questions. RQ1: What are young adults’ positive and negative user experiences with a chatbot designed to support victim-survivors of IBSA? RQ2: What are young adults’ positive and negative user experiences with the disclosure of AI use in a chatbot designed to support victim-survivors of IBSA?

We found that young adults liked the chatbot because it was easy to use, not embarrassing to talk to, and provided both informational and emotional support. However, they disliked that its responses were often too long, sometimes irrelevant or not personalized, and that the emotional support fell short of their expectations compared to a human. Regarding the transparency design, they liked the non-transparent version because it enabled quicker access to support and fostered a greater sense of intimacy. Conversely, they appreciated the transparent version because disclosures such as “made by experts” and “protects your privacy” enhanced their trust and sense of security. However, the limitation disclosure (e.g., “I cannot understand like a real human”) drew some negative comments: it made users feel distant from the chatbot, less confident in its ability, and less inclined to engage further.
These findings suggest that while transparency can build trust, unclear or poorly framed disclosures may undermine user confidence and engagement. Further research is needed to refine communication strategies that balance trust and usability in AI-driven support systems.
| Original language | English |
|---|---|
| Title of host publication | GoodIT '25: Proceedings of the 2025 International Conference on Information Technology for Social Good |
| Publisher | ACM |
| Pages | 60-69 |
| Number of pages | 10 |
| Volume | 5 |
| ISBN (Print) | 9798400720895 |
| DOIs | |
| Publication status | Published - 9 Dec 2025 |
| Event | GoodIT '25: International Conference on Information Technology for Social Good - Antwerp, Belgium<br>Duration: 3 Sept 2025 → 5 Jan 2026<br>Conference number: 5<br>https://goodit2025.idlab.uantwerpen.be/ |
Conference
| Conference | GoodIT '25 |
|---|---|
| Abbreviated title | GoodIT '25 |
| Country/Territory | Belgium |
| City | Antwerp |
| Period | 3/09/25 → 5/01/26 |
| Internet address | https://goodit2025.idlab.uantwerpen.be/ |