AI and Caring: Exploring Trust, Loneliness, and Well-Being with ChatGPT

Roundtable presentation
Research Paper

Session description

Globally, over 900 participants shared how they use AI tools like ChatGPT for emotional support and stress relief. This session highlights benefits, risks, and ethical concerns. Educators will gain evidence-based insights and practical strategies to address well-being, loneliness, and trust while guiding students in safe, supportive AI use.

Framework

This research is informed by Salovey and Mayer’s (1990) theory of emotional intelligence, which emphasizes the ability to monitor and manage emotions in self and others, and Picard’s (1997) work on affective computing, which explores how technology can recognize and respond to human emotions. Building on these perspectives, the study examines ChatGPT as an emotionally responsive AI, investigating user trust, perceptions of empathy, and implications for well-being. This framework positions AI within the broader shift from technology-oriented to humanity-oriented design (Yang et al., 2021), while also engaging scholarship on trust in human–AI relationships (Hancock et al., 2020).

Methods

This study employed a mixed-methods design using a 16-item Qualtrics survey to investigate how individuals engage with ChatGPT for personal and emotional support. The survey included four demographic questions (gender, ethnicity, geographic location, education level), seven quantitative Likert-scale and multiple-choice items, and five open-ended questions. Quantitative items assessed familiarity with ChatGPT, frequency of use, emotional responsiveness, trustworthiness, satisfaction, and comparisons with human interaction. Open-ended questions explored perceived benefits, risks, ethical concerns, trust, and suggestions for improvement.

Participants were recruited through educational listservs, social media, and email using snowball sampling. Eligibility required participants to be 18 or older and to provide informed consent. A total of 953 individuals responded, representing diverse educational and cultural backgrounds.

Quantitative data were analyzed descriptively, while qualitative responses were coded inductively using comparative thematic analysis (Merriam & Tisdell, 2016). Coding was conducted independently by multiple researchers, with intercoder agreement reached through iterative discussions to ensure credibility and consistency. This approach produced both statistical patterns and thematic insights into participants’ perceptions of ChatGPT as a tool for emotional support and well-being.

Results

Analysis revealed that while most participants used ChatGPT primarily for informational or academic tasks, a smaller but notable group turned to it for emotional support, stress relief, and advice. Some described feeling comforted by its availability, nonjudgmental tone, and perceived empathy. Others, however, raised concerns about generic or inaccurate responses, lack of genuine empathy, privacy risks, and the potential for overreliance or emotional detachment from human relationships.

Quantitative data showed mixed perceptions of ChatGPT's emotional responsiveness: many participants acknowledged receiving thoughtful replies, yet the majority would not recommend it for emotional support. Qualitative analysis highlighted themes of preference for human connection, skepticism about AI's capacity for care, and conditional acceptance when paired with professional resources.

Taken together, these results illustrate both the potential of AI to reduce stress and loneliness and the risks of dependence, misalignment, and ethical concerns. For educators, these findings provide critical insights into how students and communities may interact with AI tools in emotionally vulnerable contexts, shaping conversations around digital citizenship, well-being, and safe classroom practices.

Importance

This study contributes to emerging scholarship on emotional artificial intelligence by examining how nearly 1,000 individuals worldwide use ChatGPT for personal and emotional support. While most AI in education research emphasizes productivity and academic uses, this work uniquely addresses well-being, loneliness, and trust. These findings illuminate a critical but underexplored dimension of technology use with direct implications for K–16 educators.

As students increasingly experiment with AI tools for connection, reflection, and stress management, understanding both benefits and risks becomes vital. By identifying patterns of reliance, skepticism, and ethical concern, this research equips educators to foster safe, responsible use of AI in classrooms and to integrate conversations about digital citizenship, emotional health, and human connection into their teaching.

References

Hancock, J. T., Naaman, M., & Levy, K. (2020). AI-mediated communication: Definition, research agenda, and ethical considerations. Journal of Computer-Mediated Communication, 25(1), 89–100. https://doi.org/10.1093/jcmc/zmz018

Merriam, S. B., & Tisdell, E. J. (2016). Qualitative research: A guide to design and implementation (4th ed.). Jossey-Bass.

Picard, R. W. (1997). Affective computing. MIT Press.

Salovey, P., & Mayer, J. D. (1990). Emotional intelligence. Imagination, Cognition and Personality, 9(3), 185–211. https://doi.org/10.2190/DUGG-P24E-52WK-6CDG

Yang, J., Guo, S., & Yu, H. (2021). From technology-oriented to humanity-oriented: A review of human-centered artificial intelligence. Computers in Human Behavior Reports, 3, 100059. https://doi.org/10.1016/j.chbr.2021.100059

Presenters

Professor, UT Tyler
Professor, The University of Texas at Tyler

Session specifications

Topic:

Artificial Intelligence

Grade level:

Community College/University

Audience:

Counselor, School Level Leadership, Teacher

Attendee devices:

Devices not needed

Subject area:

Technology Education, Other

ISTE Standards:

For Educators: Citizen, Designer
For Students: Digital Citizen

Transformational Learning Principles:

Cultivate Belonging, Elevate Reflection