Student Perspectives on Robot Teachers
Listen and learn: Research paper
Research papers are a pairing of two 18-minute presentations followed by 18 minutes of discussion led by a discussant, with the remaining time for Q&A.
This is presentation 1 of 2; scroll down for more details.
|Audience:||Curriculum/district specialists, Teachers, Technology coordinators/facilitators|
|Attendee devices:||Devices useful|
|Attendee device specification:||Smartphone: Android, iOS, Windows; Laptop: Chromebook, Mac, PC; Tablet: Windows, Android, iOS|
|Subject area:||Preservice teacher education, STEM/STEAM|
|ISTE Standards:||For Students:
|Additional detail:||Graduate student|
The proposed study is framed using a post-structuralist paradigm (Sharma, 2020), sociomaterial theory (Fenwick, 2010), Biesta’s three functions of education (Biesta, 2020), and the education ecosystem (Bandyopadhyay et al., 2021).
Post-structural research is designed around the belief that a student’s education and a classroom environment are unique and complex. Post-structuralism rejects grand narratives in favour of local ones (Grant & Giddon, 2002). For example, one classroom’s use of an agent may contradict another classroom’s use. Definitions of educational success, the requirements of educational policy and curricula, and the materials available to students will differ between classrooms, and the post-structuralist paradigm is designed for such variation (Sharma, 2020). I use post-structuralism to focus the research design on students’ perspectives rather than on an abstract definition of educational success.
Sociomaterial theory offers a grounded way of operationalising post-structuralism (Law, 2008). It explores the relationship between people (actors) and things (materials) in their environment (Fenwick, 2010). Materials can be technologies, organisations, objects and environments (Fenwick & Dahlgreen, 2015). Actors use and interpret materials such as texts, symbols, meanings and intentions (Fenwick & Dahlgreen, 2015). Actors and materials are given equal importance in explaining an event or change (Fenwick, 2010). A sociomaterial approach can be used to explore the many interactions between social systems and digital agents (Fenwick, 2012). I use a sociomaterial approach to explore how students use agents within the context of other parts of the education ecosystem.
Biesta's functions of education
Education is not just what students learn, but the reason they learn it and who they learn it from (Biesta, 2020). Biesta (2012) proposed that education can be thought of as three interconnecting functions: qualification, socialisation and subjectification. Qualification is the acquisition of disciplinary knowledge and skills, which has characterised classroom learning since its beginnings (Seldon et al., 2012). Socialisation is the situating of knowledge and skills in cultural, historical and social contexts, resulting in students being prepared to function in a given community. Biesta (2020) notes that socialisation can be intentional (for example, national curricula) or unintentional (for example, beliefs a teacher may not know they hold), but it is always present. Subjectification is the development of students’ capacity to act autonomously and make their own decisions.
An education ecosystem
The education ecosystem was first proposed by Cremin (1976) as one way to analogise how ‘... classrooms deliver education by interacting and collaborating with other parts.’ Ecosystems are defined in ecological science as ‘…systemic communities... which interact and connect...creating a complex network ...’ (Chaplin et al., 2000). Bandyopadhyay et al. (2021) define their education ecosystem as a series of living (human) biotic entities and non-human abiotic entities. The ecosystem is sociomaterial, meaning it illustrates a number of interactions between people and material objects. Biotic entities in this study are actors; abiotic entities are materials.
Theoretical frameworks and the literature inform my methods. A multiple case-study approach will be taken. Variables identified in the literature, along with those identified during data collection, will be used for a thematic analysis that will inform a discussion answering the research questions.
Data will be collected from multiple sources, including observations, agent logs, student focus groups and teacher interviews. Data from observations and agent logs will help me prepare for the student focus groups and teacher interviews, all of which will be semi-structured. The case study data collection will focus on student use of three digital agents: a smart speaker, a zoomorphic smart 'dog' and a software agent run through Minecraft.
Methods of analysis
Case study data will be coded a priori to variable groups, and emerging themes will also be identified (Johnson & Christensen, 2016). Thematic analysis will explore commonalities. An inductive approach will be used, with the data guiding the formation of axial codes, through which themes and hierarchies will be identified to help explain the observed phenomena (Braun & Clarke, 2019).
A self-nomination form will collect basic information and ask teachers if they can commit to the duration and activities of my study. I intend to use purposive sampling (Johnson & Christensen, 2016) choosing cases that are likely to provide sufficient data. To make regular visits to the classroom realistic for me, the convenience of location will be a factor in case selection. Given the small sample of four classrooms, the selection process may be limited by the requirement for case study classes to return 100% parent and student consent (N must equal n).
My study will add to the discussion on student use of, and engagement with, agents in a classroom ecosystem. It will be the largest such study in the primary school classroom. My study will also uniquely compare three agents in multiple case studies over an extended period, adding to an understanding of how use of, and engagement with, agents may change over time. My study will be useful both for educators making choices about if and how to use agents in the classroom and for the designers of EdTech agents.
My presentation aims to give useful descriptive insight and qualitative data analysis for both educators and those developing intelligent agents for education settings. It will add to an exciting body of knowledge on how students choose and use intelligent agents in the classroom, with two significant contributions. Firstly, it will be the first such study in the New Zealand primary school classroom. Secondly, it will compare three agents in multiple case studies over an extended period, increasing the variables available to study the choices students make and how those choices may change over time. My study will help educators make better choices about the tools they make available to students and the skills students may need to best utilise these agents for their education.
Al-Gahtani, S. (2014). Empirical Investigation of E-Learning Acceptance and Assimilation: A Structural Equation Model. Applied Computing and Informatics, 4.
Arksey, H., & O’Malley, L. (2005). Scoping studies: towards a methodological framework. International Journal of Social Research Methodology, 8(1), 19–32. https://doi.org/10.1080/1364557032000119616
Asimov, I. (1950). I, Robot. Fawcett Publications.
Bandyopadhyay, S., Bardhan, A., Dey, P., & Bhattacharyya, S. (2021). Education Ecosystem. In S. Bandyopadhyay, A. Bardhan, P. Dey, & S. Bhattacharyya (Eds.), Bridging the Education Divide Using Social Technologies: Explorations in Rural India (pp. 43–75). Springer. https://doi.org/10.1007/978-981-33-6738-8_3
Bartneck, C. (2004). From Fiction to Science – A cultural reflection of social robots. Proceedings of the CHI2004 Workshop on Shaping Human-Robot Interaction, Vienna.
Biesta, G. (2004). Against Learning. Reclaiming a Language for Education in an Age of Learning. Nordisk Pedagogik, 23. https://doi.org/10.18261/issn1891-5949-2004-01-06
Biesta, G. (2009). Good Education in an Age of Measurement: On the Need to Reconnect with the Question of Purpose in Education. Educational Assessment Evaluation and Accountability, 21. https://doi.org/10.1007/s11092-008-9064-9
Biesta, G. (2016). ICT and Education Beyond Learning. In E. Elstad (Ed.), Digital Expectations and Experiences in Education (pp. 29–43). SensePublishers. https://doi.org/10.1007/978-94-6300-648-4_2
Biesta, G. (2018). Creating spaces for learning or making room for education? New parameters for the architecture of education: Contemporary Visions for Education (pp. 27–39). https://doi.org/10.4324/9781315148366-3
Biesta, G. (2020a). Risking Ourselves in Education: Qualification, Socialization, and Subjectification Revisited. Educational Theory, 70(1), 89–104. https://doi.org/10.1111/edth.12411
Biesta, G. (2020b). Chapter 3: Regaining the Democratic Heart of Education. In Flip the System US: How Teachers Can Transform Education and Save Democracy (pp. 32–37). CRC Press.
Boden, M. A. (2018). Artificial Intelligence: A Very Short Introduction. Oxford University Press.
Breazeal, C., Harris, P. L., DeSteno, D., Kory Westlund, J. M., Dickens, L., & Jeong, S. (2016). Young Children Treat Robots as Informants. Topics in Cognitive Science, 8(2), 481–491. https://doi.org/10.1111/tops.12192
Broadbent, E., Tamagawa, R., Patience, A., Knock, B., Kerse, N., Day, K., & MacDonald, B. A. (2012). Attitudes towards health-care robots in a retirement village. Australasian Journal on Ageing, 31(2), 115–120. https://doi.org/10.1111/j.1741-6612.2011.00551.x
Catlin, D., Smith, J. L., & Morrison, K. (n.d.). Using Educational Robots as Tools of Cultural Expression: A Report on Projects with Indigenous
Chen, S.-C., Jones, C., & Moyle, W. (2019). Health Professional and Workers Attitudes Towards the Use of Social Robots for Older Adults in Long-Term Care. International Journal of Social Robotics. https://doi.org/10.1007/s12369-019-00613-z
Clarke, V., Braun, V., Terry, G., & Hayfield, N. (2019). Thematic analysis. In P. Liamputtong (Ed.), Handbook of research methods in health and social sciences (pp. 843–860). Springer.
Coeckelbergh, M. (2011). Talking to Robots: On the Linguistic Construction of Personal Human-Robot Relations. In M. H. Lamers & F. J. Verbeek (Eds.), Human-Robot Personal Relationships (Vol. 59, pp. 126–129). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-19385-9_16
DeFalco, J. A., Sinatra, A. M., Rodriguez, E., & Stan Hum, R. (2019). Conscientiousness, Honesty-Humility, and Analogical/Creative Reasoning: Implications for Instructional Designs in Intelligent Tutoring Systems. In S. Isotani, E. Millán, A. Ogan, P. Hastings, B. McLaren, & R. Luckin (Eds.), Artificial Intelligence in Education (pp. 52–57). Springer International Publishing. https://doi.org/10.1007/978-3-030-23207-8_10
Dennett, D. C. (n.d.). Consciousness in Human and Robot Minds. Retrieved December 21, 2020.
Dick, P. K. (2011). Do androids dream of electric sheep? Gollancz.
Dixon, S. (2004). A Brief History of Robots and Automata. TDR: The Drama Review 48(4), 16-25. https://www.muse.jhu.edu/article/175438.
Dousay, T. A., & Hall, C. (2018). “Alexa, tell me about using a virtual assistant in the classroom.” Proceedings of EdMedia: World Conference on Educational Media and Technology, 1413–1419. https://doi.org/10.1145/2872518.2888606
Fenwick, T. (2010). Re-thinking the ‘thing’: Sociomaterial approaches to understanding and researching learning in work. Journal of Workplace Learning, 22, 104–116. https://doi.org/10.1108/13665621011012898
García, D. H., Esteban, P. G., Lee, H. R., Romeo, M., Senft, E., & Billing, E. (2019). Social Robots in Therapy and Care. 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 669–670. https://doi.org/10.1109/HRI.2019.8673243
Goodrich, M., & Schultz, A. (2007). Human-Robot Interaction: A Survey. Foundations and Trends in Human-Computer Interaction, 1, 203–275. https://doi.org/10.1561/1100000005
Grant, M. J., & Booth, A. (2009). A typology of reviews: an analysis of 14 review types and associated methodologies. Health Information & Libraries Journal, 26(2), 91–108. https://doi.org/10.1111/j.1471-1842.2009.00848.x
Groom, V., & Nass, C. (2007). Can robots be teammates?: Benchmarks in human–robot teams. Interaction Studies, 8(3), 483–500. https://doi.org/10.1075/is.8.3.10gro
Hancock, P. A., Billings, D. R., & Schaefer, K. E. (2011). Can You Trust Your Robot?: Ergonomics in Design. https://doi.org/10.1177/1064804611415045
Haugsbakk, G., & Nordkvelle, Y. (2007). The Rhetoric of ICT and the New Language of Learning: A Critical Analysis of the Use of ICT in the Curricular Field. European Educational Research Journal, 6(1), 1–12. https://doi.org/10.2304/eerj.2007.6.1.1
Heerink, M., Krose, B., Evers, V., & Wielinga, B. (2009). Measuring acceptance of an assistive social robot: a suggested toolkit. RO-MAN 2009 - The 18th IEEE International Symposium on Robot and Human Interactive Communication, 528–533.
Idel, M. (1990). Golem: Jewish Magical and Mystical Traditions on the Artificial Anthropoid. State University of New York Press.
Irwin, R., & White, T. H. (2019). Decolonising Technological Futures: A dialogical tryptich between Te Haumoana White, Ruth Irwin, and Tegmark’s Artificial Intelligence. Futures, 112, 102431. https://doi.org/10.1016/j.futures.2019.06.003
Kahn, P. H., Kanda, T., Ishiguro, H., Freier, N. G., Severson, R. L., Gill, B. T., Ruckert, J. H., & Shen, S. (2012). “Robovie, you’ll have to go into the closet now”: Children’s social and moral relationships with a humanoid robot. Developmental Psychology, 48(2), 303–314. https://doi.org/10.1037/a0027033
Kanda, T., Hirano, T., Eaton, D., & Ishiguro, H. (2004). Interactive robots as social partners and peer tutors for children: A field trial. Human-Computer Interaction, 19(1–2), 61–84. https://doi.org/10.1207/s15327051hci1901&2_4
Lee, J. D., & See, K. A. (2004). Trust in Automation: Designing for Appropriate Reliance. Human Factors: The Journal of the Human Factors and Ergonomics Society, 46(1), 50–80. https://doi.org/10.1518/hfes.46.1.50_30392
Lehman-Wilzig, S. N. (1981). Frankenstein unbound: Towards a legal definition of artificial intelligence. Futures, 13(6), 442–457. https://doi.org/10.1016/0016-3287(81)90100-2
Leifler, E. (2020). Teachers’ capacity to create inclusive learning environments. International Journal for Lesson & Learning Studies, 9(3), 221–244. https://doi.org/10.1108/IJLLS-01-2020-0003
Lewis, M., Sycara, K., & Walker, P. (2018). The Role of Trust in Human-Robot Interaction. In H. A. Abbass, J. Scholz, & D. J. Reid (Eds.), Foundations of Trusted Autonomy (pp. 135–159). Springer International Publishing. https://doi.org/10.1007/978-3-319-64816-3_8
Luckin, R., Holmes, W., Griffiths, M., & Forcier, L. B. (2016). Intelligence unleashed. An argument for AI in education. Pearson. https://static.googleusercontent.com/media/edu.google.com/en//pdfs/Intelligence-Unleashed-Publication.pdf
Luckin, R. (2019). Is education ready for artificial intelligence? Machine learning and EdTec. Cambridge Summit of Education Proceedings. Cambridge, UK. https://www.cambridgeassessment.org.uk/insights/is-education-ready-ai-rose-luckin/
McLean, G., & Osei-Frimpong, K. (2019). Hey Alexa… examine the variables influencing the use of artificial intelligent in-home voice assistants. Computers in Human Behavior, 99, 28–37. https://doi.org/10.1016/j.chb.2019.05.009
Mlekus, L., Bentler, D., Paruzel, A., Kato-Beiderwieden, A.-L., & Maier, G. W. (2020). How to raise technology acceptance: user experience characteristics as technology-inherent determinants. Gruppe. Interaktion. Organisation. Zeitschrift Für Angewandte Organisationspsychologie (GIO), 51(3), 273–283. https://doi.org/10.1007/s11612-020-00529-7
Nomura, T., Suzuki, T., Kanda, T., & Kato, K. (2006). Measurement of negative attitudes toward robots. Interaction Studies, 7(3), 437–454. https://doi.org/10.1075/is.7.3.14nom
Moyle, W., Bramble, M., Jones, C., & Murfield, J. (2016). Care staff perceptions of a social robot called Paro and a look-alike Plush Toy: a descriptive qualitative approach. Aging & Mental Health, 22, 1–6. https://doi.org/10.1080/13607863.2016.1262820
Parasuraman, R., & Riley, V. (1997). Humans and Automation: Use, Misuse, Disuse, Abuse. Human Factors: The Journal of the Human Factors and Ergonomics Society, 39(2), 230–253. https://doi.org/10.1518/001872097778543886
Pu, L., Moyle, W., Jones, C., & Todorovic, M. (2019). The Effectiveness of Social Robots for Older Adults: A Systematic Review and Meta-Analysis of Randomized Controlled Studies. The Gerontologist, 59(1), e37–e51. https://doi.org/10.1093/geront/gny046
Qin, F., Li, K., & Yan, J. (2020). Understanding user trust in artificial intelligence-based educational systems: Evidence from China. British Journal of Educational Technology, 51(5), 1693–1710. https://doi.org/10.1111/bjet.12994
Robaczewski, A., Bouchard, J., Bouchard, K., & Gaboury, S. (2020). Socially Assistive Robots: The Specific Case of the NAO. International Journal of Social Robotics. https://doi.org/10.1007/s12369-020-00664-7
Robinette, P., Howard, A. M., & Wagner, A. R. (2017). Effect of Robot Performance on Human–Robot Trust in Time-Critical Situations. IEEE Transactions on Human-Machine Systems, 47(4), 425–436. https://doi.org/10.1109/THMS.2017.2648849
Robins, B. (2005). A humanoid robot as assistive technology for encouraging social interaction skills in children with autism [PhD Thesis, University of Hertfordshire]. Hertfordshire, UK. https://doi.org/10.18745/th.14273
Rosenberg-Kima, R., Koren, Y., Yachini, M., & Gordon, G. (2019). Human-Robot-Collaboration (HRC): Social Robots as Teaching Assistants for Training Activities in Small Groups. 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 522–523. https://doi.org/10.1109/HRI.2019.8673103
Rudhru, O., Ser, Q. M., & Sandoval, E. (2016). Robot Maori Haka: Robots as cultural preservationists. 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 569–569. https://doi.org/10.1109/HRI.2016.7451860
Russell, S., & Norvig, P. (2016). Artificial Intelligence: A Modern Approach (3rd ed.). Pearson.
Savela, N., Turja, T., & Oksanen, A. (2018). Social Acceptance of Robots in Different Occupational Fields: A Systematic Literature Review. International Journal of Social Robotics, 10(4), 493–502. https://doi.org/10.1007/s12369-017-0452-5
Schaefer, K., Chen, J., Szalma, J., & Hancock, P. (2016). A Meta-Analysis of Factors Influencing the Development of Trust in Automation: Implications for Understanding Autonomy in Future Systems. Human Factors: The Journal of the Human Factors and Ergonomics Society, 58. https://doi.org/10.1177/0018720816634228
Seldon, A., Metcalf, T., & Abidoye, O. (2020). The Fourth Education Revolution Reconsidered: Will Artificial Intelligence Enrich or Diminish Humanity? (2nd ed.). University of Buckingham Press.
Selwyn, N. (2019). Should Robots Replace Teachers? Polity Press.
Sharkey, A. J. C. (2016). Should we welcome robot teachers? Ethics and Information Technology, 18(4), 283–297. https://doi.org/10.1007/s10676-016-9387-z
Shi, C., Satake, S., Kanda, T., & Ishiguro, H. (2016). How would store managers employ social robots? 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 519–520. https://doi.org/10.1109/HRI.2016.7451835
Shin, D.-H., & Choo, H. (2011). Modeling the acceptance of socially interactive robotics: Social presence in human–robot interaction. Interaction Studies, 12(3), 430–460. https://doi.org/10.1075/is.12.3.04shi
Song, Y. W. (2019). User acceptance of an artificial intelligence (AI) virtual assistant: an extension of the technology acceptance model [Thesis]. https://doi.org/10.26153/tsw/2132
Starkey, L. (2019). Three dimensions of student-centred education: a framework for policy and practice. Critical Studies in Education, 60(3), 375–390. https://doi.org/10.1080/17508487.2017.1281829
Taiuru, K. (2020). Treaty of Waitangi/Te Tiriti and Māori Ethics Guidelines for: AI, Algorithms, Data and IOT. http://www.taiuru.Maori.nz/TiritiEthicalGuide
Tondeur, J., Petko, D., Christensen, R., Drossel, K., Starkey, L., Knezek, G., & Schmidt-Crawford, D. A. (2021). Quality criteria for conceptual technology integration models in education: bridging research and practice. Educational Technology Research and Development. https://doi.org/10.1007/s11423-020-09911-0
Ullman, D., & Malle, B. F. (2017). Human-Robot Trust: Just a Button Press Away. Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, 309–310. https://doi.org/10.1145/3029798.3038423
Underwood, J. (2017). Exploring AI language assistants with primary. EUROCALL 2017 Conference, Southampton, United Kingdom. https://doi.org/10.14705/rpnet.2017.eurocall2017.733
Laura is a PhD student and tech coach from New Zealand with over seven years of experience in the classroom. In 2019, as her Master's thesis, she completed the second-biggest study of voice assistant devices in the classroom. Laura has presented and participated in panels on AI in education and works with teachers one-on-one to get them started with smart tech in their classrooms. She was an ISTE Live 2020 presenter and is a Seesaw Ambassador, Apple Teacher, and Google Innovator (#SYD19). You can find her on Twitter @ElleButlerEDU.