
Storytelling as Design Metaphor for K-12 Augmented Reality (AR) Science Applications

Listen and learn: Research paper
Roundtable presentation

Tuesday, December 1, 12:30–1:15 pm PST
Presentation 3 of 3
Other presentations:
Preservice Teachers and Challenges to Mobile Phone Integration
Analysis of AR Pedagogy: Empowering Students as Content Creators and Curators

Dr. Julie Baca  
Nicholas Harvel  
Vu Tran  
Dr. Daniel Carruth  

This session describes a collaborative research effort to deliver science content, created by actual scientists in real-world settings, to students through storytelling via augmented reality "story" apps. Results of testing the story apps with students in actual classroom settings will be presented.

Audience: Teachers, Professional developers
Attendee devices: Devices not needed
Topic: Augmented, mixed & virtual reality
Grade level: 6-12
Subject area: Science
ISTE Standards: For Students:
Empowered Learner
  • Students use technology to seek feedback that informs and improves their practice and to demonstrate their learning in a variety of ways.
Innovative Designer
  • Students develop, test and refine prototypes as part of a cyclical design process.
Knowledge Constructor
  • Students build knowledge by actively exploring real-world issues and problems, developing ideas and theories and pursuing answers and solutions.
Additional detail: Session recorded for video-on-demand

Proposal summary


In reviewing the literature, we were initially inspired by the results of an HCI investigation conducted in the U.K. [5], which examined the use of both tactile objects and storytelling to more deeply engage learners in a variety of subject areas, including math and science, subjects of specific interest to our goals.

Rooted in Vygotsky’s research on the value of social interaction among learners [6], the research seeks to increase learner engagement by facilitating greater social interaction, specifically through the use of physical objects and storytelling.
The study entailed development and testing of several prototype applications for elementary school students. In each of the prototype "story" apps, the use of a physical object, programmed to the app, was integral to advancing in the story [5]. For example, a geology app used a story in which a girl stumbles onto a mine and begins to explore the underground and all the things that live beneath her, e.g., worms, moles, and "magical" gemstones, through the use of a magnifying glass programmed to the app. Learners shared the magnifying glass to advance in the story and find new items, in the process learning how the glass lens could shift shapes around the room. Another example involved the use of a shared telescope, programmed to the app, for maritime exploration.

The results of the pilot experiments indicated that the tangible media, combined with storytelling, facilitated learner engagement with the educational content, and enhanced social interaction among the learners [5].

Several aspects of the research resonated with our goals to better reach K-12 learners. In particular, the successful use of tangible objects aligned with what we were already observing in demonstration apps we had developed using AR technology with physical objects, such as 3D prints, as key elements in the interaction. The length and intensity of the audience attention, though not formally measured, was clearly heightened.

There were a few subtle but key differences in our goals, however. First, a significant portion of our audience lies in the middle grades and above; thus, our apps would require more sophisticated "stories." Second, our primary goal is to increase engagement and interest, not to provide formal education. Nonetheless, achieving even the goal of engagement with acceptable quality would require an HCI-based user-centered design (UCD) approach [7,8] and, hence, the participation and collaboration of actual learners and teachers.

At this stage in our design process, we reached out to our university partner research center, collocated with a laboratory middle school, to broker a partnership of potentially mutual benefit. Enlisting learners and educators in real-world classroom settings to help design and test the prototype apps would greatly enhance the quality of our demonstrations, while the teachers would have an opportunity, not otherwise afforded, to use and evaluate this technology for its learning potential.


The human factors group within our university partner research center specializes in the use of augmented and virtual reality technologies for human performance and training. This expertise, coupled with their collocation with a K-12 laboratory school, created an ideal environment for collaboration.

Our goal was to identify, design, and test science apps, of mutual interest to the school and our purposes, that would "tell a story," using the tactile nature of AR to increase learner engagement. Doing this properly would require the standard HCI practices of iterative refinement [9] and user-centered design [7,8], with teachers participating fully in the design process.

The rationale and methods for assembling user advisory panels (roughly 2-5 users) are discussed in a variety of sources, including [7-9]. If properly composed, a group of this approximate size can very effectively represent the needs of a larger class of users in a UCD process [9]. For this research, we enlisted a teacher advisory group composed of four science educators: one lead educator with over three decades of high school science classroom experience and three middle school science teachers currently working in the classroom. This group participated in all phases of designing and testing the prototype story apps, representing their instructional needs as well as the educational needs of their students. In the first phase of the research, we spent approximately four months in online and in-person dialogue with this group to fully understand the current curriculum requirements and to identify the specific subject areas and learning objectives to which AR technologies would best be applied.

Three subject areas were selected for story app development. These included genes and genetic mutations (life sciences), waves (physical sciences), and natural hazards (earth and space science).

Genes and Genetic Mutations
For the subject area of genes and genetic mutations, our university partners designed a story app featuring a three-dimensional (3D) representation of a DNA structure. The 3D structure displays the fundamental components of a DNA strand and allows students to explore and "write" their own genetic "story" by modifying, removing, or replacing components of the DNA and observing how the strand changes.
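The interaction model behind these edits can be sketched in a few lines. The following is a minimal illustration, not the app's actual code: a strand is a sequence of bases, the complementary strand follows the standard A-T / C-G pairing rule, and each "mutation" the student performs is an edit to the sequence.

```python
# Sketch of the DNA story app's interaction model (illustrative only):
# a strand is a base sequence; mutations are edits; the complementary
# strand follows Watson-Crick pairing.

COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand: str) -> str:
    """Return the complementary strand under A-T / C-G pairing."""
    return "".join(COMPLEMENT[base] for base in strand)

def substitute(strand: str, position: int, base: str) -> str:
    """Point mutation: replace the base at `position`."""
    return strand[:position] + base + strand[position + 1:]

def delete(strand: str, position: int) -> str:
    """Deletion mutation: remove the base at `position`."""
    return strand[:position] + strand[position + 1:]

def insert(strand: str, position: int, base: str) -> str:
    """Insertion mutation: add a base before `position`."""
    return strand[:position] + base + strand[position:]

strand = "ATGCGT"
print(complement(strand))          # TACGCA
print(substitute(strand, 2, "A"))  # ATACGT
print(delete(strand, 0))           # TGCGT
print(insert(strand, 3, "C"))      # ATGCCGT
```

In the app, each such edit is reflected immediately in the 3D strand, so students see how a single substitution, deletion, or insertion propagates through the structure.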

Waves
For the topic of waves, we targeted a popular app among school audiences in our center that demonstrates research examining the impact of waves on high-speed planing boats. The goal of that research is to better understand and minimize the loads placed on the humans in the boat while maintaining the performance and structural integrity of the boat. The app "story" features the experience of the boat crew members as central characters and shows how they are impacted as the wave properties are adjusted.

Natural Hazards
For the subject of natural hazards, we designed an app that features a representation of a small coastal town and simulates the impact of hurricane force winds, storm surge, and tornadoes on the buildings in the town. This app supports multiple students working together to protect their town using various damage mitigation measures (e.g., elevated homes, improved ditch and drainage systems, jetties, etc.) [10]. The students can then subject their town to selected weather events (tornado F0 to F5 or hurricane category 1-5). Students are given real-time influence over the weather event and can directly observe how natural hazards affect the landscape and structures of their town and how mitigation measures can reduce the impact of storms.
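The core loop students experience can be sketched abstractly: an event's severity sets a base level of damage, and each mitigation measure applied to the town scales that damage down. The numbers below are hypothetical illustration only, not the app's actual model.

```python
# Illustrative sketch (hypothetical factors, not the app's real model) of
# how mitigation measures could reduce damage from a severity-rated event
# (tornado F0-F5 or hurricane category 1-5).

MITIGATION_FACTORS = {
    "elevated_homes": 0.7,  # assumed reduction factors for illustration
    "drainage": 0.8,
    "jetties": 0.85,
}

def damage(severity: int, mitigations: list[str]) -> float:
    """Base damage grows with severity (0-5); each mitigation scales it down."""
    base = severity / 5.0  # normalize severity to 0..1
    for measure in mitigations:
        base *= MITIGATION_FACTORS[measure]
    return round(base, 3)

print(damage(5, []))                              # 1.0 (worst case, no protection)
print(damage(5, ["elevated_homes", "drainage"]))  # 0.56
print(damage(3, ["jetties"]))                     # 0.51
```

Letting students adjust both inputs, the event severity and the set of measures, is what gives the app its cause-and-effect "story."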

Learning Centers
The apps were designed to be integrated into learning centers that support students working by themselves or in small groups to experience and interact with demonstrations of concepts presented in class. Each app presents a distinct experience that launches the student into the “story” with minimal to no interaction with traditional user interface elements (e.g., menus) to maximize a sense of realism and engagement.


The DNA app completed multiple iterative cycles of design, implementation, and review by the user advisory group. Improvements to the DNA app included the addition of informational labels, improved user interfaces, and extended interactivity with DNA elements. The DNA app was used by teachers in classroom learning centers during the second nine weeks of the school year and again in the final nine weeks to review DNA-related material.

Three versions of the wave app were implemented so that one app could be used during the second nine weeks of instruction, when wave concepts were introduced to the students, and a second when the classroom reviewed wave concepts later in the year. The first version was created in CoSpaces. The CoSpaces wave app demonstrated wave actions by passing colliders through a collection of spheres; students were given control of the speed, size, and distance between colliders. The second version was created in Unity and used the ObiFluid fluid simulation plugin, which simulates fluid movement using collections of particles. The third version also used Unity and ObiFluid, but this version simulated a water surface with waves of different amplitude, speed, and period.
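The parameters exposed in the third version, amplitude, speed, and period, are exactly those of a traveling wave. As a hedged sketch (an assumed sinusoidal form, not the ObiFluid implementation), the surface height at any point and time can be written as:

```python
# Sketch of a traveling-wave water surface with the three adjustable
# parameters named in the app: amplitude, speed, and period.
# Assumed sinusoidal form, for illustration only.

import math

def surface_height(x: float, t: float, amplitude: float,
                   speed: float, period: float) -> float:
    """y(x, t) = A * sin((2*pi/period) * (t - x/speed))."""
    return amplitude * math.sin(2 * math.pi / period * (t - x / speed))

A, c, T = 0.5, 2.0, 4.0   # amplitude (m), speed (m/s), period (s)
wavelength = c * T        # wavelength follows from speed and period: 8.0 m
print(surface_height(0.0, T / 4, A, c, T))  # 0.5 -> a crest: full amplitude
```

Changing any one of the three parameters changes the surface students see, which is the behavior the app lets them explore interactively.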

The natural hazards app is undergoing multiple iterations of design, implementation, and review. The initial iteration consisted of a description of the concept and simple demonstrations using ObiFluid to crudely simulate wind and water surge damage to objects. The final version of the natural hazards app is incorporating feedback from our user advisory group to create an exciting and engaging interactive educational experience that could be shared with multiple students.

All of the story apps are being incorporated into the student learning centers over the course of the academic year. Results of the students' interactions with the apps will be discussed in the final paper and presentations.


This research contributes to the study of how to incorporate real-world science content, developed by actual scientists in research laboratories, into the K-12 educational setting in a way that maximally engages students and enhances learning. It also contributes to a growing body of research on the use of storytelling as a design metaphor in human-computer interaction through augmented reality for educational purposes.



1. Wong, C., Bergeron, D.: 30 years of multidimensional multivariate visualization. In: Nielsen, G., Hagen, H., Muller, H. (eds.) Scientific Visualization. IEEE Computer Society, Los Alamitos, CA, USA (1997).
2. Rheuter, L., Tukey, P., Maloney, L.T., Pani, J.R., Smith, S.: Human perception and visualization. In: Proceedings of the 1st Conference on Visualization (Vis '90), pp. 401-406. IEEE Computer Society Press, San Francisco, CA, USA (1990).
3. van Wijk, J.: The value of visualization. In: Proc. IEEE Visualization, pp. 79-86. IEEE Computer Society, Los Alamitos (2005).
4. Isenberg, T., Isenberg, P., Chen, J., Sedlmair, M., Moller, T.: A systematic review on the practice of evaluating visualization. IEEE Transactions on Visualization and Computer Graphics 19(12), 2818-2827 (2013).
5. Glowacki, B.: Mixed play spaces: Augmenting digital storytelling with tactile objects. Interactions, 2 (February 2018), 58-63.
6. Vygotsky, L.: Mind in Society: The Development of Higher Psychological Processes. Harvard University Press, Cambridge, MA (1978).
7. Norman, D.A., Draper, S.W.: User Centered System Design: New Perspectives on Human-Computer Interaction. Erlbaum, Hillsdale, NJ (1986).
8. Abras, C., Maloney-Krichmar, D., Preece, J.: User-centered design. In: Bainbridge, W. (ed.) Encyclopedia of Human-Computer Interaction. Sage Publications, Thousand Oaks (2004).
9. Preece, J., Rogers, Y., Sharp, H.: The process of interaction design. In: Interaction Design: Beyond Human-Computer Interaction. John Wiley & Sons, New York, NY (2002).
10. Federal Emergency Management Agency: Hurricane Mitigation: A Handbook for Public Facilities (2005).



Dr. Julie Baca, U.S. Army Corps of Engineers

Dr. Baca conducts research in human-computer interaction with a focus on scientific visualization and virtual and augmented reality interfaces. She has over two decades of experience in academic and government research laboratories, designing, developing and evaluating the effectiveness of human-computer interfaces. She regularly partners with human factors researchers in collaborations with domain experts, such as the educators in this study, for the purpose of designing more engaging user experiences.

Vu Tran, U.S. Army Corps of Engineers
Dr. Daniel Carruth, Mississippi State University
