
Effect of Question Placement in Videos on Students’ Understanding of Climate Change

Participate and share: Poster presentation

Mengyu Wang  
Meredith Jacques  
William Kyle  

Do you use videos in class or for flipped instruction but aren’t sure whether students are really learning and retaining the information? We present evidence supporting a style of instruction that increases the effectiveness of videos used synchronously or asynchronously. Our study tested questioning systems within a video learning tool.

Audience: Teachers, Curriculum/district specialists, Library media specialists
Attendee devices: Devices not needed
Topic: Instructional design & delivery
Grade level: 6-12
Subject area: Science, STEM/STEAM
ISTE Standards: For Educators:
Learner
  • Stay current with research that supports improved student learning outcomes, including findings from the learning sciences.
Designer
  • Use technology to create, adapt and personalize learning experiences that foster independent learning and accommodate learner differences and needs.
Analyst
  • Use technology to design and implement a variety of formative and summative assessments that accommodate learner needs, provide timely feedback to students and inform instruction.

Proposal summary

Framework

Online educational videos have become an increasingly important mode of instructional content delivery in K-12 education, especially since the outbreak of the COVID-19 pandemic. Educational psychology research has found that taking a memory test not only assesses what learners know but also strengthens their later retention of that knowledge (Roediger & Karpicke, 2006a, 2006b), a phenomenon known as the "testing effect." Previous studies have shown that testing outperforms rereading the learning materials and even concept mapping (a generative learning method) when the materials are in text format (Karpicke, 2012; Karpicke & Blunt, 2011; Roediger & Karpicke, 2006a, 2006b). Fewer studies, however, have explored the testing effect and question placement on high school students' learning performance and retention when the learning materials are in video format.

This study focuses on two common placements of questions during video learning: questions dispersed throughout the video lecture and questions stacked at the end of it. A common video learning strategy is to have students re-watch the video lecture and take notes; this restudy approach served as the control condition against which the question placement variations were compared. The three conditions (dispersed questions vs. stacked questions vs. restudy) thus contrast the benefits of the two question patterns in video-recorded lectures with restudying.

Sixty students from a Midwest high school in the U.S. were randomly assigned to the three video learning conditions. The video content covered sea level rise, an important topic in environmental science education. After learning from the video lectures, students' performance was assessed with a published instrument on sea level rise (Breslyn et al., 2016).
Two tests were administered to evaluate immediate understanding (a posttest following the video lectures) and long-term retention of knowledge (a one-week delayed test). The findings will help inform future online video-based learning designs, formative assessment approaches, and teaching and learning practices in secondary education.
The hypotheses are:
1. Students answering questions (both the stacked and dispersed question placement groups) will outperform those who re-watch and take notes (the restudy group) on both the posttest and the one-week delayed test.
2. Students answering dispersed questions during the video lectures will perform similarly to students answering stacked questions on both the posttest and the one-week delayed test.

Methods

Participants
Sixty students concurrently enrolled in environmental science classes at a Midwest high school in the U.S. participated in the study. Participants were randomly assigned to three experimental groups by condition: dispersed questions (n = 20), stacked questions (n = 20), and restudy control (n = 20).

Learning Materials
The learning material used in this study is a 13-minute video lecture introducing sea level rise. To test the effect of question placement, the video was divided into eight clips so that questions could be inserted. In total, eight questions were used in both test conditions. These were free-response questions on the content of sea level rise, designed by two environmental science educators; a specialist independently checked their validity. The assessment tool for the posttest and delayed test was created and published by the climate change learning sciences research group at the University of Maryland and can be downloaded from their website (http://www.climateedresearch.org). It consists of 16 multiple-choice questions, half of which were similar to the free-response questions used with the video lecture.

Procedure
All participants were randomly assigned to three groups. In the dispersed questions group, students watched the eight video clips and completed one free-response question after each clip. In the stacked questions group, students watched the undivided video lecture and answered all eight free-recall questions after it. In the control group, students watched the undivided video lecture twice and were prompted to take notes during the second viewing. Immediately after video learning, all students took the posttest; one week later, they completed the delayed test. The time given for video learning was kept equal across the three groups to avoid confounding variables. Learning modules and testing materials were built in Qualtrics and distributed via weblinks so students could complete the study entirely online. The total study time was 70 minutes over two weeks. Since each multiple-choice question had exactly one correct answer, each correct answer was worth one point. A one-way ANOVA tested for significant differences among the three conditions on the posttest and the one-week delayed test, following the research questions and proposed hypotheses.
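The analysis step above can be sketched in code. The following is an illustrative one-way between-subjects ANOVA computed from first principles; the group scores, sample sizes, and function name are all hypothetical and invented for the example — they are not the study's data or analysis code.

```python
# Illustrative sketch only: hypothetical scores, not the study's data.
# A one-way between-subjects ANOVA computed from first principles.

def one_way_anova(*groups):
    """Return (F, df_between, df_within) for a one-way ANOVA."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    group_means = [sum(g) / len(g) for g in groups]
    # Between-groups and within-groups sums of squares
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, group_means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means) for x in g)
    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Hypothetical proportion-correct scores for five students per condition
dispersed = [0.88, 0.81, 0.94, 0.75, 0.88]
stacked = [0.81, 0.88, 0.75, 0.94, 0.81]
restudy = [0.69, 0.75, 0.62, 0.81, 0.69]

f_stat, df1, df2 = one_way_anova(dispersed, stacked, restudy)
print(f"F({df1}, {df2}) = {f_stat:.3f}")
```

In practice an analysis like this would be run with a statistics package that also reports the p-value; the sketch only shows where the F ratio comes from.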

Results

We investigated students' learning performance with an immediate posttest after video learning, and retention of knowledge with a delayed test one week later. Table 1 shows the mean scores and standard deviations for the three conditions on both tests. A one-way ANOVA revealed no significant difference among the three groups on the immediate posttest, F(2, 57) = 1.96, p = .15, partial eta squared = .064, although the dispersed questions group scored slightly higher than the other two. On the delayed test, however, both question groups scored significantly higher than the restudy group: the ANOVA revealed a significant main effect of condition, F(2, 57) = 4.129, p = .02, partial eta squared = .127. A post hoc LSD test indicated that participants in the restudy group (M = .74, SD = .18) scored lower than participants in both the dispersed group (M = .86, SD = .14), p = .016, and the stacked group (M = .86, SD = .11), p = .016. The dispersed group did not produce significantly better delayed-test performance than the stacked group (p = 1.00). We conclude that answering embedded questions during video learning yields better results than restudying and note-taking, especially for retention of knowledge, while the placement of the questions does not affect performance. Overall, students who learned with either dispersed or stacked questions during the video lecture retained more than students who re-watched the lecture and took notes (Figure 1).
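For a one-way between-subjects design, partial eta squared can be recovered directly from the F statistic and its degrees of freedom via the standard identity eta_p^2 = (F * df1) / (F * df1 + df2). This quick sketch (illustrative only, not part of the study's analysis code) checks the effect sizes reported above against the reported F values.

```python
# Sanity check: recover partial eta squared from F and degrees of freedom
# for a one-way between-subjects ANOVA.

def partial_eta_squared(f, df1, df2):
    """eta_p^2 = (F * df1) / (F * df1 + df2)."""
    return (f * df1) / (f * df1 + df2)

# Reported values: posttest F(2, 57) = 1.96, delayed test F(2, 57) = 4.129
print(round(partial_eta_squared(1.96, 2, 57), 3))   # 0.064
print(round(partial_eta_squared(4.129, 2, 57), 3))  # 0.127
```

Both recovered values match the effect sizes reported in the results.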
Table 1. Means of Proportion Correct and Standard Deviations on the Immediate and One-Week Delayed Tests for the Three Groups

Figure 1. Means of Proportion Correct on the Immediate and One-Week Delayed Tests for the Three Groups

(Table 1 and Figure 1 could not be uploaded to this page.)

Importance

Our findings complement and confirm prior studies showing the benefits of the testing effect in video-based learning activities (Johnson & Mayer, 2009; Szpunar et al., 2013; Zu et al., 2019). Answering questions during video learning can be an effective teaching strategy for students learning from assigned educational videos. Our primary interest was determining the most effective question placement during a video lecture, and we found that answering inserted questions while watching and answering questions after watching the entire video provided equivalent benefits on the final test. The video lecture in our study was 13 minutes long, a fairly typical length for educational videos shown during a high school class, yet even over this span it can be difficult for students to maintain focus. Instruction is moving increasingly online, in and out of the classroom, and this study offers an approach that could increase the effectiveness of the video modality of instruction. Many tools, such as Edpuzzle, enable teachers to build video lessons with embedded questions. The findings of this study confirm that high school students' learning from a video is genuinely enhanced by the addition of questions. We recommend providing students with questions either throughout a video or immediately after the whole video to enhance their learning and retention of knowledge.

References

Breslyn, W., McGinnis, J. R., McDonald, R. C., & Hestness, E. (2016). Developing a learning progression for sea level rise, a major impact of climate change. Journal of Research in Science Teaching, 53(10), 1471-1499. https://doi.org/10.1002/tea.21333
Johnson, C. I., & Mayer, R. E. (2009). A testing effect with multimedia learning. Journal of Educational Psychology, 101(3), 621-629. https://doi.org/10.1037/a0015183
Karpicke, J. D. (2012). Retrieval-based learning: Active retrieval promotes meaningful learning. Current Directions in Psychological Science, 21(3), 157-163. https://doi.org/10.1177/0963721412443552
Karpicke, J. D., & Blunt, J. R. (2011). Retrieval practice produces more learning than elaborative studying with concept mapping. Science, 331(6018), 772-775. https://doi.org/10.1126/science.1199327
Roediger, H. L., III, & Karpicke, J. D. (2006a). The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science, 1(3), 181-210. https://doi.org/10.1111/j.1745-6916.2006.00012.x
Roediger, H. L., III, & Karpicke, J. D. (2006b). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17(3), 249-255. https://doi.org/10.1111/j.1467-9280.2006.01693.x
Szpunar, K. K., Khan, N. Y., & Schacter, D. L. (2013). Interpolated memory tests reduce mind wandering and improve learning of online lectures. Proceedings of the National Academy of Sciences, 110(16), 6313-6317. https://doi.org/10.1073/pnas.1221764110
Zu, T., Munsell, J., & Rebello, N. S. (2019). Comparing retrieval-based practice and peer instruction in physics learning. Physical Review Physics Education Research, 15(1), 010105. https://doi.org/10.1103/PhysRevPhysEducRes.15.010105

Presenters

Mengyu Wang, UMSL
Graduate student

My name is Mengyu Wang, and I am a doctoral student at the University of Missouri-St. Louis. After earning my master's degree at Beijing Normal University, I came to the U.S. to pursue my Ph.D. I am interested in environmental science education and am currently working on integrating multimedia technologies into regular high school environmental science classes.

Meredith Jacques, Parkway Schools
