Building Evidence for Virtual and Hybrid Learning Models: A Mixed-Methods Study

Lecture presentation
Blended Content

Session description

Emergency remote learning during the COVID-19 pandemic is often blamed for “learning loss,” but what if virtual and hybrid learning (VHL) models could accelerate learning? This study explores high-quality VHL models to uncover their effectiveness, conditions, and impact on learning acceleration for diverse populations in varied contexts.

Framework

Effective engagement of the instructional core (Elmore, 2008) drives the quality of any learning experience, whether in person or online. With VHL models, this concept includes technology that supports students’ interaction with learning content, objectives, and tasks; pedagogy that encourages active engagement with rigorous content in support of mastery; and relationships (both to others and to the task) that strengthen commitment to and motivation for learning (Rabbitt, 2020). While these three factors of technology, pedagogy, and relationships work interdependently, they also build on each other serially. For example, technology supports effective engagement during instruction when it is designed according to sound pedagogical principles.

At the core of this framework is the need for students to possess a solid foundation to learn independently (Johnson & Galy, 2013; Shyu & Brown, 1992; Stansfield, McLellan, & Connolly, 2004) and to exert self-direction (Stansfield, McLellan, & Connolly, 2004). This involves both helping students build core skills and supporting their families and communities. As such, high-quality VHL approaches prioritize building these skills and relationships (Rabbitt, 2020).

Methods

This research paper reports on a two-phase study. The methodology is as follows:

Phase 1: Landscape Scan Methodology
The first phase was designed to establish an understanding of the landscape of VHL models as well as to identify the sample for phase two. First, we sourced the names of fully virtual and hybrid models from past work, the 2024 National Educational Technology Plan (NETP), and the Virtual Learning Leadership Alliance (VLLA). Then, we categorized these models based on where they operated (at the state, system, or school level), their intention (i.e., fully virtual, hybrid, or part-time course access), the types of available instruction (e.g., core content areas, advanced placement, personalized learning), and evidence of student success (e.g., state benchmark data, grades, attendance). Next, we conducted unstructured interviews with a purposive sample of nine leaders to better understand their available data, the students they serve, and the practices they associate with their success, followed by seven structured interviews to narrow the sample. Finally, the research team engaged in collaborative workshops with leaders from six virtual or hybrid schools to co-design evaluation studies for the 2024-25 school year and build out detailed theory of change models.
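As a purely illustrative sketch of how the landscape scan categories above might be tabulated, the snippet below cross-tabulates a handful of made-up model records by operating level and intention. The column names and sample rows are assumptions for demonstration only, not the study's actual dataset.

```python
# Hypothetical sketch: tabulating landscape-scan records by level and intention.
# Column names and sample rows are illustrative placeholders, not study data.
import pandas as pd

models = pd.DataFrame(
    [
        {"name": "Model A", "level": "state", "intention": "course access"},
        {"name": "Model B", "level": "system", "intention": "fully virtual"},
        {"name": "Model C", "level": "school", "intention": "hybrid"},
        {"name": "Model D", "level": "system", "intention": "hybrid"},
    ]
)

# Cross-tabulate operating level against intention, mirroring the
# state/system/school and virtual/hybrid/course-access categories described above.
summary = pd.crosstab(models["level"], models["intention"])
print(summary)
```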

Qualitative Coding
To analyze the data from the individual blueprint sessions, a holistic coding process (Miles, Huberman, & Saldaña, 2018) was used to capture and document categories and themes. The initial analysis identified emergent codes and was followed by two rounds of analysis that applied two levels of hypothesized deductive codes aligned to existing frameworks (Rabbitt, 2020; The Learning Accelerator, 2021). In each round, additional emergent themes were noted, codes were added as needed, and various codes were refined or removed. Once each layer of codes was identified, a matrix database was developed (Appendix C).
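As a minimal sketch of how a matrix database like the one referenced above might be assembled, the snippet below pivots coded excerpts into a code-by-site count matrix. The site names, codes, and data structure are hypothetical illustrations rather than the study's actual codebook or data.

```python
# Hypothetical sketch: turning coded excerpts into a code-by-site matrix.
# Site names and codes are placeholders, not the study's codebook.
import pandas as pd

coded_excerpts = pd.DataFrame(
    [
        ("Site 1", "relationships"),
        ("Site 1", "pedagogy"),
        ("Site 2", "technology"),
        ("Site 2", "relationships"),
        ("Site 3", "self-direction"),
    ],
    columns=["site", "code"],
)

# Count how often each code was applied at each site, then pivot so that
# rows are sites and columns are codes (0 = code not observed at that site).
matrix = (
    coded_excerpts.groupby(["site", "code"])
    .size()
    .unstack(fill_value=0)
)
print(matrix)
```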

Theory of Change Development
Theory-based evaluation approaches move beyond qualitative analyses to answer causal questions about how, why, and under what conditions programs have contributed to change. These methods are particularly useful for opening the “black box” of how and why a program works within complex systems (Koleros & Mayne, 2019; Leviton & Lipsey, 2007) such as virtual and hybrid schools. To explore cause-and-effect relationships, this growing suite of methodologies (e.g., contribution analysis, process tracing, outcome harvesting) relies on a theory of change (ToC) as the basis against which to assess evidence and make credible, causal claims about the contribution a strategy makes to observed changes (Stern et al., 2012).

During the first phase, the research team engaged in a series of workshops with each of the identified sites to develop unique ToCs based on the conceptual framework. After completing these models, we identified the causal mechanisms that will be examined in phase two to answer the research questions.

Plans for Phase 2
During the 2024-25 school year, the research team will conduct a mixed-methods outcome evaluation with a nested causal pathways analysis in the context of the six virtual and hybrid schools. The outcome evaluation will include pre- and post-surveys (Appendix D) with a sample of teachers and students from within each site as well as qualitative data collected during virtual “learning loops” on two occasions. The two virtual sessions will bring together the leaders and some teachers from each of the sites to share insights and data.
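To make the pre/post survey component concrete, here is a minimal sketch of one way matched survey responses could be compared. The scores are fabricated placeholders, and the paired t-test is only one plausible analysis, offered as an assumption rather than the team's actual analysis plan.

```python
# Hypothetical sketch: comparing matched pre/post survey scores for one site.
# Scores are fabricated placeholders on a 1-5 scale, not collected data.
from scipy import stats

pre_scores = [3.2, 2.8, 3.5, 3.0, 2.6, 3.8, 3.1]
post_scores = [3.6, 3.1, 3.9, 3.4, 2.9, 4.0, 3.5]

# Paired t-test: each respondent's post score is compared with their own pre score.
result = stats.ttest_rel(pre_scores, post_scores)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```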

The causal pathways analysis will be nested within the outcome evaluation. A sample of administrators, students, teachers, family members, and community partners from four of the six sites will be interviewed, where relevant, to gain a deeper understanding of the context. We will use a realist interview protocol, which will allow us to draw causal inferences about each program and provide detailed, contextual information about participants’ experiences; perceptions of how the model accelerates student learning; and interpretations of how and why certain elements of the program support student success. The interviews will be conducted virtually and last approximately 60 minutes.

Results

Findings
Of the 64 models in the landscape scan, 21 operated as state-based, 34 as system-based, and nine as standalone schools (Figure 2). Each was intentionally designed to meet the needs of specific student groups. All of the state-based models were fully virtual, and 19 of the 21 provided individual course access. In contrast, system- and school-based models offered both fully virtual and hybrid learning opportunities, although fewer allowed students to enroll solely for course access. Of note, many of those that offered course access allowed students from neighboring districts to enroll (Table 1).

Instructional practices varied across settings. State-based models offered higher percentages of Career & College Prep and Dual Credit & Enrollment. System- and school-based models all offered Instruction in Core Subjects as well as Personalized Instruction; those that were primarily fully virtual provided consistent instruction in core subjects such as ELA and math (Table 2). Over 70% of the hybrid models incorporated Blended Learning, and 60% included Career Technical Education (CTE).

The models also used different forms of evidence. Course access models did not administer benchmark proficiency assessments or participate in state testing; instead, they reported metrics such as Course Enrollment. Many system- and school-based models leveraged Graduation Rates, Students Reached, and Growth on Standardized Assessments (Table 3). However, interviews with school leaders revealed an important caveat: many of these models seek to serve students who have been "failed by the system." Therefore, focusing solely on proficiency scores does not provide a complete picture of effectiveness.

Finally, many of these models offered dual enrollment, CTE opportunities, project-based learning, and personalized or competency-based learning experiences. Because they leveraged non-traditional metrics, their evidence varied widely and did not always conform to conventional measures. Interviews also revealed that while elementary and middle schools used formative assessment systems, most high schools relied on standardized measures such as SAT/ACT scores or state testing to evaluate proficiency (16 of the 64 models reported using SAT/ACT scores). This diversity in assessment practices highlights the need for a nuanced, context-specific approach to understanding "evidence of success" in VHL environments.

Thematic analysis of the six selected sites’ workshop data revealed that while their VHL models varied in approach, they were all grounded in whole child, personalized, and mastery-based learning. Further, every site had evidence of practices that support effective VHL, including relationships, pedagogy, technology, and foundations that support self-directed learning (Rabbitt, 2020). During the 2024-25 school year, we will study how their individual models and practices influence changes in learners and the subsequent outcomes that document learning acceleration.

Expectations
The second phase of this study will examine both traditional and nontraditional outcomes achieved by each of the sites as well as the causal mechanisms that create pathways to learning acceleration for diverse learners. It is expected that findings from each site and across sites will provide a robust evidence base as to how learning acceleration happens in high-quality VHL models, for whom, and under what conditions.

Importance

Coming out of the pandemic, questions emerged about the quality of virtual learning, challenging its efficacy. However, countless examples, both empirical and anecdotal, indicate that students need virtual options. Many students whom brick-and-mortar schooling has failed as learners thrived during the pandemic. VHL models provide a valuable solution, ensuring that students engage in their education regardless of limitations or the availability of specific opportunities in their local setting. The challenge is how to help the sector learn about successful, high-quality VHL models so that more students may benefit.

This multi-phase, mixed-methods study intends to challenge the current narrative and produce a rigorous evidence base. The first phase identified the core tenets of successful VHL models. The second will deepen our understanding of what works, for whom, and under which conditions.

References

Creswell, J. W., & Clark, V. L. P. (2017). Designing and conducting mixed methods research. Sage publications.

Dorn, E., Hancock, B., Sarakatsannis, J., & Viruleg, E. (2020). COVID-19 and student learning in the United States: The hurt could last a lifetime. McKinsey & Company. https://www.mckinsey.com/industries/education/our-insights/covid-19-and-student-learning-in-the-united-states-the-hurt-could-last-a-lifetime

Elmore, R. F. (2008). Improving the instructional core. Harvard University. https://www.hepg.org/her-home/issues/harvard-educational-review-volume-78-issue-1/herarticle/_1236

Kazu, I. Y., & Yalçın, E. (2022). Virtual learning and its impact on student achievement: A meta-analysis. Journal of Educational Technology, 55(4), 523-538. https://doi.org/10.1016/j.jedutech.2022.05.006

Koleros, A., & Mayne, J. (2019). Theory-based evaluation and its use in practice. Evaluation, 25(3), 252-269. https://doi.org/10.1177/1356389019875550

Kuhfeld, M., Tarasawa, B., Johnson, A., & Lewis, K. (2020). Learning during COVID-19: Initial findings on students' reading and math achievement and growth. NWEA. https://www.nwea.org/research/publication/learning-during-covid-19-initial-findings-on-students-reading-and-math-achievement-and-growth/

Leviton, L. C., & Lipsey, M. W. (2007). Theory as method: Small theories of treatments. In S. I. Donaldson, M. Scriven, & C. Christie (Eds.), Evaluating social programs and problems: Visions for the new millennium (pp. 55-71). Lawrence Erlbaum Associates.

Li, X., & Wang, Y. (2021). The effectiveness of online learning: A meta-analysis of empirical studies. Educational Research Review, 31, 100360. https://doi.org/10.1016/j.edurev.2020.100360

Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2013). The effectiveness of online and blended learning: A meta-analysis of the empirical literature. Teachers College Record, 115(3), 1-47. https://doi.org/10.1177/016146811311500303

Miles, M. B., Huberman, A. M., & Saldaña, J. (2018). Qualitative data analysis: A methods sourcebook. Sage publications.

Molnar, A., Hu, B., Miron, G., & Elgeberi, N. (2020). Virtual schools in the U.S. 2020: Research evidence on the performance of virtual and blended learning programs. National Education Policy Center. https://nepc.colorado.edu/publication/virtual-schools-annual-2020

NWEA. (2024). Recovery still elusive: 2023-24 student achievement highlights persistent achievement gaps and a long road ahead. NWEA. https://www.nwea.org/uploads/recovery-still-illlusive-2023-24-student-achievement-highlights-persistent-achievement-gaps-and-a-long-road-ahead_NWEA_researchBrief.pdf

Rabbitt, B. (2020). Driving quality in virtual and remote learning: A framework for research-informed virtual and remote experiences for K-12 learners. Portland, ME: The Learning Accelerator.

Stern, E., Stame, N., Mayne, J., Forss, K., Davies, R., & Dart, J. (2012). Broadening the range of designs and methods for impact evaluations. Department for International Development. https://www.gov.uk/government/publications/broadening-the-range-of-designs-and-methods-for-impact-evaluations

The Learning Accelerator. (2021). What can teaching and learning practice look like? https://practices.learningaccelerator.org/learn/what-can-teaching-learning-practice-look-like

Ulum, Ö. G. (2022). The impact of hybrid learning on student outcomes: A comprehensive review and meta-analysis. Educational Research Review, 34, 100431. https://doi.org/10.1016/j.edurev.2022.100431

Presenters

Partner, Research, Measurement, & Policy
The Learning Accelerator
Co-author: Beth Holland

Session specifications

Topic:

Virtual and Blended Learning

TLP:

Yes

Grade level:

PK-12

Audience:

District Level Leadership, Government/Non-profit, School Level Leadership

Attendee devices:

Devices not needed

ISTE Standards:

For Education Leaders:
Visionary Planner
  • Include a wide range of perspectives from the community to develop and sustain a vision for using technology to advance student learning and success.
  • Share lessons learned, best practices, challenges and the impact of learning with technology with other education leaders who want to learn from this work.

TLPs:

Connect learning to learner, Ignite Agency