
LEARN how evidence informs school and district procurement practices

Colorado Convention Center, 108/10/12

Lecture presentation
Listen and learn: Research paper

Research papers are a pairing of two 18-minute presentations followed by 18 minutes of discussion led by a discussant, with the remaining time for Q&A.
This is presentation 2 of 2; scroll down to see more details.


Presenters

Jessica Mislevy
Sr. Principal Education Researcher
SRI International
Jessica Mislevy, Ph.D., directs SRI Education’s Digital Learning & Technology Policy program, in which she studies highly innovative teaching and learning approaches that use advanced technology in the STEM disciplines. With a background in measurement and statistics, she leads mixed-methods research, evaluation, and technical assistance projects to improve college readiness and completion through the effective adoption of evidence-based practices and products. Mislevy has led/co-led large-scale impact, implementation, and cost evaluations of digital learning products and initiatives. She also provides coaching and consultation to program developers and education stakeholders around rigorous evaluation design and evidence standards.
Ela Joshi
Senior Education Researcher
SRI International
@ela_joshi
Ela Joshi, PhD, specializes in mixed-methods studies that aim to understand, examine, and improve how people interact with and within schools, school districts, and other education organizations. Joshi has a decade of experience conducting studies that examine how students and teachers experience program implementation, how social networks and networked improvement communities bring education innovations to scale, and how innovative programs and policies impact students from historically underserved groups. Her research experience includes evaluations of programs that enhance teacher effectiveness and leadership, as well as school efforts to improve equity and access for students from socially significant populations.
Research Associate
SRI International
Erin is a research associate and the project manager for the LEARN Network. She supports research activities and brings experience with qualitative data collection and policy writing.
Education Research Associate
SRI International

Session description

The LEARN Network aims to promote learning growth by increasing the use of evidence-based products. Drawing on interviews and nationally representative surveys, learn how schools and districts procure ed-tech products, how evidence is used in these decisions, and ways to help educators find products aligned with students' needs and contexts.

Framework

The current research was informed by Morrison and colleagues’ (2019) Procurement "Action-Point" Framework. This framework has roots in the Technology Acceptance Model (TAM; Davis, 1993) and Rogers’ (2003) Diffusion of Innovations theory. It identifies the key action points in typical school district procurement processes and the key procurement needs that arise along the pathway from the allotment of funding to the acquisition of selected products. Our research was also informed by Farley-Ripple and colleagues’ (2018) depth-of-research-use framing, which includes six dimensions of practice that the literature on organizational and evidence-based decisionmaking suggests are important for the meaningful and systematic use of evidence likely to generate improvements in policy and practice. Finally, our research drew upon the conceptual framework developed by Penuel and colleagues (2017) in their study of how school and district leaders access, perceive, and use research. This framework includes characteristics and organizational conditions that correlate with educational leaders’ use of research.


Methods

The LEARN Network is conducting a descriptive and correlational study characterizing educators’ procurement practices and the use of evidence therein, changes in procurement due to COVID-19, challenges faced, and desired supports. We use a mixed-methods approach that includes interviews, focus groups, and surveys to obtain perspectives from K-12 education stakeholders in diverse educational contexts.
Interviews. We conducted interviews with a broad array of education stakeholders to explore procurement experiences, perceptions, barriers, and facilitators in depth and to inform the development of the surveys. To date, we have interviewed 28 education leaders representing priority stakeholder roles at the school, district, and state levels, serving students from contexts and populations that have been historically underserved and/or whose academic outcomes have been disproportionately affected by COVID-19. To compose the interview sample, we solicited nominations from our education advisors and purposively selected interviewees from a stratified list of prospective participants representing priority stakeholder roles, institution types, and contexts, emphasizing those serving student populations disproportionately affected by COVID-19.

We developed an initial draft of the interview protocol covering topics such as procurement processes and experiences (perceptions of the process, stakeholders involved, if and how evidence is used, etc.); barriers and challenges related to procurement; and tools or resources that may be helpful in identifying needs-aligned, evidence-based products. Protocols were semi-structured, with interviews designed to last 45–60 minutes. We asked our educator advisors to review the draft protocol and revised it in response to their feedback.

In the first phase of the analysis, researchers documented themes from each interview using a structured post-interview form designed to capture key learnings to inform survey development. In addition, the research team engaged in ongoing debriefing conversations to help surface and synthesize themes across the interviews. In the second phase of the analysis, which is currently underway, researchers developed an initial coding framework based on the question topics in the interview protocol, the research questions, a review of the literature, and preliminary themes from the phase one analysis. The development of the final framework was iterative: the researchers piloted the framework using interview data and conferred frequently on the coding scheme, the meaning of the codes, and the constructs captured. Four members of the research team coded interviews using the final framework. To ensure inter-rater reliability and consistency in code application, all qualitative team members were trained on the coding scheme to establish coding norms and participated in inter-rater reliability checks. Specifically, we used the inter-rater reliability test feature of the qualitative coding software Dedoose to calculate Cohen’s kappa values. Kappa values among the team fell within acceptable bounds, ranging from 0.79 to 0.98 (Cicchetti, 1994; Landis & Koch, 1977).
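The study ran this check inside Dedoose; purely as an illustration of the statistic itself, the sketch below computes Cohen's kappa with scikit-learn, assuming two coders' decisions on a shared set of excerpts have been exported as parallel label lists. The code labels are hypothetical, not the study's actual coding scheme.

```python
# Illustrative inter-rater reliability check (not the study's Dedoose workflow).
# Assumes two coders' code assignments for the same excerpts, exported as
# parallel lists; the labels below are hypothetical.
from sklearn.metrics import cohen_kappa_score

coder_a = ["needs", "discovery", "decision", "needs", "implementation", "barriers"]
coder_b = ["needs", "discovery", "decision", "barriers", "implementation", "barriers"]

kappa = cohen_kappa_score(coder_a, coder_b)  # chance-corrected agreement
print(f"Cohen's kappa: {kappa:.2f}")
# Landis & Koch (1977) read 0.61-0.80 as substantial and >0.80 as near-perfect,
# which is how the team's reported 0.79-0.98 range qualifies as acceptable.
```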

The research team is preparing analytic memos synthesizing themes within each code. Researchers will read all excerpts within a given code, examining patterns within and across levels, institutions, roles, and product types; memos will synthesize and summarize these themes. Researchers will then read systematically across memos to identify cross-cutting findings and themes.

Focus groups. To ensure that communities most directly affected by evidence-based products (EBPs) have a voice, we conducted two teacher focus groups with a total of nine teachers, representing a range of grade levels and contexts, and three parent focus groups with a total of 10 parents/caregivers. We engaged our education advisors to support recruitment, following criteria similar to those outlined above for the interviews. Focus groups explored whether and in what ways these stakeholders have a voice in EBP selection, their satisfaction with current procurement processes, and their perceptions of the performance, reliability, convenience, and cost (PRCC) and cultural relevance of products currently used in their school/district. Protocols were semi-structured, with focus groups designed to last 45–60 minutes. We asked our educator advisors to review the draft protocol and revised it in response to their feedback. Researchers used the same process as for the interviews to code and analyze the focus group transcripts and report on key themes.

Surveys. We also conducted surveys to confirm or triangulate interview findings and produce nationally representative estimates for the key education stakeholder groups of school and district leaders. We utilized RAND’s American Educator Panels (AEP) to field surveys to nationally representative samples of K–12 public school principals and school district and charter management organization (CMO) leaders. The AEP samples are scientifically drawn, probability-based samples that can be weighted to produce nationally representative estimates (Robbins & Grant, 2020).

The survey asked about the types of ed-tech and other products schools and districts are buying; how schools and districts learned about, evaluated, and acquired the products; the extent to which research and evidence are used in the procurement process; and tools that would be helpful for identifying and procuring products with a rich evidence base. Because one goal is to test the extent to which prior research on K-12 ed-tech procurement and the use of research and evidence in decisionmaking holds across contexts, roles, and the post-pandemic period, we repeated several key constructs from Morrison et al. (2019) and Penuel et al. (2017). These scales were developed and validated by Morrison et al. (2019) or Penuel et al. (2017) with samples overlapping our target population and were shown to demonstrate adequate reliability. The survey consisted primarily of fixed-choice items with a small number of open-ended items (estimated completion time: 10–15 minutes). We prepared an initial draft of the survey, shared it with advisors for review, and made revisions in response to their feedback.

We used RAND’s AEP to administer the school leader and district leader surveys online during spring 2023. Panelists received an email invitation, reminders, and a final notice. School leader respondents received gift cards for completing the survey (district leaders were not offered gift cards, as district policy typically prohibits them from accepting such incentives). A total of 1,036 of the 3,334 school leaders sampled from the American School Leader Panel (ASLP) responded to the survey, for a completion rate of 31.6%. A total of 1,107 districts enrolled in the American School District Panel (ASDP) participated in the survey, for a completion rate of 20.1%. RAND created weights that account for the probability of selection and the probability of response in order to produce estimates that reflect the national populations of public school leaders and school districts, respectively.
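RAND's actual weighting methodology is documented in Robbins & Grant (2020); the following is only a minimal sketch of the general design-weight logic described above, using made-up probabilities: each respondent's weight is the inverse of the product of their selection probability and their estimated response probability, so respondents from harder-to-reach groups count more.

```python
# Minimal sketch of design weighting (illustrative, not RAND's AEP procedure).
# p_selection and p_response are hypothetical known/modeled probabilities.
import pandas as pd

respondents = pd.DataFrame({
    "leader_id": [101, 102, 103],
    "p_selection": [0.02, 0.05, 0.01],  # probability of being sampled
    "p_response": [0.35, 0.25, 0.40],   # modeled probability of responding
})

# Inverse-probability weight: rarer selection or response -> larger weight
respondents["weight"] = 1 / (respondents["p_selection"] * respondents["p_response"])
print(respondents[["leader_id", "weight"]])
```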

We are currently analyzing the survey data. We are first preparing descriptive statistics (e.g., frequencies, means, and standard deviations) to summarize survey responses, applying weights and adjusting standard errors as appropriate for the AEP data in accordance with Robbins & Grant (2020). For previously validated survey scales, we will first calculate scale scores as directed by the original authors (e.g., creating a simple index that sums responses across the items associated with each scale). We will present means, standard deviations, and a distribution of responses for each scale and examine scale reliability with our sample using Cronbach’s alpha (Cronbach, 1951).
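As a minimal sketch of this step, assume hypothetical scale items q1–q3 and a precomputed weight column; the scale score is a simple sum across items, the mean is weight-adjusted, and Cronbach's alpha follows its standard formula. A production analysis would also adjust standard errors for the AEP design per Robbins & Grant (2020).

```python
# Weighted descriptives and Cronbach's alpha for a hypothetical 3-item scale.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "q1": [3, 4, 2, 3, 1], "q2": [3, 4, 1, 3, 2], "q3": [4, 4, 2, 2, 1],
    "weight": [1.2, 0.8, 1.5, 1.0, 0.9],  # survey weights (made up)
})
items = df[["q1", "q2", "q3"]]

# Scale score: simple sum of responses across the scale's items
df["scale"] = items.sum(axis=1)

# Weight-adjusted mean of the scale score
weighted_mean = np.average(df["scale"], weights=df["weight"])

# Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances / variance of total)
k = items.shape[1]
alpha = (k / (k - 1)) * (1 - items.var(ddof=1).sum() / df["scale"].var(ddof=1))

print(f"weighted mean = {weighted_mean:.2f}, alpha = {alpha:.2f}")
```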
To examine differences between stakeholder groups, we will conduct analysis of variance (ANOVA) omnibus tests on the survey responses, following up significant differences with post hoc tests using a Benjamini-Hochberg adjustment for multiple comparisons (Benjamini & Hochberg, 1995). Subgroups may include district type (traditional vs. CMO), locale (urban, suburban, rural), poverty (based on free or reduced-price lunch [FRPL] eligibility), and student race/ethnicity (majority white vs. majority students of color).
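A sketch of that comparison under simplifying assumptions (hypothetical, unweighted data; pairwise t-tests standing in for whatever post hoc test is ultimately used): an omnibus one-way ANOVA across locale subgroups, with Benjamini-Hochberg-adjusted pairwise follow-ups.

```python
# Omnibus ANOVA across locale subgroups, then BH-adjusted post hoc tests.
# Data are hypothetical scale scores, unweighted for simplicity.
from itertools import combinations
from scipy import stats
from statsmodels.stats.multitest import multipletests

groups = {
    "urban": [3.1, 2.8, 3.5, 3.0],
    "suburban": [3.4, 3.6, 3.2, 3.8],
    "rural": [2.5, 2.9, 2.4, 2.7],
}

# Omnibus test: any difference among the three locales?
f_stat, p_omnibus = stats.f_oneway(*groups.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_omnibus:.3f}")

# Pairwise post hoc tests with a Benjamini-Hochberg FDR adjustment
pairs = list(combinations(groups, 2))
p_raw = [stats.ttest_ind(groups[a], groups[b]).pvalue for a, b in pairs]
reject, p_adj, _, _ = multipletests(p_raw, alpha=0.05, method="fdr_bh")
for (a, b), p, r in zip(pairs, p_adj, reject):
    print(f"{a} vs. {b}: adjusted p = {p:.3f}, reject H0 = {r}")
```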


Results

As previously described, our data collection efforts are complete, and analyses are underway. We have completed the first phase of qualitative interview/focus group analysis and expect to complete the second phase of the qualitative analysis and the quantitative survey analyses in fall 2023.

Several key themes have begun to emerge from our initial interview and focus group analysis:

1. Needs assessment. Needs assessment can be triggered by a variety of situations, including the expiration of a contract, a set timeline for periodic review, or a policy shift such as a change in state standards. Many respondents mentioned looking at student data to identify need.

2. Discovery and evaluation. Respondents described a variety of methods for discovering new products, including requests for proposals, state- or district-approved lists, conferences, and word of mouth. In terms of evaluation, respondents reported that it is critical for studies to be relevant to their district or school’s context; in other words, the setting of a study helps them determine whether the findings apply to their own setting.

3. Decisionmaking. Core curricular products are often selected by a committee that includes a select number of teachers and district leaders. Large purchases may also require the approval of the superintendent or school board or both. Decisions on supplemental curriculum are often made at the school level by teachers or principals, although the district may have resources to support these decisions, such as a list of pre-approved or pre-purchased products. In our parent focus groups, we learned that parents are interested in being more involved in procurement decisions, but ultimately believe that teachers should have the most say in which products get used in their classrooms. In our teacher focus groups, we learned that teachers would like to have more voice in selecting products, but we also heard that teachers are stretched too thin. This means that even in places where teachers are invited to participate in procurement decisions, they may not have the capacity to do so.

4. Product implementation. Participants shared strong preferences for products that do not have onerous training requirements. In fact, the amount of training required is sometimes weighted more heavily than the evidence base for a product. This is especially important for districts or schools experiencing high turnover because they have to retrain new hires. Implementation plans are often conceived of and executed at the district level with district employees providing the necessary training to school-level employees.

5. Strengths and challenges of procurement process. When asked what was working well, respondents pointed to extensive stakeholder engagement processes and the fact that multiple products were considered and compared before final decisions were made. When asked about what barriers they faced in the procurement process, respondents mentioned a lack of funds and a lack of time to conduct thorough stakeholder engagement processes. Respondents also described having a difficult time finding research that was applicable to their district context. In other words, studies were set in contexts that did not match their own.


Importance

Despite research funded by the U.S. Department of Education and other agencies documenting positive learning impacts of numerous education innovations, few EBPs have scaled at a level sufficient to produce wide-ranging impact over a sustained period. There is new urgency to adapt and scale adoption of EBPs that could help states, districts, schools, and institutions of higher education (IHEs) address the negative impacts of the COVID-19 pandemic on learning and educational equity, especially in areas where major gaps have emerged (e.g., literacy, behavior) and with populations disproportionately impacted (e.g., students with disabilities, multilingual learners, students of color, and students in low-income households). This study will generate knowledge about educator needs and barriers to the procurement and adoption of EBPs that can help educators overcome the obstacles they face, as well as support developers in designing and adapting their products with educator context and decisionmaking processes in mind.


References

Agostinelli, F., Doepke, M., Sorrenti, G., & Zilibotti, F. (2022). When the great equalizer shuts down: Schools, peers, and parents in pandemic times. Journal of Public Economics, 206, 104574.

Alpert, W. T., Couch, K. A., & Harmon, O. R. (2016). A randomized assessment of online learning. American Economic Review, 106(5), 378–382. https://www.aeaweb.org/articles?id=10.1257/aer.p20161057

Bauer, L., Clevenstine, V., Edelberg, W., Raczek, E., & Yee, W. (2021, October 14). Has COVID disrupted the postsecondary pipeline? Brookings Institution. https://www.brookings.edu/blog/up-front/2021/10/14/has-covid-disrupted-the-postsecondary-pipeline/

Benjamini, Y., & Hochberg, Y. (1995). Controlling the false discovery rate: A practical and powerful approach to multiple testing. Journal of the Royal Statistical Society: Series B (Methodological), 57(1), 289–300. https://doi.org/10.1111/j.2517-6161.1995.tb02031.x

Birkland, T. A. (2011). An introduction to the policy process: Theories, concepts, and models of public policy making. Routledge.

Cellini, S. R. (2021, August 13). How does virtual learning impact students in higher education? Brookings Institution. https://www.brookings.edu/blog/brown-center-chalkboard/2021/08/13/how-does-virtual-learning-impact-students-in-higher-education/

Cicchetti, D. V. (1994). Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychological Assessment, 6(4), 284–290. https://doi.org/10.1037/1040-3590.6.4.284

Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16, 297–334. https://doi.org/10.1007/BF02310555

Davis, F. D. (1993). User acceptance of information technology: System characteristics, user perceptions and behavioral impacts. International Journal of Man-Machine Studies, 38(3), 475–487. https://doi.org/10.1006/imms.1993.1022

EdTech Evidence Exchange. (2021, July). The EdTech Genome Project report. https://edtechevidence.org/wp-content/uploads/2021/07/1.-FINAL-EdTechGenomeProject-FinalReport_July2021-2.pdf

Engzell, P., Frey, A., & Verhagen, M. D. (2021). Learning loss due to school closures during the COVID-19 pandemic. Proceedings of the National Academy of Sciences, 118(17).

Farley-Ripple, E. N. (2012). Research use in school district central office decision making: A case study. Educational Management Administration & Leadership, 40(6), 786–806. https://doi.org/10.1177/1741143212456912

Farley-Ripple, E., May, H., Karpyn, A., Tilley, K., & McDonough, K. (2018). Rethinking connections between research and practice in education: A conceptual framework. Educational Researcher, 47(4), 235–245. https://doi.org/10.3102/0013189X18761042

Goldberg, S. B. (2021). Education in a pandemic: The disparate impacts of COVID-19 on America’s students. U.S. Department of Education, Office for Civil Rights. https://www2.ed.gov/about/offices/list/ocr/docs/20210608-impacts-of-covid19.pdf

Hammerstein, S., König, C., Dreisörner, T., & Frey, A. (2021). Effects of COVID-19 related school closures on student achievement: A systematic review. Frontiers in Psychology, 4020.

Hollands, F., & Escueta, M. (2020). How research informs educational technology decision-making in higher education: The role of external research versus internal research. Educational Technology Research and Development, 68(1), 163–180.

Kaufman, J. H., & Diliberti, M. K. (2021). Divergent and inequitable teaching and learning pathways during (and perhaps beyond) the pandemic: Key findings from the American Educator Panels Spring 2021 COVID-19 surveys. RAND Corporation. https://www.rand.org/pubs/research_reports/RRA168-6.html

Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159–174. PMID: 843571.

Maldonado, J. E., & De Witte, K. (2020). The effect of school closures on standardised student test outcomes. Discussion Paper Series, 20, 17. KU Leuven.

Martin, P. (2014). Improving ed-tech purchasing. Digital Promise.

Morrison, J. R., Ross, S. M., & Cheung, A. C. K. (2019). From the market to the classroom: How ed-tech products are procured by school districts interacting with vendors. Educational Technology Research and Development, 67(2), 389–421. http://dx.doi.org/10.1007/s11423-019-09649-4

Penuel, W. R., Briggs, D. C., Davidson, K. L., Herlihy, C., Sherer, D., Hill, H. C., Farrell, C., & Allen, A.-R. (2017, May). How school and district leaders access, perceive, and use research. AERA Open. https://doi.org/10.1177/2332858417705370

Polikoff, M. S., Campbell, S. E., Rabovsky, S., Koedel, C., Le, Q. T., Hardaway, T., & Gasparian, H. (2020). The formalized processes districts use to evaluate mathematics textbooks. Journal of Curriculum Studies, 52(4), 451–477.

Robbins, M. W., & Grant, D. (2020). RAND American Educator Panels technical description. RAND Corporation. https://www.rand.org/pubs/research_reports/RR3104.html

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). Free Press.

Tomasik, M. J., Helbling, L. A., & Moser, U. (2021). Educational gains of in‐person vs. distance learning in primary and secondary schools: A natural experiment during the COVID‐19 pandemic school closures in Switzerland. International Journal of Psychology, 56(4), 566–576.


Session specifications

Topic:
Curriculum planning & evaluation
Grade level:
PK-12
Audience:
Chief technology officers/superintendents/school board members, Curriculum/district specialists, Principals/head teachers
Attendee devices:
Devices useful
Attendee device specification:
Smartphone: Android, iOS, Windows
Laptop: Chromebook, Mac, PC
Tablet: Android, iOS, Windows
ISTE Standards:
For Education Leaders:
Visionary Planner
  • Engage education stakeholders in developing and adopting a shared vision for using technology to improve student success, informed by the learning sciences.
  • Evaluate progress on the strategic plan, make course corrections, measure impact and scale effective approaches for using technology to transform learning.
  • Communicate effectively with stakeholders to gather input on the plan, celebrate successes and engage in a continuous improvement cycle.