ISTE20 Live

The Design and Evaluation of Protocols to Support Systemic Innovation

Listen and learn: Research paper
Roundtable presentation


Tuesday, June 25, 2:45–3:45 pm
Location: 121AB, Table 4

Presentation 3 of 4
Other presentations:
Integrative STEM for PK-4 Preservice Teachers
Improving Confidence and Competence With STEM Content and Pedagogy
Diagnosing Disease Inside a Cell: Collaborative Learning Environments in VR

Dr. Beth Holland  
Learn about a researcher's effort to design a digital toolkit to improve communication, develop shared language to describe innovation and increase districts' capacity for organizational learning. Though the digital design failed, the protocols inherent in the tools proved useful to school and district leaders. Learn what did and didn't work.

Audience: Chief technology officers/superintendents/school board members, Teacher education/higher ed faculty, Principals/head teachers
Attendee devices: Devices not needed
Focus: Leadership
Topic: Educational policy and leadership
ISTE Standards: For Administrators:
Systemic Improvement
  • Lead purposeful change to maximize the achievement of learning goals through the appropriate use of technology and media-rich resources.
Visionary Leadership
  • Inspire and facilitate among all stakeholders a shared vision of purposeful change that maximizes use of digital age resources to meet and exceed learning goals, support effective instructional practice, and maximize performance of district and school leaders.
For Education Leaders:
Visionary Planner
  • Build on the shared vision by collaboratively creating a strategic plan that articulates how technology will be used to enhance learning.

Proposal summary

Framework

When organizations lack a common understanding to support a new initiative, reform movements often result in sporadic efforts rather than systemic change (Evans, Thornton, & Usinger, 2012). Achieving systemic innovation of classroom practice to prepare students for the knowledge economy requires districts to improve communication between the layers of their organization so that members ultimately build collective knowledge, a trait associated with Organizational Learning Communities (OLCs) (Senge & Kim, 2013).

According to Senge and Kim (2013), three interrelated activities promote organizational learning: theory-building, practice, and capacity-building. In the study, the process of developing common language to support innovation of classroom practice served as theory-building. Practice occurred when participants engaged in sociocultural activities by using the digital toolkit as they communicated and collaborated with colleagues. Finally, improved quantity and quality of communication resulting from the strengthening of social networks (Daly & Finnigan, 2010; Frank, Zhao, & Borman, 2004; Umekubo et al., 2015) was intended to increase organizational learning capability (Goh, Quon, & Cousins, 2007) and to serve as an indicator of capacity-building.

Therefore, organizational learning (Senge, 1990, 2006) served as the theoretical framework, which proposed that the following would occur. Participants would engage in sociocultural activities (practice) when using the digital toolkit. These activities were intended to increase the quantity and quality of communication between central office and building leaders, strengthen social ties, and support the creation of common language to describe innovation of classroom practice (theory-building). As a result of this communication and language construction, the districts would engage in organizational learning (capacity-building).

Methods

Oftentimes in educational research, intervention studies do not take the variability of context into account (LeMahieu, Edwards, & Gomez, 2015). Therefore, the researcher used a variant of an embedded mixed-methods design and conducted a multi-site, explanatory case study that included collection and analysis of both qualitative and quantitative data (Creswell & Plano Clark, 2011). Frequently employed to evaluate school innovations, multi-site explanatory case studies provide rich descriptions and deep explanations on which to base inferences (Martinson & O'Brien, 2010).

With a multi-site case study design, researchers frequently use purposive sampling strategies to specifically address the research questions (Martinson & O’Brien, 2010). The intervention study occurred in three purposefully selected K-12 districts in the Northeastern U.S.: Bayview and Hilltop from North state, and Bridgetown from South state (pseudonyms). Purposeful sampling permits the selection of groups of participants to establish comparisons between cases or subgroups (Teddlie & Yu, 2007). It also allows for the intentional selection of participants because of the existence of a central phenomenon (Creswell & Plano Clark, 2011). Within the context of this intervention study, each district had previously created leadership teams specifically to address the challenge of systemic innovation and had made significant investments in technology as well as professional development, yet had not achieved systemic diffusion of its strategic vision.

In this intervention study, a secondary process evaluation assessed the fidelity of the program implementation and provided an explanation for the results of the outcome evaluation. Since the process evaluation questions included the frequency and quality of interactions that participants had with the digital toolkit, the data informed the assessment of the following outcome questions:

- RQ1: To what degree did using the digital resources affect the organizational learning capacity of the districts?
- RQ2: How did the language used by participants to describe innovative classroom practice to prepare students for the knowledge economy change as a result of using the resources?
- RQ3: How did engaging in the sociocultural activities with the resources affect communication between the participants within their districts?

Results

The three districts that participated in the intervention had similar demographics, yet they possessed distinctly different characteristics that affected implementation. To accommodate the districts’ schedules, union requirements, and internal power dynamics, the researcher modified the design of the program and the toolkit to encourage participation. Though these changes affected the intervention fidelity, adapting the program to the realities of the context in each district afforded an opportunity to account for variability (LeMahieu et al., 2015). The rich descriptions from the multiple case studies then allowed the researcher to examine cause-and-effect relationships within each district (Martinson & O’Brien, 2010).

To address the first research question, the Organizational Learning Survey (OLS) (Goh & Richards, 1997) measured changes in organizational learning capacity through pre- and post-tests. The quantitative survey data, analyzed with the nonparametric Wilcoxon Signed-Rank Test, did not reveal any significant changes across the districts.
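
For readers who want to see the shape of such a paired pre/post comparison, the following minimal Python sketch uses SciPy's signed-rank test. The scores are invented for illustration only; they are not the study's OLS data.

```python
# Hypothetical sketch of a paired pre/post Wilcoxon Signed-Rank comparison.
# The scores below are invented; they are NOT the study's actual survey data.
from scipy.stats import wilcoxon

# Paired mean survey scores (one pre/post pair per participant)
pre  = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 4.2, 3.5, 4.9, 4.0]
post = [4.3, 3.7, 5.1, 4.6, 3.8, 4.9, 4.1, 3.6, 5.0, 4.2]

# The signed-rank test ranks the paired differences rather than assuming
# they are normally distributed, which suits small-sample Likert-style data.
stat, p = wilcoxon(pre, post)
print(f"W = {stat}, p = {p:.3f}")
```

A nonsignificant result (p > .05) on such a test would mirror the study's finding of no measurable change in organizational learning capacity.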

Next, the researcher looked for changes in the language used to describe innovation of classroom practice. Although the pre- and post-test surveys collected qualitative statements asking participants to define innovation, the researcher chose to examine the qualitative data collected via the process evaluation because of a low response rate and high participant attrition. Participants often used symbolic language that created an appearance of innovation (Bolman & Deal, 2008) without defining the desired change or describing how it might be implemented. Qualitative analysis of data collected during face-to-face meetings and through the digital toolkit also revealed that few participants engaged in conversations about classroom practice.

Finally, to address whether engaging in the sociocultural activities with the digital toolkit affected the quantity and quality of communication between the participants within their districts, the researcher examined both the sociograms generated from the social network data collected with the School Staff Social Network Questionnaire (SSSNQ; Pitts & Spillane, 2009) and the qualitative data from the process evaluation. Ultimately, given the high rates of participant attrition, the post-test data did not show a significant change in quantity or quality of communication. However, qualitative observations allowed the researcher to better understand the communication and power dynamics within each district.
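
For readers unfamiliar with sociogram metrics, one common way to quantify a pre/post change in communication ties is network density, sketched below in plain Python. The roster and ties are invented; the SSSNQ is a survey instrument, and none of the study's actual network data appear here.

```python
# Hypothetical sketch of a pre/post network-density comparison.
# The roster and ties below are invented, not drawn from the study.

def density(nodes, edges):
    """Directed network density: observed ties / possible ties."""
    possible = len(nodes) * (len(nodes) - 1)
    return len(edges) / possible if possible else 0.0

leaders = {"superintendent", "principal_a", "principal_b", "coach_1", "coach_2"}

# Advice-seeking ties reported on a survey (sender -> receiver)
pre_ties  = {("principal_a", "superintendent"), ("coach_1", "principal_a")}
post_ties = pre_ties | {("coach_2", "coach_1"), ("principal_b", "superintendent")}

print(f"pre density:  {density(leaders, pre_ties):.2f}")   # 2 of 20 possible ties
print(f"post density: {density(leaders, post_ties):.2f}")  # 4 of 20 possible ties
```

A rising density would indicate more communication ties among the same leaders; in the study, attrition left the post-test data too sparse to show such a change.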

Though the process evaluation indicated that little use of the digital resources occurred as intended, qualitative observations revealed that participants used components of the resources either in a different format or as a verbal protocol to engage in the sociocultural activities. For example, both the Director of RTI and the Elementary ELL Coordinator in Bayview used the Polarity Map to engage in joint work with colleagues (Honig, 2012; Honig & Rainey, 2014). In another instance, the elementary coordinators and one of the elementary principals engaged in joint work and boundary-spanning while using the Think-Feel-Care resource as a protocol. While participants in Bridgetown rarely collaborated using the digital resources, responses to the prompts within the Essential Improvements tool revealed that the DLT coaches started to connect with their principals as well as with each other. This final revelation indicated not only the presence of joint work (Honig, 2012; Honig & Rainey, 2014) but also boundary-spanning, the process of communicating across the layers of a hierarchy (Swinnerton, 2007).

Importance

Treatment theory describes the relationship between the inputs, activities, and outcomes of an intervention program (Leviton & Lipsey, 2007). According to the treatment theory for this intervention, use of the digital toolkit was intended to encourage the sociocultural activities of joint work, boundary-spanning, and brokering (Honig, 2008; 2012; Honig & Rainey, 2014; Swinnerton, 2007). Consequently, the districts would improve their communication, develop shared language to describe innovation of classroom practice to prepare students for the knowledge economy, and increase their capacity for organizational learning. Though the outcome evaluation did not reveal a significant change in communication, language, or capacity for organizational learning, the researcher attributes this to the design of the intervention rather than to the treatment theory.

Qualitative data revealed that participants required additional modeling and support to make effective use of the digital resources, even though the resources contained videos and descriptions to provide just-in-time training, per recommendations from the professional development literature (Dede, Ketelhut, Whitehouse, Breit, & McCloskey, 2008; Koehler & Mishra, 2005; Richardson, Flora, & Bathon, 2013; Rienties, Brouwer, & Lygo-Baker, 2013). In all three districts, the researcher had occasional opportunities to model the use of the digital resources with participants. After these sessions, participants commented that they could not have completed the thinking required by a given digital tool without the researcher’s assistance and the presence of an objective facilitator.

Systemic change requires both the testing of ideas and ongoing learning through rapid cycles of inquiry (Perla, Provost, & Parry, 2013). Though the researcher used a multisite case study as a variant on an embedded mixed-methods design (Creswell & Plano Clark, 2011) to measure the effects of this intervention, future studies might consider design-based research strategies as an alternative in order to account for the variability of educational environments, study learning in context, and address the complexities of the real world in practice (Collins, Joseph, & Bielaczyc, 2004). Using both quantitative and qualitative methods, design-based researchers observe components of a design in context (Collins et al., 2004).

Additionally, design-based research encourages collaboration with participants and facilitates ongoing improvement (Penuel, Fishman, Haugan Cheng, & Sabelli, 2011). During the intervention, the researcher modified the digital resources as well as the program to meet the needs of the participants and encourage participation. However, the participants did not feel as though they had ownership of the intervention; instead, they viewed it as an external reform introduced by the researcher. Therefore, by engaging the participants in cycles of improvement, such as the Plan-Do-Study-Act cycles promoted by improvement science (Cohen-Vogel et al., 2015), the digital resources and other intervention components could be developed iteratively and collaboratively with participants; prototyped and tested under varying conditions; and then refined to match the unique context and cultures of the districts.

When innovations successfully diffuse within the ecosystem of an organization, change agents help to develop the need for change, translate intention into action, and encourage adoption through social learning and modeling (Rogers, 2004b). Because the participants in the researcher’s intervention never had that opportunity, many did not feel as though they could adapt the digital resources to their specific context. If the ultimate goal is systemic innovation of classroom practice through the development of shared language and organizational learning, future studies should thus consider a more iterative, user-centered, design-based approach rather than the application of a single intervention (Bannan-Ritland, 2003).

References

Bannan-Ritland, B. (2003). The role of design in research: The integrative learning design framework. Educational Researcher, 32(1), 21–24. doi:10.3102/0013189X032001021

Bolman, L. G., & Deal, T. E. (2008). Reframing organizations: Artistry, choice, and leadership (4 ed). San Francisco: Jossey-Bass.

Cohen-Vogel, L., Tichnor-Wagner, A., Allen, D., Harrison, C., Kainz, K., Socol, A. R., & Wang, Q. (2015). Implementing educational innovations at scale: Transforming researchers into continuous improvement scientists. Educational Policy, 29(1), 257–277. doi:10.1177/0895904814560886

Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences, 13(1), 15–42. doi:10.1207/s15327809jls1301_2

Creswell, J.W., & Plano Clark, V.L. (2011). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.

Daly, A. J., & Finnigan, K. S. (2010). A bridge between worlds: Understanding network structure to understand change strategy. Journal of Educational Change, 11(2), 111–138. doi:10.1007/s10833-009-9102-5

Daly, A. J., Finnigan, K. S., Moolenaar, N. M., & Che, J. (2014). The critical role of brokers in the access and use of evidence at the school and district level. In K.S. Finnigan & A. J. Daly (Eds.), Beginning the Journey: Research Evidence from the Schoolhouse Door to Capitol Hill (pp. 13-32). New York: Springer.

Daly, A. J., Moolenaar, N. M., Bolivar, J. M., & Burke, P. (2010). Relationships in reform: the role of teachers' social networks. Journal of Educational Administration, 48(3), 359–391. doi:10.1108/09578231011041062

Dede, C., Ketelhut, D. J., Whitehouse, P., Breit, L., & McCloskey, E. M. (2008). A research agenda for online teacher professional development. Journal of Teacher Education, 60(1), 8–19. doi:10.1177/0022487108327554

Dusenbury, L., Brannigan, R., & Falco, M. (2003). A review of research on fidelity of implementation: implications for drug abuse prevention in school settings. Health Education Research, 18(2), 237–256. doi:10.1093/her/18.2.237

Evans, L., Thornton, B., & Usinger, J. (2012). Theoretical frameworks to guide school improvement. NASSP Bulletin, 96(2), 154–171. doi:10.1177/0192636512444714

Frank, K. A., Zhao, Y., & Borman, K. (2004). Social capital and the diffusion of innovations within organizations: The case of computer technology in schools. Sociology of Education, 77(2), 148–171. doi:10.1177/003804070407700203

Goddard, R. D., Goddard, Y., Sook Kim, E., & Miller, R. (2015). A theoretical and empirical analysis of the roles of instructional leadership, teacher collaboration, and collective efficacy beliefs in support of student learning. American Journal of Education, 121(4), 501–530. doi:10.1086/681925

Goh, S., & Richards, G. (1997). Benchmarking the learning capability of organizations. European Management Journal, 15(5), 575–583. doi:10.1016/S0263-2373(97)00036-4

Goh, S., Quon, T. K., & Cousins, J. B. (2007). The organizational learning survey: A re-evaluation of unidimensionality. Psychological Reports, 101(3 I), 707–721. doi:10.2466/PR0.101.3.707-721

Honig, M. L. (2008). District central offices as learning organizations: How sociocultural and organizational learning theories elaborate district central office administrators' participation in teaching and learning improvement efforts. American Journal of Education, 114(4), 627–664. doi:10.1086/589317

Honig, M. L. (2012). District central office leadership as teaching: How central office administrators support principals' development as instructional leaders. Educational Administration Quarterly, 48(4), 733–774. doi:10.1177/0013161X12443258

Honig, M. L., & Rainey, L. R. (2014). Central office leadership in principal professional learning communities: The practice beneath the policy. Teachers College Record, 116(4), 1–48.

Honig, M., Venkateswaran, N., McNeil, P., & Twitchell, J. M. (2014). Leaders' use of research for fundamental change in school district central offices: Processes and challenges. In K.S. Finnigan & A. J. Daly (Eds.), Beginning the Journey: Research Evidence from the Schoolhouse Door to Capitol Hill (pp. 33-52). New York: Springer.

Koehler, M. J., & Mishra, P. (2005). Teachers learning technology by design. Journal of Computing in Teacher Education, 21(3), 94–102. doi:10.1080/10402454.2005.10784518

LeMahieu, P. G., Edwards, A. R., & Gomez, L. M. (2015). At the nexus of improvement science and teaching: Introduction to a special section of the journal of teacher education. Journal of Teacher Education, 66(5), 446–449. doi:10.1177/0022487115602125

Leviton, L. C., & Lipsey, M. W. (2007). A big chapter about small theories: Theory as method: Small theories of treatments. New Directions for Evaluation, 2007(114), 27–62. doi:10.1002/ev.224

Levy, F., & Murnane, R. J. (2013). Dancing with Robots. Retrieved from http://s3.amazonaws.com/content.thirdway.org/publishing/attachments/files/000/000/056/Dancing-With-Robots.pdf?1412360045

Martinez, M. R., McGrath, D. R., & Foster, E. (2016). How deeper learning can create a new vision for teaching. Retrieved from National Commission on Teaching & America's Future web site: http://nctaf.org/wp-content/uploads/2016/02/NCTAF-ConsultEd_How-Deeper-Learning-Can-Create-a-New-Vision-for-Teaching.pdf

Martinson, K., & O'Brien, C. (2010). Conducting case studies. In J. Wholey, H. Hatry, & K. Newcomer (Eds.), Handbook of practical program evaluation (pp. 125-143). San Francisco, CA: Jossey-Bass.

McLendon, M. K., Cohen-Vogel, L., & Wachen, J. (2015). Understanding education policy making and policy change in the American states. In B. S. Cooper, J. G. Cibulka, & L. D. Fusarelli (Eds.), Handbook of Education Politics and Policy (2nd ed., pp. 1–34). New York: Routledge.

Meyer, H.-D. (2006). Gauging the prospects for change. In H.-D. Meyer & B. Rowan (Eds.), The new institutionalism in education (pp. 217–223). Albany, NY: SUNY Press.

Penuel, W. R., Riel, M., Krause, A., & Frank, K. A. (2009). Analyzing teachers' professional interactions in a school as social capital: A social network approach. Teachers College Record, 111(1), 124–163. Retrieved from https://www.researchgate.net/profile/Kenneth_Frank/publication/234609619_Analyzing_Teachers'_Professional_Interactions_in_a_School_as_Social_Capital_A_Social_Network_Approach/links/02e7e52cc07a74aa6f000000.pdf

Perla, R. J., Provost, L. P., & Parry, G. J. (2013). Seven propositions of the science of improvement. Quality Management in Health Care, 22(3), 170–186. doi:10.1097/QMH.0b013e31829a6a15

Pitts, V. M., & Spillane, J. P. (2009). Using social network methods to study school leadership. International Journal of Research & Method in Education, 32(2), 185–207. doi:10.1080/17437270902946660

Richardson, J. W., Flora, K., & Bathon, J. (2013). Fostering a school technology vision in school leaders. NCPEA International Journal of Educational Leadership Preparation, 8(1), 1–18. Retrieved from http://www.ncpeapublications.org/attachments/article/560/Richardson.pdf

Rienties, B., Brouwer, N., & Lygo-Baker, S. (2013). The effects of online professional development on higher education teachers' beliefs and intentions towards learning facilitation and technology. Teaching and Teacher Education, 29, 122–131. doi:10.1016/j.tate.2012.09.002

Rogers, E. M. (2004a). A prospective and retrospective look at the diffusion model. Journal of Health Communication, 9(sup1), 13–19. doi:10.1080/10810730490271449

Rogers, E. M. (2004b). Diffusion of innovations (3rd ed.). London: The Free Press.

Senge, P. M., & Kim, D. H. (2013). From fragmentation to integration: Building learning communities. Reflections. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&db=buh&AN=91705234&site=ehost-live&scope=site

Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin.

Soulé, H., & Warrick, T. (2015). Defining 21st century readiness for all students: What we know and how to get there. Psychology of Aesthetics, Creativity, and the Arts, 9(2), 178–186. doi:10.1037/aca0000017

Spillane, J. P., Kenney, A. W., & Kim, C. M. (2012). An exploratory analysis of formal school leaders' positioning in instructional advice and information networks in elementary schools. American Journal of Education, 119(1), 73–102. doi:10.1086/667755

Swinnerton, J. (2007). Brokers and boundary crossers in an urban school district: Understanding central-office coaches as instructional leaders. Journal of School Leadership, 17(2), 195–221. Retrieved from https://journals.rowman.com

Teddlie, C., & Yu, F. (2007). Mixed methods sampling: A typology with examples. Journal of Mixed Methods Research, 1(1), 77–100. doi:10.1177/2345678906292430

Umekubo, L. A., Chrispeels, J. H., & Daly, A. J. (2015). The cohort model: Lessons learned when principals collaborate. Journal of Educational Change, 16(4), 451–482. doi:10.1007/s10833-015-9256-2

Weeres, J., & Kerchner, C. (1996). This time it's serious: Post-industrialism and the coming institutional change in education. In R. Crowson, W. Boyd, & H. Mahwhinney (Eds.), The politics of education and the new institutionalism: Reinventing the American school (pp. 135– 152). Bristol, PA: Falmer Press.


Presenters

Dr. Beth Holland, University of Rhode Island

Dr. Beth Holland is a writer, speaker, and professional learning instructor. She has spent the past 20 years working in and around K-12 education and writes regularly for Edutopia as well as EdTech Researcher at Education Week. Beth holds an Education Doctorate (EdD) in Entrepreneurial Leadership in Education from Johns Hopkins University, an Education Master's (EdM) in Technology, Innovation & Education from Harvard University, and a Bachelor's degree (B.S.) in Communication Studies from Northwestern University.
