A Framework for Districts to Measure and Improve Their Instructional Technology Use

Listen and learn: Research paper
Roundtable presentation

Lisa Jobson  
Zareen Kasad  
John Seylar  
Dr. Mahsa Bakhshaei  
Dr. Christina Luke  
Kristin Kahlich  
Dr. Courtney Teague  

To help districts measure impactful technology use by their teachers and students, researchers, educators and administrators have been developing a rubric and an associated survey. Learn about the rubric, its development process and suggestions for its use, and hear from practitioners about their experiences using the rubric and survey.

Audience: Chief technology officers/superintendents/school board members, Coaches, Professional developers
Attendee devices: Devices not needed
Topic: Professional learning
Grade level: PK-12
ISTE Standards: For Coaches:
Professional Development and Program Evaluation
  • Evaluate results of professional learning programs to determine the effectiveness on deepening teacher content knowledge, improving teacher pedagogical skills and/or increasing student learning.
For Education Leaders:
Visionary Planner
  • Evaluate progress on the strategic plan, make course corrections, measure impact and scale effective approaches for using technology to transform learning.
For Educators:
Learner
  • Set professional learning goals to explore and apply pedagogical approaches made possible by technology and reflect on their effectiveness.
Disclosure: The submitter of this session has been supported by a company whose product is being included in the session

Proposal summary

Framework

Several technology integration models (e.g., TPACK, SAMR, the Technology Integration Matrix and the ISTE Standards) already exist and are used in many districts, either in their original form or with adjusted format or content. However, none of these models encompasses all aspects of learning. Some address only the use of technology in teaching practice (e.g., TPACK), while others focus on technology integration without a clear connection to student learning. Still other models are limited to specific aspects of student learning (e.g., the Triple E Framework measures how significantly technology affects student engagement and learning without regard for other dimensions of learning, such as global skills).
We offer an agnostic model that encompasses the common criteria of the most popular existing models focused on using technology to build the skills and knowledge students need to succeed in school and in life. An agnostic model creates unity out of diversity.

Districts often employ a staggered release when introducing access to technology tools and relevant training. For example, they may begin with their elementary schools and then progress to their middle schools, or they may start with all their high schools. In these cases, some schools may be in the early stages of leveraging technology for teaching and learning, where simpler models may seem more appropriate, whereas other schools may be at more advanced stages and require more in-depth models.

An agnostic model means that frameworks can vary within a district, and even within a school; as long as the common goal is enhancing student learning, the agnostic model allows district leaders to compare and aggregate data and to lead the whole district toward that goal. It also facilitates continuous, cohesive tracking of common goals, and it is more likely to provide a consistent, seamless measurement system within a district when leadership changes.
We believe that every student should possess strong content mastery as well as the skills and knowledge needed to succeed in work and life in a global, interconnected and constantly changing world (i.e., global skills). Therefore, our goal is to provide districts with a framework to evaluate:
  • The extent to which their students use technology in ways that enhance their engagement and learning.
  • The extent to which their students use technology in ways that enhance their global skills.
We also know that teachers have always held the key to student success. Therefore, we aim to provide districts with a framework to evaluate:
  • The extent to which their teachers use technology in ways that enhance their teaching practices.

Methods

To create this framework, we started by identifying the most popular models for technology integration in teaching and learning. Eight frameworks were identified, which fell into three categories:
Classroom-facing frameworks: These frameworks provide guidance on how technology should be incorporated into the design of teaching and learning activities. Two popular frameworks fall under this category: SAMR and T3.
Teacher-facing frameworks: These frameworks provide guidance on how educators incorporate technology into specific aspects of their instruction. Three popular frameworks fall under this category: TPACK, the UNESCO ICT Framework and the ISTE Standards for Teachers.
Student-facing frameworks: These frameworks provide guidance on how educators incorporate technology into their instruction in ways that meet students' learning goals. Three popular frameworks fall under this category: TIM, Triple E and the ISTE Standards for Students.
Because our goal centers on enhancing student engagement and learning, we focused on the teacher-facing and student-facing frameworks that share this goal (i.e., TPACK, the UNESCO ICT Framework, TIM, Triple E and the ISTE Standards for Teachers and Students). Through a systematic review (Khan et al., 2003) of these frameworks, we sought to understand which indicators intersect across them. In mapping the rubric domains, we identified overarching themes to explore in greater detail. In the next phase, we conducted an international literature review to define each indicator and its associated scales.
Once researchers drafted the first version of the rubric, following established rules for the design of educational rubrics for student learning (Van Leusen, 2013; Brookhart, 2013; Nitko & Brookhart, 2007; Popham, 2000), we conducted a validation study using the expert technique with both researchers and practitioners (Steurer, 2011). Eight teachers and ten district leaders from public districts across the country were interviewed (using user-testing methods) to share their thoughts and feedback. The teacher interviewees represented a variety of subjects (e.g., mathematics and history) and grade levels, and had experience integrating meaningful technology use into their practice. The district interviewees were curriculum or technology leaders, or held leadership positions within their districts or instructional technology departments. The researcher interviewees were experts in instructional technology with in-depth experience in survey design.

Results

We are now finalizing our validation study. In three weeks, we will run a survey associated with the rubric in six elementary, middle and high schools of different sizes in rural, suburban and urban areas. We will run a factor analysis on the data to see the extent to which the rubric's measures demonstrate strong reliability (Swisher, Beckstead, & Bebeau, 2004). We will upload the findings for the review committee for a final round of iterative feedback before finalizing both the rubric and the survey.
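The proposal does not specify the analysis tooling, but a reliability check that commonly accompanies factor analysis of survey scales is Cronbach's alpha computed over item responses. The sketch below illustrates the idea on simulated Likert-style data; the data, scale size and thresholds are hypothetical, not drawn from the study:

```python
import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    responses = np.asarray(responses, dtype=float)
    n_items = responses.shape[1]
    item_vars = responses.var(axis=0, ddof=1)      # variance of each item
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of total scores
    return (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

# Simulated 1-5 Likert responses for a hypothetical 4-item rubric scale:
# items share a latent trait plus item-level noise, so alpha should be high.
rng = np.random.default_rng(0)
latent = rng.normal(3, 1, size=(200, 1))
items = np.clip(np.round(latent + rng.normal(0, 0.7, size=(200, 4))), 1, 5)
print(f"alpha = {cronbach_alpha(items):.2f}")
```

Values near 1 indicate that the scale's items measure a common construct; a conventional rule of thumb treats alpha above roughly 0.7 as acceptable internal consistency.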

Importance

The purpose of this research session is to share the process of developing, user testing and validating a rubric that educators can use to self-assess their growth in classroom technology use. We incorporated the rubric items into a survey study to better understand teachers' perceptions of their ability to engage students in impactful technology use. We found that teachers' reports of the frequency of actual classroom activities, as captured by the newer version of the rubric, were better predictors of impact than teacher self-ratings, and were also more closely tied to participation in the coaching program. The resulting rubric and surveys can help educators and researchers reflect on which technology uses are most impactful in specific grades and subjects, where teachers think students need help, and how to increase impact through coaching and the use of rubrics.

References

Brookhart, S. M. (2013). How to create and use rubrics for formative assessment and grading. Alexandria, VA: Association for Supervision & Curriculum Development.
Darling-Hammond, L., Zielezinski, M. B., & Goldman, S. (2014). Using Technology to Support At-Risk Students’ Learning. Stanford Center for Opportunity Policy in Education. Retrieved from https://edpolicy.stanford.edu/sites/default/files/scope-pub-using-technology-report.pdf
Florida Center for Instructional Technology. (2019). TIM Evaluation Tools. Retrieved from https://fcit.usf.edu/matrix/evaluation-tools/
International Society for Technology in Education. (2000). ISTE national educational technology standards (NETS). Eugene, OR: International Society for Technology in Education.
Khan, K. S., Kunz, R., Kleijnen, J., & Antes, G. (2003). Five steps to conducting a systematic review. Journal of the Royal Society of Medicine, 96(3), 118-121.
Koehler, M. & Mishra, P. (2009). What is Technological Pedagogical Content Knowledge (TPACK)?. Contemporary Issues in Technology and Teacher Education, 9(1), 60-70. Waynesville, NC USA: Society for Information Technology & Teacher Education.
Kolb, L. (2011). Triple E-Framework. Retrieved from https://www.tripleeframework.com/lesson-analysis-on-making-predictions.html.
Magana, A. J. (2019). Disruptive Classroom Technologies. Oxford Research Encyclopedia of Education.
Nitko, A. J., & Brookhart, S. M. (2007). Educational Assessment of Students (5th ed.). Upper Saddle River, NJ: Pearson Education.
OECD (2019). TALIS 2018 Results (Volume I): Teachers and School Leaders as Lifelong Learners, TALIS, OECD Publishing, Paris, https://doi.org/10.1787/1d0bc92a-en.
Popham, W. J. (2000). Modern educational measurement: Practical guidelines for educational leaders (3rd ed.). Boston: Allyn and Bacon.
Puentedura, R. R. (2013, May 29). SAMR: Moving from enhancement to transformation [Web log post]. Retrieved from http://www.hippasus.com/rrpweblog/archives/000095.html
PwC (2018). Technology in US schools: Are we preparing our kids for the jobs of tomorrow? Retrieved from https://www.pwc.com/us/en/about-us/corporate-responsibility/library/preparing-students-for-technology-jobs.html
Roland, J. (2009). The Best of Learning & Leading with Technology: Selections from Volumes 31-35. International Society for Technology in Education.
Steurer, J. (2011). The Delphi method: an efficient procedure to generate knowledge. Skeletal Radiol, 40(8), 959–61.
Swisher, L. L., Beckstead, J. W., & Bebeau, M. J. (2004). Factor analysis as a tool for survey analysis using a professional role orientation inventory as an example. Physical Therapy, 84(9), 784-799.
UNESCO. (2018). UNESCO ICT Competency Framework for Teachers: Version 3. Paris: UNESCO.
US Department of Education. (2017). Reimagining the role of technology in education: 2017 National Education Technology Plan update. Office of Educational Technology. Washington, D.C.
Van Leusen, P. (2013). Assessments with rubrics. ASU TeachOnline. Retrieved from https://teachonline.asu.edu/2013/08/assessments-with-rubrics/


Presenters

Dr. Mahsa Bakhshaei, Digital Promise

Mahsa has more than 10 years of research, evaluation and teaching experience in the field of education, working with academia, governments, and nonprofit organizations in the U.S. and Canada. Her research focuses on coaching and socio-educational development of immigrant-origin children and English Learners, and she has authored several journal articles, books, and book chapters on these topics. Mahsa’s research on teacher coaching recently received a best-paper award from the Society for Information Technology & Teacher Education (SITE) International Conference.

Dr. Christina Luke, Digital Promise
Kristin Kahlich, Google
