Dismantling the Data Ivory Tower: Empowering Teachers with Data-Driven Decision Making
Location: W179a, Table 1
Listen and learn: Research paper
Monday, June 25, 1:00–2:00 pm
Jay Manjunath, Jason Tourville, Amy Wang
Modern classrooms can and will produce incredible volumes of rich data. Such richness, when paired with machine learning and professional development, can bring data-driven decision making to the classroom and drive student success. We present our work doing so and its impacts throughout the school.
Attendee devices: Devices not needed
Participant accounts, software and other materials: None
Subject area: Career and technical education
ISTE Standards: For Administrators:
Disclosure: The submitter of this session has been supported by a company whose product is being included in the session.
Marsh, Pane, and Hamilton (2006) described data-driven decision making (DDDM) as the collection, analysis, and application of data from a variety of sources in order to improve student performance and address student learning needs. Light, Wexler, and Heinze (2005) similarly described effective DDDM as a six-step process: collecting data, organizing data, summarizing data, analyzing data, synthesizing data into useful information, and finally making effective decisions.
Traditionally, the technology and PD systems for classroom DDDM are developed separately from each other, usually with the technology built first and PD designed around it. This development pattern means that the resulting technology rarely satisfies educators' needs, forcing teacher training and professional development (PD) to serve the technology's requirements rather than the educator's (Koehler, Mishra, Hershey, & Peruski, 2004).
Rather than viewing DDDM as a two-part problem, this work treats bringing DDDM to the classroom as an ecosystem of technology and continuous PD. Creating this ecosystem requires close contact and constant feedback among technologists, coaches, teachers, and students. To that end, we reimagined the development of PD and our analytics dashboard not as two separate systems but as one whole system. Doing so opened our DDDM development to two framework possibilities.
The first was to use a modified design team approach. The design team approach, as originally conceptualized, aims to produce courses more closely aligned to the needs of students by re-imagining course design as a collaboration between the course creators (teachers) and their target audience (students) (Koehler, Mishra, Hershey, & Peruski, 2004). For us, the design team approach was modified to occur specifically among technologists, coaches, and teachers. Through constant contact among the three interacting groups throughout the development process, we hoped to achieve an instructor DDDM system built on seamless integration of technology and PD requirements.
Furthermore, we placed the design team approach within a lean startup method of iterative development. The lean startup method espouses product development through iterative and continuous experimentation (Ries, 2008). In this work, that meant starting each iteration with hypothesis-driven design, validating each iteration's hypothesis, reworking the next iteration's hypothesis based on the previous one's results, retesting, and finally expanding both development and system rollout when certain iteration goals were met.
The ultimate DDDM system was evaluated under an action research framework. Action research, as a framework for evaluating questions in education research, is a process that allows participants to study their problems scientifically in order to guide, correct, or modify their decisions and actions (Corey, 1953). Working with a partnering online school, we used action research to understand how our DDDM support was affecting the school's teachers and students. In doing so, we attempt to clarify how well our ML + PD system brings DDDM to the classroom, along with its resulting impact on both teacher perspectives of their work and student outcomes.
This project began with rigorous application of the lean startup methodology to iteratively create an analytics dashboard powered by machine learning (ML) and data mining, alongside a PD series, using multiple feedback loops with experimental teacher users. Development proceeded along three tracks. The first was a machine learning system that could rank students by academic riskiness. The second was a dashboard that displayed student information in addition to summary data points and prioritized students in ML-determined riskiness order. The third was a PD program that would guide teachers through their daily workflow with the dashboard. All three tracks were developed in conjunction and validated through prototyping against each track's field-standard metrics.
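The proposal does not specify the model or features behind the riskiness ranking, so the following is only an illustrative sketch of the idea: combine a few hypothetical student signals into a single risk score and sort students so the most at-risk surface first, as the dashboard does. The feature names and weights here are assumptions, not the study's actual model.

```python
# Hypothetical sketch of risk ranking; features and weights are illustrative only.
from dataclasses import dataclass


@dataclass
class StudentRecord:
    student_id: str
    attendance_rate: float    # fraction of sessions attended, 0.0-1.0
    missing_assignments: int  # count of unsubmitted assignments
    avg_grade: float          # running course average, 0-100


def risk_score(s: StudentRecord) -> float:
    """Combine signals into a single riskiness score (higher = more at risk)."""
    return (
        2.0 * (1.0 - s.attendance_rate)
        + 0.5 * s.missing_assignments
        + 1.5 * (1.0 - s.avg_grade / 100.0)
    )


def rank_by_risk(students: list[StudentRecord]) -> list[StudentRecord]:
    """Order students so the most at-risk appear first on the dashboard."""
    return sorted(students, key=risk_score, reverse=True)
```

In a production system the hand-tuned weights would be replaced by a trained classifier's predicted probabilities, but the ranking step itself stays the same.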
Once the three-track DDDM system reached its field-standard metric goals for full user rollout, it was iteratively rolled out to all teachers at a fully online high school. Complete rollout was finished by week three of the 2017-2018 academic year. Throughout the first half year of dashboard rollout, teacher education on the dashboard will continue and be iteratively improved. So far, this education has involved group lectures on dashboard navigation and usage. Teachers were given an innovation configuration map to refer to for on-the-fly usage. Furthermore, every three weeks, teachers met one on one with a data coach to check in on their development, get help with their current classes, and assess and adjust their further PD needs. At the same time, ML optimizations for predicting student riskiness will continue under an A/B testing scenario. We expect the most complete, testable DDDM system to be operational by February 6, 2018.
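The proposal does not describe how the A/B testing of ML model variants is implemented, so the sketch below shows one common approach under that assumption: a stable, hash-based split that keeps each student in the same experiment arm across sessions, so the two model variants can be compared on disjoint populations. The experiment name is hypothetical.

```python
# Illustrative sketch only: a deterministic hash-based A/B split.
# The experiment label "risk-model-v2" is a hypothetical placeholder.
import hashlib


def ab_arm(student_id: str, experiment: str = "risk-model-v2") -> str:
    """Deterministically assign a student to arm 'A' or 'B' for an experiment."""
    digest = hashlib.sha256(f"{experiment}:{student_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Hashing the experiment name together with the student ID means the same student can land in different arms of different experiments, while always staying in the same arm within one experiment.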
The fully testable DDDM system will be in use at the school throughout the 2017-2018 school year and will touch all students actively taking coursework. Evaluation of the system has already started; by the end of the academic year, the final DDDM system will be evaluated with data collected over 9 class sessions, encompassing over 50 teachers, around 450 courses, and close to 6,000 students.
Several types of data are being collected to evaluate the DDDM system's impact. Survey data are collected at periodic intervals to gauge teacher satisfaction and perceived ease of use for the dashboard and the dashboard training. Qualitative interviews are also performed on a regular basis, often during the last section of the one-on-one data sessions teachers have with their coaches. Student academic data and attendance data will be collected for all courses. After the evaluation period, the evaluation data will be compared against the school's historical performance from two years prior, controlling for course performance and teacher effects.
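The proposal does not spell out the statistical model used to control for course and teacher effects, so this is a minimal sketch of one simple way to do it: pair current and historical outcomes by (course, teacher) cell and average the per-cell differences, so that stable course difficulty and teacher effects cancel out of the comparison. The grouping keys and field layout are assumptions for illustration.

```python
# Minimal sketch of a within-(course, teacher) comparison against a
# historical baseline. The actual study's model may differ.
from collections import defaultdict


def mean(xs):
    return sum(xs) / len(xs)


def adjusted_effect(current, historical):
    """current/historical: lists of (course, teacher, score) tuples.

    Returns the average per-cell difference in mean score, over the
    (course, teacher) cells present in both cohorts.
    """
    cur, hist = defaultdict(list), defaultdict(list)
    for course, teacher, score in current:
        cur[(course, teacher)].append(score)
    for course, teacher, score in historical:
        hist[(course, teacher)].append(score)
    diffs = [mean(cur[k]) - mean(hist[k]) for k in cur if k in hist]
    return mean(diffs) if diffs else 0.0
```

A regression with course and teacher dummy variables would be the more standard way to estimate the same adjusted effect at scale; the cell-pairing version above just makes the logic explicit.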
Data collection and analysis of findings are expected to be completed over the 2017-2018 academic year. By the ISTE conference we will have enough numerical and qualitative data to understand how the developed DDDM system affects teacher perceptions of the system, their use of the system, and the full system’s impact on student outcomes.
At the time of writing this proposal, we have preliminary teacher impressions of the DDDM system. Survey results indicate that teachers find the dashboard easy to use (avg score = 2.09 out of 10) and easy to navigate (avg score = 1.63 out of 10). Qualitative analysis indicates a split among teachers in how much utility they get from the system. Some find the dashboard very helpful, with one writing "It makes it much easier to access data" and another "It follows along with what I am seeing in my classroom." At the same time, some teachers found the dashboard misaligned with their views of classroom needs, with one going so far as to say "I used the Dashboard to get individual student information to learn about the dashboard. Since the users' meeting, I have not used the dashboard."
The big data world has arrived and is here to stay. As the number of technological tools for the classroom grows, so do the possibilities of gleaning important insights across all parts of the educational endeavor. The challenge for the modern classroom is to take advantage of DDDM by harnessing the power of analytics and artificial intelligence while providing continued professional development to drive student success.
The work presented here illuminates one school's journey in bringing teacher-focused DDDM to the classroom. It will show how quickly, or slowly, the DDDM paradigm permeates an entire school and how that process affects student outcomes. In doing so, it will add to a limited but growing body of research on the place of DDDM in schools.
This work represents just the first step in an iterative process. The proposed DDDM system was optimized to understand student risk, not student interventions. In other words, it is a first-iteration system that helps teachers understand who needs help and when, but it does not tell teachers what sort of help would benefit a student most. Optimizing student interventions to improve student success is another exciting research topic that we hope to tackle in future iterations.
Aragon, S., & Johnson, E. (2008). Factors influencing completion and noncompletion of community college online courses. The American Journal of Distance Education, 22, 146-158.
Celio, M. B., & Harvey, J. (2005). Buried treasure: Developing an effective management guide from mountains of educational data. Seattle, WA: Center on Reinventing Public Education.
Chapman, P. (1999). The CRISP-DM user guide.
Corey, S. M. (1953). Action research to improve school practices. New York: Teachers College Press.
Ingram, D., Louis, K. S., & Schroeder, R. G. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258-1287. DOI: 10.1111/j.1467-9620.2004.00379.x
Koehler, M. J., Mishra, P., Hershey, K., & Peruski, L. (2004). With a little help from your students: A new model for faculty development and online course design. Journal of Technology and Teacher Education, 12(1), 25-55.
Light, D., Wexler, D.H., & Heinze, J. (2005, March). Keeping teachers in the center. A framework of data-driven decision-making. A paper presented at the annual meeting of the Society for Information Technology and Teacher Education, Phoenix, AZ.
Mandinach, E.B., & Gummer, E. (2013). A systematic view of implementing data literacy in educator preparation. Educational Researcher, 42(1), 30-37. doi: 10.3102/0013189X12459803
Marsh, J.A., Pane, J.F., & Hamilton, L.S. (2006). Making sense of data driven decision making in education. Santa Monica, CA: RAND Corporation.
Popham, W. J. (1987). The merits of measurement-driven instruction. Phi Delta Kappan, 68, 679-682.
Popham, W. J., Cruse, K. I., Rankin, S. C., Sandifer, P. D., & Williams, P. L. (1985). Measurement-driven instruction: It's on the road. Phi Delta Kappan, 66, 628-634.
Ries, E. (2008). The lean startup. Startup Lessons Learned. startuplessonslearned.com.
Schifter, C. C., Natarajan, U., Ketelhut, D. J., & Kirchgessner, A. (2014). Data-Driven Decision Making: Facilitating Teacher Use of Student Data to Inform Classroom Instruction. Contemporary Issues in Technology and Teacher Education, 14(4).
Staples, A., Pugach, M. C., & Himes, D. J. (2005). Rethinking the technology integration challenge: Cases from three urban elementary schools. Journal of Research on Technology in Education, 37(3), 285-311.