Helping Teachers Use Game Play Data for Formative Assessment and Differentiation: Concurrent presentation 2 of 3

Listen and learn: Research paper

Tuesday, June 27, 11:45 am–12:45 pm
Location: Room 217A, Table 2

Dr. James Diamond, Heather Kim
We will present findings from a pilot study with six middle school science teachers who used a video game about argumentation for 1–2 weeks in class. Our discussion will focus on how teachers interpreted game play data to make inferences about student learning and make changes to instruction.

Skill level: Beginner
Attendee devices: Devices useful
Attendee device specification: Tablet: iOS
Participant accounts, software and other materials: OPTIONAL 1: Participants may choose to download the free game—Mars Generation One Argubot Academy—from the iTunes App Store. The game can only be played on iOS devices.

OPTIONAL 2: Participants may choose to view examples of the "teacher data dashboard" here: https://www.glasslabgames.org/games/AA-1

Focus: Digital age teaching & learning
Topic: Games and simulations
Grade level: 6-8
Subject area: STEM/STEAM
ISTE Standards: Teachers: Design and develop digital age learning experiences and assessments
Teachers: Engage in professional growth and leadership

Proposal summary

Framework

The following conceptual frameworks broadly frame the study: teacher pedagogical content knowledge (Shulman, 1986; McNeill & Knight, 2013); data-driven decision making (Mandinach, 2012; Mandinach, Honey, & Light, 2006; Means, Chen, DeBarger, & Padilla, 2011); formative assessment (Heritage, Kim, Vendlinski, & Herman, 2009); game-based learning (Barab, Gresalfi, & Ingram-Goble, 2010; Shute & Ke, 2012; Shaffer, 2007; Squire, 2006); and educative curricula (University of Chicago, n.d.). More generally, the research is framed by teachers’ technological pedagogical content knowledge (TPCK), though that construct is not operationalized in our data collection instruments (described briefly below) (Voogt, Fisser, Pareja Roblin, Tondeur, & van Braak, 2013).

The five main frameworks noted above inform our hypotheses about factors that might influence teachers’ capacity to use game play data, within the limits of this study. That is, the following factors are likely to influence whether and how teachers can use gameplay to make inferences about student learning: teachers’ knowledge of a specific topic, as well as their repertoire of “instructional moves” for helping students learn that topic; teachers’ facility with, and practices for, using data to make inferences about student learning and to change instruction as needed; teachers’ understanding of how well-designed games might operationalize targeted learning objectives; and the design of educative curricula, or “curriculum materials for Grades K–12 that are intended to promote teacher learning in addition to student learning” (Davis & Krajcik, 2005). We summarize each of these briefly below.

Pedagogical content knowledge (PCK). Pedagogical content knowledge is a teacher’s knowledge of subject matter that is necessary to help students learn that knowledge (Shulman, 1986). In the area of scientific argumentation, McNeill and Knight (2013) refined the construct further, arguing that pedagogical content knowledge of argumentation “encompasses using knowledge of scientific argumentation to make sense of the structural and dialogic aspects of students’ oral and written discourse” (p. 940).

Data-driven decision making and formative assessment. While tools such as immersive games can provide teachers with large amounts of formatted data in a timely manner, these innovations are not necessarily “usable” (Fishman, Marx, Blumenfeld, Krajcik, & Soloway, 2004). Many teachers are not trained to use data systematically during their pre-service education (Mandinach, Gummer, & Muller, 2011), and their access to professional development to build data literacy skills is often limited (Mandinach, 2012). Making sense of such data requires what Mandinach (2012) called “pedagogical data literacy,” which she argued is “more than simply looking at the numbers or statistics collected through analysis. It refers to a teacher’s ability to transform the numbers and statistics into instructional strategies that meet the needs of specific students” (p. 76).

Game-based learning. Researchers have argued that well-designed digital games are examples of “cognitively oriented technology innovations” (Fishman, Marx, Blumenfeld, Krajcik, & Soloway, 2004), or tools that can promote and support the more complex cognitive activities associated with scientific inquiry (Barab, Gresalfi, & Ingram-Goble, 2010; Shute & Ke, 2012; Shaffer, 2007; Squire, 2006). Yet there is little research on how teachers can use—or learn to use—learners’ experiences with those practices in games to make decisions about instruction beyond the game.

Educative curricula. A great deal of teachers’ knowledge of how to teach is situated in their clinical practice (Putnam & Borko, 2000), and because curriculum materials are so central to teacher practice (Ball & Cohen, 1996), in principle they can be designed to support both student learning and teacher learning. Schneider and Krajcik (2002) described five design considerations to support the development of educative curricula, addressing factors such as teachers’ prior knowledge, practical constraints, and the integration of teacher learning into student lessons. Davis, Palincsar, Arias, Bismack, Marulis, and Iwashyna (2014) added a sixth consideration, noting that, “in investigating how teachers use the educative features as tools and how students respond to this instruction, the goal of our work is to use process data to guide the development and revision of educative features” (p. 48).

Methods

The sample for the fall 2016/spring 2017 pilot studies will consist of 8–10 middle-grades science teachers, each of whom will implement a 1.5-week supplemental mini-unit on argumentation in addition to their regular unit on specific science content (using their regular curriculum materials). (The implementation period will be four weeks during the fall 2017 impact study and will include a capstone debate project.) Participants will be located in the NYC metropolitan area; western Massachusetts; and the Los Angeles and San Francisco metropolitan areas. Teachers and students will all be in public schools representing a range of socioeconomic statuses, urbanicity, and geographic regions. During the mini-unit implementation, teachers and students will alternate between game play and group activities that use the claims-evidence-reasoning framework for argumentation in the classroom. Researchers will observe each pilot class twice during the pilots.

Students will play the game individually on iPads during class for 2–3 periods. The video game operationalizes targeted learning objectives related to constructing arguments, roughly based on a claims-evidence-reasoning framework. In two planning periods during the 1.5-week implementation, teachers will review gameplay data in the dashboard and make determinations about (1) where students are on a learning progression about argumentation; and (2) what changes they will make, if any, to instruction in order to help students advance.
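
The proposal does not describe the game’s telemetry schema or the logic the dashboard uses to place students on the progression, so the sketch below is purely illustrative. It assumes hypothetical event types (claim, evidence_match, reasoning_link), success thresholds, and level labels; none of these come from the actual Argubot Academy or GlassLab implementation.

```python
# Minimal, hypothetical sketch: rolling up per-student gameplay events into a
# coarse placement on a claims-evidence-reasoning (CER) learning progression.
# Event types, thresholds, and level labels are illustrative assumptions only.
from collections import defaultdict

# Each event record: (student_id, skill, success_flag)
events = [
    ("s01", "claim", True),
    ("s01", "evidence_match", True),
    ("s01", "reasoning_link", False),
    ("s02", "claim", True),
    ("s02", "evidence_match", False),
]

def progression_level(rates):
    """Map per-skill success rates to an illustrative CER progression level."""
    if rates.get("reasoning_link", 0) >= 0.7:
        return "constructs complete arguments"
    if rates.get("evidence_match", 0) >= 0.7:
        return "supports claims with evidence"
    if rates.get("claim", 0) >= 0.7:
        return "makes claims"
    return "emerging"

# student -> skill -> [successes, attempts]
tallies = defaultdict(lambda: defaultdict(lambda: [0, 0]))
for student, skill, ok in events:
    tallies[student][skill][0] += int(ok)
    tallies[student][skill][1] += 1

for student, skills in sorted(tallies.items()):
    rates = {skill: ok / total for skill, (ok, total) in skills.items()}
    print(student, "->", progression_level(rates))
```

The point of the sketch is only the general shape of the computation: tally per-skill success rates for each student, then map those rates onto an ordered set of progression levels.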

Teachers (all of whom have enough iPads in their schools for 30–35 students) will participate in a one-hour, live webinar that will familiarize them with the practices of argumentation, introduce them to the game and the data dashboard, and walk them through each session of the mini-unit. They will then begin the mini-unit within one or two days following the webinar.

Data sources and data collection methods for the two pilot studies will include:

(1) Log data on teachers' use of dashboard functionality
(2) Observations of teachers' classroom routines to look for signs of differentiation, using a modified MAP (Measures of Academic Progress) protocol
(3) Think-aloud sessions during which teachers interpret gameplay data while reviewing the dashboard, using a protocol developed by the research team
(4) Fidelity of implementation checklists, using a checklist developed by the research team
(5) Assessments of student argumentation using ETS’s CBAL instrument, which functions as a scaffolded learning task, as well as an assessment of critical thinking
(6) Pre- and post- teacher assessment of teachers’ pedagogical content knowledge, using an instrument developed by McNeill, González‐Howard, Katsh‐Singer, and Loper (2016)
(7) Pre- and post- teacher assessment of teachers’ beliefs about the value of teaching argumentation, using an instrument developed by McNeill, González‐Howard, Katsh‐Singer, and Loper (unpublished)
(8) Pre- and post-assessment of teachers’ formative assessment practices, using an instrument developed by Herman, Osmundson, Dai, Ringstaff, and Timms (2015)

We will analyze the teacher observation and think-aloud session data by coding them thematically in order to learn how successfully teachers interpreted and used the dashboard data, and which dashboard features were particularly helpful or unhelpful. We will analyze teacher responses to the fidelity checklist descriptively and compare them to the teacher observation and think-aloud session data in order to explore the extent to which teacher-level factors (i.e., their beliefs and practices around argumentation and data-driven decision making) are associated with their ability to interpret and use data from the teacher dashboard. Finally, we will conduct descriptive analyses of all of the measures listed above to gather preliminary evidence of promise from this pilot study.
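
As a concrete illustration of the descriptive analysis of dashboard log data (data source 1), the following minimal Python sketch summarizes how often, and for how long, each teacher viewed the dashboard. The log format, field names, and example values are assumptions for illustration; the project’s actual logging schema is not specified in this proposal.

```python
# Minimal, hypothetical sketch of descriptive analysis of dashboard log data:
# summarize how often, and for how long, each teacher viewed the dashboard.
# The CSV schema below is an assumption, not the project's actual log format.
import csv
import io
import statistics

# Hypothetical log export: one row per dashboard view.
raw = """teacher_id,view_id,seconds_on_view
t01,v1,120
t01,v2,45
t02,v1,300
t02,v3,60
t03,v2,15
"""

views_per_teacher = {}
seconds_per_teacher = {}
for row in csv.DictReader(io.StringIO(raw)):
    t = row["teacher_id"]
    views_per_teacher[t] = views_per_teacher.get(t, 0) + 1
    seconds_per_teacher[t] = seconds_per_teacher.get(t, 0) + int(row["seconds_on_view"])

for t in sorted(views_per_teacher):
    print(f"{t}: {views_per_teacher[t]} views, {seconds_per_teacher[t]} s total")

print("median views per teacher:", statistics.median(views_per_teacher.values()))
```

Per-teacher summaries like these could then be set alongside the observation and think-aloud codes when exploring how teacher-level factors relate to dashboard use.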

Results

As noted, the pilot study will provide us with evidence for the usefulness of the data collection instruments that we have identified. More significantly, however, we expect the findings from these pilot studies—particularly from data gathered through the classroom observation and think-aloud protocols—to give us insight into how teachers make sense of gameplay data. Results from the measures related to pedagogical content knowledge, formative assessment, and teacher beliefs about argumentation might allow us to draw tentative conclusions about associations between those independent factors and the quality of teachers' formative assessment practices. These preliminary findings will contribute new knowledge to the field of educational technology research by documenting teachers' sense-making processes when connecting gameplay data to targeted learning objectives.

Finally, the pilot findings will also enable the research and development teams to continue to refine the data dashboard so that it is useful and educative for teachers with a range of experiences and competencies. Following those modifications, the team will be prepared to use the dashboard as part of the fall 2017 impact study.

Importance

This project responds to the National Science Foundation’s challenge to develop and research tools that provide teachers with dynamic diagnostic information about student learning in formats, and with supports, that ensure assessment data are not only valid but also relevant, valuable, and actionable for teachers. The project and its findings will contribute to that broader discussion, both in the research literature and in practitioner communities, offering a model for translating rigorous but complex data on student performance into meaningful, actionable evidence that can inform and inspire teachers in their day-to-day work with their students.

The research findings from the final impact study will shed light on whether a treatment group of science teachers who use the game and the revised dashboard with educative features as part of a 4-week supplemental unit on argumentation are more likely than a comparison group of peers without access to the educative features to use the data effectively for formative assessment. More broadly, the study will contribute new knowledge about how to prepare teachers to use well-designed educational games as new modes of performance-based assessment. The project will also generate broadly available digital tools for practitioners, as well as new knowledge about supporting game-based formative assessment that will be of interest to multiple research communities.

References

Ball, D. L., & Cohen, D. K. (1996). Reform by the book: What is – or might be – the role of curriculum materials in teacher learning and instructional reform? Educational Researcher, 25(9), 6–14.

Davis, E. A., & Krajcik, J. S. (2005). Designing educative curriculum materials to promote teacher learning. Educational Researcher, 34(3), 3–14.

Davis, E., Palincsar, A. S., Arias, A. M., Bismack, A. S., Marulis, L., & Iwashyna, S. (2014). Designing educative curriculum materials: A theoretically and empirically driven process. Harvard Educational Review, 84(1), 24–52.

Fishman, B., Marx, R. W., Blumenfeld, P., Krajcik, J., & Soloway, E. (2004). Creating a framework for research on systemic technology innovations. Journal of the Learning Sciences, 13(1), 43–76.

Heritage, M., Kim, J., Vendlinski, T., & Herman, J. (2009). From evidence to action: A seamless process in formative assessment? Educational Measurement: Issues and Practice, 28(3), 24–31.

Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision making to inform practice. Educational Psychologist, 47(2), 71–85.

Mandinach, E. B., Gummer, E. S., & Muller, R. D. (2011, May). The complexities of integrating data-driven decision making into professional preparation in schools of education: It’s harder than you think. Report from an invitational meeting. Alexandria, VA: CNA Analysis & Solutions.

Mandinach, E. B., Honey, M., & Light, D. (2006, April). A theoretical framework for data-driven decision making. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.

McNeill, K. L., González‐Howard, M., Katsh‐Singer, R., & Loper, S. (2016). Pedagogical content knowledge of argumentation: Using classroom contexts to assess high‐quality PCK rather than pseudoargumentation. Journal of Research in Science Teaching, 53(2), 261–290.

McNeill, K. L., & Knight, A. M. (2013). Teachers’ pedagogical content knowledge of scientific argumentation: The impact of professional development on K–12 teachers. Science Education, 97(6), 936–972.

Means, B., Chen, E., DeBarger, A., & Padilla, C. (2011). Teachers’ ability to use data to inform instruction: Challenges and supports. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation and Policy Development.

Putnam, R. T., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4–15.

Schneider, R. M., & Krajcik, J. (2002). Supporting science teacher learning: The role of educative curriculum materials. Journal of Science Teacher Education, 13(3), 221–245.

Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4–14.

University of Chicago. (n.d.).

Voogt, J., Fisser, P., Pareja Roblin, N., Tondeur, J., & van Braak, J. (2013). Technological pedagogical content knowledge – a review of the literature. Journal of Computer Assisted Learning, 29(2), 109–121.

Presenters

Dr. James Diamond, EDC

Heather Kim, EDC
