ISTELive 21

Crystal Code: Examining the Impact of Computational Modeling on Scientific Systems Thinking

Participate and share: Poster presentation


Wednesday, December 2, 4:30–5:30 pm PST (Pacific Standard Time)

Ha Nguyen  
Dorthy Schmidt  
Rossella Santagata  

Learn about a unit on decomposition that integrates computational modeling and science for sixth-graders. The treatment group (paper brainstorm and computer modeling) showed a significant relative increase in citing evidence and causal links within the system, compared to the group that used paper modeling.

Audience: Curriculum/district specialists, Teachers, Technology coordinators/facilitators
Skill level: Beginner
Attendee devices: Devices not needed
Topic: Innovative learning environments
Grade level: 6-8
Subject area: STEM/STEAM, Science
ISTE Standards: For Educators:
Designer
  • Explore and apply instructional design principles to create innovative digital learning environments that engage and support learning.
Facilitator
  • Model and nurture creativity and creative expression to communicate ideas, knowledge or connections.
For Students:
Computational Thinker
  • Students break problems into component parts, extract key information, and develop descriptive models to understand complex systems or facilitate problem-solving.
Additional detail: Graduate student

Proposal summary

Framework

Modeling & Systems Thinking
The goal of science education is to help students develop integrated understanding and connect local observations to core ideas and concepts across disciplines (Hmelo-Silver & Pfeffer, 2004; Novak & Krajcik, 2019). Constructing models in science education is linked to the development of representational skills and higher-order thinking (Wilensky & Papert, 2010). Modeling presents an opportunity for learners to articulate and simulate the multiple interrelations among elements within systems (Jacobson & Wilensky, 2006), which are often abstract and challenging to comprehend (Wilensky & Resnick, 1999).
Practices with modeling include using models to test hypotheses and solutions (Sengupta et al., 2013; Xiang & Passmore, 2015), assessing models’ predictions against real-world data or expert simulations (Hasan & Biswas, 2017; Sengupta et al., 2013), and designing models by creating conceptual links (Louca & Zacharia, 2008; Weintrop et al., 2016). A focus of these environments is student-driven scientific practices (Wagh, Cook-Whitt, & Wilensky, 2017). Learning gains are found to be greater when students engage in self-driven exploration and think-aloud protocols, as opposed to watching the simulations passively (Wieman, Adams, & Perkins, 2008). Researchers have explored several systems such as Netlogo (Wilensky & Resnick, 1999) and Sagemodeler (Damelin et al., 2017), and found that computer modeling practices enrich students’ conceptual understanding through iterative construction of student-led, evidence-based explanations of phenomena (Novak & Krajcik, 2019; Rosenberg & Lawson, 2019).
Co-design Dynamics with Teachers
Technology-mediated innovations that solely focus on implementation fidelity and overlook contextual factors or teachers’ pedagogical approaches are not sustainable (Fishman, Penuel, & Yamaguchi, 2006; Means et al., 2001). Co-design—engaging multiple stakeholders in curricular design—shifts the focus from implementation to integrity (Penuel, Roschelle, & Shechtman, 2007).
This study examines the development of the modeling unit from a co-design perspective (Penuel et al., 2007). Co-design approaches attend to the contexts of teachers’ work to reflect their values and problems of practice (Friedman, 1996; Penuel et al., 2007). The process focuses on practice and collaboration, employing participatory design approaches (Shrader, Williams, Lachance-Whitcomb, Finn, & Gomez, 2001). A focus on problems of practice centers the design around the tool’s challenges and affordances within contexts (Cober, Tan, Slotta, So, & Könings, 2015).

Methods

Study Context
This study presents a research-practice partnership among a school district, education researchers, environmental biology scientists, and state park educators in California to develop a middle school CS curriculum over one year. The design question was how to combine data practices with computer modeling to study plant restoration at the state park. Students experimented with how different mulch conditions (no mulch, ‘woody’ mulch, and ‘straw-like’ mulch) affected decomposition and soil moisture. The goal was to support the development of systems thinking—the ability to make predictions and test models of complex natural systems. The project promotes community science (McKinley et al., 2017), which engages participants in distributed knowledge and scientific practices while contributing to conservation efforts.
Procedure
Scientists, park educators, and researchers collaboratively outlined the curriculum. At the first co-design meeting, the teachers reflected on their students’ and their own limited exposure to technologies. The group consequently revised the curriculum to include creating paper diagrams before the computer models as a scaffolding activity. Following the co-design phase, teachers received professional development before field testing to familiarize themselves with the modeling activities. Park educators and researchers were present in the classrooms to assist teachers and iteratively design the lessons during implementation.
Participants
Participants were 119 sixth-graders (91% Hispanic, 92% receiving free and reduced-price lunch, 43% female) in eight classes taught by two teachers, Annie and Peter (pseudonyms). Annie had switched from a business career to teaching and had been a teacher for 19 years. Peter had been teaching for three years, his first job. The teachers taught a total of eight periods (Annie three, Peter five), each with a mix of students at different English and science proficiency levels.
The classes were randomly divided into treatment (n = 60) and control groups (n = 59). Students in the treatment groups brainstormed the system elements on paper and modeled the dynamics on computer on SageModeler (Damelin et al., 2017). The interface builds in features for students to create components, simulate the relations among them, and construct different model types (e.g., static diagram, dynamic time-series). Students in the control group brainstormed and sketched the systems on paper.
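The component-and-link structure described above can be sketched in plain Python. This is an illustrative sketch of a generic dynamic systems model, not SageModeler's actual implementation; the class design and the decomposition values are assumptions for illustration.

```python
class SystemModel:
    """Minimal dynamic systems model: named components linked by signed relations."""

    def __init__(self):
        self.levels = {}   # component name -> current value
        self.links = []    # (source, target, weight) triples

    def add_component(self, name, value=0.0):
        self.levels[name] = value

    def add_link(self, source, target, weight):
        # weight > 0: source increases target; weight < 0: source decreases it
        self.links.append((source, target, weight))

    def step(self, dt=1.0):
        # Simple Euler update: each target changes in proportion to its
        # sources, producing a dynamic time-series when stepped repeatedly.
        deltas = {name: 0.0 for name in self.levels}
        for src, tgt, w in self.links:
            deltas[tgt] += w * self.levels[src] * dt
        for name in self.levels:
            self.levels[name] += deltas[name]


# Hypothetical decomposition system loosely following the unit's content
m = SystemModel()
m.add_component("mulch", 10.0)
m.add_component("soil_moisture", 5.0)
m.add_component("microorganisms", 2.0)
m.add_component("nutrients", 0.0)
m.add_link("mulch", "soil_moisture", 0.1)        # mulch retains moisture
m.add_link("soil_moisture", "microorganisms", 0.05)
m.add_link("microorganisms", "nutrients", 0.2)   # decomposition releases nutrients
for _ in range(10):
    m.step()
```

Stepping the model forward makes the interrelations salient: increasing mulch propagates through moisture and microorganisms to nutrients, the kind of chain students were asked to reason about.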
Analytic Strategies
This study drew from pre and posttest data on students’ systems thinking (n = 119), along with classroom observation and field notes from the modeling lessons across the eight classes.
RQ1: Episodes of co-design were coded for activities based on Cober et al. (2015) and Kuhn and Muller (1993). Episodes were identified as units of talk among multiple parties (e.g., teacher-researcher-educator) about an issue in practice, with reasoning, explanations, and justifications. An episode could receive multiple codes. The codes were (1) theorizing about how learning components are connected, (2) contextual inquiry into activities’ affordances, (3) collaborative design of learning activities, and (4) collaborative reflection on learning activities.
RQ2: Pre- and posttests were coded for System Components, Evidence, and Causal Coherence based on prior frameworks (Kang, Thompson, & Windschitl, 2014). Components encompass the biotic elements, abiotic elements, and processes in the system. Evidence is the extent to which students use evidence from data (graphs provided by the scientists or data they collected) to support their answers. Causal coherence is the extent to which students articulate complex cause-effect and systems links. A claim that receives a low score for causal coherence is “Microorganisms eat nutrients.” A claim that receives a high score is “Since leaf mulch is sparser and holds soil moisture more, it would have microorganisms to break down nutrients. The decomposition will increase the nutrients would go to soil helping plants grow.” Two coders coded 15% of the data and reached substantial agreement on all three dimensions (Cohen’s κ = .73, .92, and .88, respectively); the first author coded the rest of the data.
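The agreement statistic reported above can be computed directly. The sketch below implements Cohen's kappa for two raters from scratch; the score lists in the usage example are hypothetical, not the study's data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    who each assigned a category to the same n items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of items where the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's marginal category counts
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical ratings on a 0/1 dimension for eight student responses
a = [0, 1, 0, 1, 1, 0, 1, 1]
b = [0, 1, 1, 1, 1, 0, 0, 1]
k = cohens_kappa(a, b)
```

Values above roughly .6 are conventionally read as substantial agreement, which is the threshold the reported .73 to .92 range clears.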
Scores were standardized and entered into a linear mixed model predicting post-test scores for students nested within the eight classrooms. A student’s post-test score was predicted by class-level treatment condition (computer or paper modeling), controlling for pretest score, gender, English learner status, and classroom effects. We used Kenward-Roger adjustments to account for possible biases resulting from the small number of clusters.
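The standardization step can be sketched as below. The model formula in the comment is an assumption about how such an analysis is typically specified (e.g., in R's lme4 with pbkrtest supplying Kenward-Roger degrees of freedom), not the authors' actual code.

```python
import statistics

def standardize(scores):
    """Z-standardize a list of scores: subtract the mean, divide by the sample SD."""
    mu = statistics.mean(scores)
    sd = statistics.stdev(scores)
    return [(s - mu) / sd for s in scores]

# Hypothetical post-test scores for a handful of students
post = [12, 15, 9, 18, 14]
post_z = standardize(post)

# A random-intercept mixed model of the kind described would typically be
# written (in R's lme4 syntax) as:
#   lmer(post_z ~ treatment + pre_z + gender + ell + (1 | classroom))
# with Kenward-Roger degrees-of-freedom adjustments via the pbkrtest package
# to guard against bias from the small number of classroom clusters.
```

Standardizing both pre- and post-test scores puts the reported coefficients (b = .47, .54) on a common scale, so they can be read roughly as standard-deviation differences.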

Results

RQ1. The co-design episodes reveal that iterative co-design among multiple stakeholders supported implementation for teachers unfamiliar with the modeling tool. All participants engaged in reflection and co-planning as the lesson unfolded in each period, rather than retrospectively at the end of the project. Compared to the researchers and park educators, teachers more frequently initiated inquiries into whether activities were relevant to the learning contexts, as well as co-planning of instructional activities. Collaborative reflection was more balanced across parties, though teachers always initiated this discourse type.
Collaborative reflection and co-design went hand-in-hand. In three episodes, participation in co-planning and reflection helped teachers build observations of how students’ systems thinking could be better connected beyond the curriculum. For example, when teacher Peter first taught the computer modeling lesson, he noticed that the model he had created was incoherent, as not every component in the system was linked. He quickly approached the park educator to test different models within the lesson. The teacher initiated using the new model and went on to explore other features of the tool, such as simulation. He modeled the use of these additional features in the following lessons to help students build a deeper understanding of the system mechanisms.
RQ2. Overall, students in the treatment group (i.e., paper brainstorm + computer modeling) showed a significant relative increase in citing Evidence and in Causal Coherence compared to the control group (i.e., paper brainstorm + paper poster), controlling for pretest scores, gender, and English language learner (ELL) status (b = .47, t = 2.00, p < .05; b = .54, t = 2.02, p < .05). Gender and ELL status were not significant predictors. Additional regression analyses showed no significant difference in pre- to post-test changes at the teacher level.
The effects observed from the computer modeling, relative to the paper modeling, may be due to two reasons. First, the tool’s interface explicitly guides students to search for components and specify the relations among them. Second, the tool gives students the opportunity to iteratively test their hypotheses about how the system functions and change their models based on results from the simulations. Both affordances make the interrelations among system components more salient and help students develop their conceptual understanding towards an ‘expert’ stance—noticing patterns, functions, and mechanisms within systems, as opposed to novices who only notice components (Hmelo-Silver & Pfeffer, 2004).
Additionally, classroom observations suggested that linguistic and scientific scaffolding likely mediated the tool’s affordances. Both teachers modeled to students how to create components and links in the modeling interfaces before letting students build their own models. This helped students overcome the challenges in tool use and focus on the iterative model building.

Importance

The co-design trajectories in this study contribute to our understanding of emergent curriculum development. The co-design process was primarily emergent—responding dynamically to situational factors as they occurred. Co-design allowed the participants to make immediate adjustments to correct misunderstandings. The iterative co-planning that occurred within lessons is unique, as prior studies have mostly documented co-planning before or after longer stretches of implementation (Coburn & Stein, 2010).
This study provides promising evidence that the use of computer modeling in science classrooms can enrich students’ systems thinking, particularly their ability to build coherent, complex causal claims and to cite evidence supporting those claims. The tool appeared to benefit learners across gender and English proficiency levels. This study adds to the emerging base of experimental studies that explore the benefits of computer modeling while attending to varied pedagogical approaches across classrooms. Future research could explore teachers’ pedagogical strategies in relation to the tool’s implementation in a variety of contexts.
Results from our study have practical implications for how curriculum designers, teachers, researchers, and community partners could engage in co-design. Findings echo the literature on principles for collaborative partnership and technology integration: trust and knowledge of partners’ expertise, capacity building, and shared space for collaborative problematizing (Engle, 2010).

References

Cober, R., Tan, E., Slotta, J., So, H. J., & Könings, K. D. (2015). Teachers as participatory designers: Two case studies with technology-enhanced learning environments. Instructional Science, 43(2), 203-228.
Coburn, C. E., & Stein, M. K. (2010). Research and practice in education: Building alliances, bridging the divide. Rowman & Littlefield Publishers.
Computer Science Teacher Association (2017). CSTA K-12 Computer Science Standards. https://c.ymcdn.com/sites/www.csteachers.org/resource/resmgr/Docs/Standards/CSTA_K-12_CSS.pdf
Damelin, D., Krajcik, J. S., Mcintyre, C., & Bielik, T. (2017). Students making systems models. Science Scope, 40(5), 78.
Dickes, A. C., Sengupta, P., Farris, A. V., & Basu, S. (2016). Development of mechanistic reasoning and multilevel explanations of ecology in third grade using agent‐based models. Science Education, 100(4), 734-776.
Engle, R. A. (2010). The middle-school mathematics through applications project In Coburn, C. E., & Stein, M. K. (Ed.). Research and practice in education: Building alliances, bridging the divide (pp. 19-36). Rowman & Littlefield Publishers.
Fishman, B. J., Penuel, W. R., Allen, A. R., Cheng, B. H., & Sabelli, N. O. R. A. (2013). Design-based implementation research: An emerging model for transforming the relationship of research and practice. National society for the study of education, 112(2), 136-156.
Fishman, B. J., Penuel, W. R., & Yamaguchi, R. (2006). Fostering innovation implementation: Findings about supporting scale from GLOBE. In S. A. Barab, K. E. Hay & D. T. Hickey (Eds.), Proceedings of the 7th International Conference of the Learning Sciences (Vol. 1, pp. 168–174). Mahwah, NJ: Erlbaum.
Fretz, E. B., Wu, H. K., Zhang, B., Davis, E. A., Krajcik, J. S., & Soloway, E. (2002). An investigation of software scaffolds supporting modeling practices. Research in Science Education, 32(4), 567-589.
Friedman, B. (Ed.). (1996). Human values and the design of computer technology. New York: Cambridge University Press.
Hasan, A., & Biswas, G. (2017). Domain specific modeling language design to support synergistic learning of STEM and computational thinking. In S. C. Kong et al. (Eds.), Proceedings of the International Conference on Computational Thinking Education (p. 28). Hong Kong: The Education University of Hong Kong.
Hmelo‐Silver, C. E., & Pfeffer, M. G. (2004). Comparing expert and novice understanding of a complex system from the perspective of structures, behaviors, and functions. Cognitive science, 28(1), 127-138.
Jacobson, M.J., & Wilensky, U. (2006). Complex systems in education: Scientific and educational importance and implications for the learning sciences. The Journal of the Learning Sciences, 15(1), 11-34. https://doi.org/10.1207/s15327809jls1501_4
Kang, H., Thompson, J., & Windschitl, M. (2014). Creating opportunities for students to show what they know: The role of scaffolding in assessment tasks. Science Education, 98(4), 674-704.
Kuhn, S., & Muller, M. J. (1993). Participatory design. Communications of the ACM, 36(6), 24-29.
Louca, L. T., & Zacharia, Z. C. (2008). The use of computer-based programming environments as computer modelling tools in early science education: The cases of textual and graphical program languages. International Journal of Science Education, 30(3), 287-323.
McKinley, D. C., Miller-Rushing, A. J., Ballard, H. L., Bonney, R., Brown, H., Cook-Patton, S. C., ... & Ryan, S. F. (2017). Citizen science can improve conservation science, natural resource management, and environmental protection. Biological Conservation, 208, 15-28.
Means, B., Penuel, W. R., Crawford, V. M., Korbak, C., Lewis, A., Murphy, R. F. et al. (2001). GLOBE Year 6 evaluation: Explaining variation in implementation. Menlo Park, CA: SRI International.
NGSS Lead States (2013). Next generation science standards: For states, by states. The National Academies Press Washington, DC.
Novak, A. M., & Krajcik, J. S. (2019). A Case Study of Project‐Based Learning of Middle School Students Exploring Water Quality. The Wiley Handbook of Problem‐Based Learning, 551-572.
Penuel, W. R., Roschelle, J., & Shechtman, N. (2007). Designing formative assessment software with teachers: An analysis of the co-design process. Research and practice in technology enhanced learning, 2(01), 51-74.
Rosenberg, J. M., & Lawson, M. A. (2019). An Investigation of Students’ Use of a Computational Science Simulation in an Online High School Physics Class. Education Sciences, 9(1), 49.
Sengupta, P., Kinnebrew, J. S., Basu, S., Biswas, G., & Clark, D. (2013). Integrating computational thinking with K-12 science education using agent-based computation: A theoretical framework. Education and Information Technologies, 18(2), 351–380.
Shrader, G., Williams, K., Lachance-Whitcomb, J., Finn, L. E., & Gomez, L. (2001, April). Participatory design of science curricula: The case for research for practice. In Annual Meeting of the American Educational Research Association, Seattle, WA.
Wagh, A., Cook-Whitt, K., & Wilensky, U. (2017). Bridging inquiry-based science and constructionism: Exploring the alignment between students tinkering with code of computational models and goals of inquiry. Journal of Research in Science Teaching, 54(5), 615–641. https://doi.org/10.1002/tea.21379
Weintrop, D., Beheshti, E., Horn, M., Orton, K., Jona, K., Trouille, L., & Wilensky, U. (2016). Defining Computational Thinking for Mathematics and Science Classrooms. Journal of Science Education and Technology, 25(1), 127–147.
Wieman, C. E., Adams, W. K., & Perkins, K. K. (2008). PhET: Simulations that enhance learning. Science, 322(5902), 682–683.
Wilensky, U., & Papert, S. (2010). Restructurations: Reformulations of knowledge disciplines through new representational forms. Constructionism.
Wilensky, U., & Resnick, M. (1999). Thinking in levels: A dynamic systems approach to making sense of the world. Journal of Science Education and Technology, 8(1), 3–19.
Xiang, L., & Passmore, C. (2015). A framework for model-based inquiry through agent-based programming. Journal of Science Education and Technology, 24(2-3), 311-329.


Presenters

Ha Nguyen, University of California-Irvine
Graduate student

Dorthy Schmidt, University of California-Irvine
Undergraduate student
