Integrating Science, Computational Thinking, and Data Science for Middle School Science Classrooms

Listen and learn: Research paper
Lecture presentation


Friday, December 4, 12:45–1:30 pm PST (Pacific Standard Time)
Presentation 3 of 3
Other presentations:
Motivating STEM+C Learning With Social Impact of Cybersecurity and Digital Forensics
Promoting Math Teachers’ Confidence and Self-Perceptions of Efficacy with Educational Technology

Dr. Eric Greenwald  
Dr. Ari Krakowski  

We present research findings on an innovative middle school science learning experience designed to confront barriers to broader participation in computer science and position coding as a tool for doing science. We discuss instructional strategies contributing to observed learning gains and explore challenges with integrating computer science in science classrooms.

Audience: Chief technology officers/superintendents/school board members, Curriculum/district specialists, Teachers
Attendee devices: Devices useful
Attendee device specification: Smartphone: Windows, Android, iOS
Laptop: Chromebook, Mac, PC
Tablet: Android, iOS, Windows
Topic: Computer science & computational thinking
Grade level: 6-8
Subject area: Science
ISTE Standards: For Students:
Computational Thinker
  • Students break problems into component parts, extract key information, and develop descriptive models to understand complex systems or facilitate problem-solving.
  • Students formulate problem definitions suited for technology-assisted methods such as data analysis, abstract models and algorithmic thinking in exploring and finding solutions.
  • Students collect data or identify relevant data sets, use digital tools to analyze them, and represent data in various ways to facilitate problem-solving and decision-making.
Additional detail: Session recorded for video-on-demand
Disclosure: The submitter of this session has been supported by a company whose product is included in the session.

Proposal summary

Framework

Our pedagogical framework grounds Mitch Resnick's coding-to-learn approach to CS education (Resnick, 2013) in situative learning theories, in particular the construct of legitimate peripheral participation (LPP; Lave & Wenger, 1991). Coding to learn contrasts with more traditional CS instruction, in which programming is presented as a discrete skill to be learned for its own sake, akin to typing; instead, it repositions the learning goals so that programming and computational thinking are practices and cognitive tools to be learned on the way to achieving other meaningful goals. Because learning is contextually embedded in authentic tasks, coding to learn is a natural fit for situative theories such as LPP, which posit that knowledge is constructed through activity and in relation to others; learning is thus "situated" in the activity, context, and community in which it occurs (see also Greeno, 1998).

LPP's foundational construct of a community of practice (Lave & Wenger, 1991; Wenger, 1998), in which individuals share a repertoire of knowledge and practices to address a shared set of problems, informs the intervention's commitment to authentic and collaborative problem solving. Further, through the internship model, in conjunction with task-embedded supports and a gradual release of responsibility, students progress along a trajectory from peripheral to more central participation in the practice of coding scientific simulations. Initially, students use their scientific content knowledge to understand a simulated phenomenon and its underlying code. Eventually, they work toward more central participation, using newfound programming practices to improve and further develop the simulation as an investigative model of a scientific phenomenon.

Finally, the LPP framework aligns tightly with the intervention's more distal goal of broadening participation in STEM, offering a clear vehicle for students to explore and come to identify with CS as a "possible self" (Markus & Nurius, 1986). Markus's theory also dovetails with the widely reported finding that students' early perceptions of STEM subjects are a determinant of whether they enter STEM professions (Tai, Liu, Maltese, & Fan, 2006). Our intervention therefore aims to support students' identification with CS occupations as they do legitimate work within the CS community: it is expressly designed to encourage a broad range of students to see themselves, at least potentially, as both scientists and programmers, and to envision CS as an appealing world to inhabit.

Methods

The mixed-methods research agenda for this project is guided by four questions:
1) What specific design features and instructional strategies of the CS Internships are most important for broadening student participation in CS?
2) What aspects of the CS Internships are most important for supporting sustainability of CS and science integration?
3) What factors, design features, and practices are most important for supporting productive student engagement in, and teacher facilitation of, collaboration and discourse (both in-person and digital) in STEM?
4) What aspects of student understanding may be revealed when students are able to manipulate the code behind scientific models?

Our research includes both qualitative and quantitative methods.
Qualitative data include classroom observations collected during piloting, as well as interviews and focus groups conducted during both piloting and broad implementation of Internship 1 and Internship 2. Classroom observation field notes and focus group protocols include questions about the relationship between specific factors and practices and student outcomes. We identify some factors and practices a priori, based on the theories informing our design. We also look for variation in the data (Bryk, 2014), both within and across sites, to identify additional candidate factors. For example, we compare lessons with particularly high student engagement to lessons with lower engagement, and we examine variation in engagement by gender within individual lessons to discern factors that could explain those differences. Finally, we look for variation across sites to see whether differences in implementation point to specific factors or practices as important for positive outcomes.
     
Quantitative methods: To gather evidence for how student understanding develops at the intersection of CS and science, participating students (an average of 100 per teacher, for a total n = 4,000) complete pre/post surveys during research trials to establish a baseline and document changes over time. Specifically, students complete pre/post assessments that track CS and CT concepts and practices. The measure, the Assessment of Computational Thinking (Witherspoon, Higashi, Schunn, Baehr, & Shoop, 2017; Witherspoon, Schunn, Higashi, & Shoop, 2018), includes items from the Principled Assessment of Computational Thinking project (Snow, Rutstein, Bienkowski, & Xu, 2017). Students also complete a survey with scales measuring disposition toward CS. The Competency Belief Survey for Students, a subset of the Science Learning Activation Assessment, has 8 items with good reliability (α = 0.87), acceptable model fit (RMSEA = 0.073, CFI = 0.976, TLI = 0.967), and measurement invariance. The Values Science Survey, also a subset of the Science Learning Activation Assessment, has 8 items with good reliability (α = 0.87), acceptable model fit (RMSEA = 0.127, CFI = 0.930, TLI = 0.901), and measurement invariance; this scale has been revised to capture perceived value of CS. In conjunction with these measures, students also complete a brief (5-minute) daily engagement survey (the Engagement in Science Learning Activities Survey), high scores on which have been shown to predict content learning gains (Chung, Cannady, Schunn, Dorph, & Bathgate, 2016).
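
As an illustration of the reliability statistic reported above (and not the project's actual instrumentation code), the following minimal Python sketch computes Cronbach's alpha for a hypothetical 8-item scale from simulated responses:

    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)      # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Simulated responses to a hypothetical 8-item disposition scale
    rng = np.random.default_rng(0)
    trait = rng.normal(size=(200, 1))                         # shared latent trait
    responses = trait + rng.normal(scale=0.7, size=(200, 8))  # item-level noise
    print(f"alpha = {cronbach_alpha(responses):.2f}")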
     
Learning Analytics: Because much of the student work in the units involves capturable interactions in digital environments, there is tremendous opportunity to learn from the massive data set these interactions generate. Data currently being analyzed include event logs (e.g., "clickstream" data on student problem-solving processes and student work products), text generated in the digital discussion forums, and backend metadata. These data enable a variety of analyses that will provide evidence for promising instructional features, models of student understanding and learning trajectories at the intersection of CS and science, and highly nuanced information about equitable participation and task engagement. This attention to both product and process data is critical for authentically assessing practices (both CS practices and science and engineering practices, or SEPs). Our analyses will include ground-up machine learning approaches, such as latent Dirichlet allocation (LDA), as well as a rule-based Dynamic Belief Network (DBN; Nicholson & Brady, 1994). Integrating features of two commonly used analytics approaches, Bayesian Networks and Bayesian Knowledge Tracing, a DBN formalizes connections among concepts in each unit and integrates information from various points in the learning process. This approach builds a networked model of assessed constructs, which can help reveal students' abilities to connect ideas, using data from thousands of students to identify relationships among concepts that may not be immediately obvious and providing a rich picture of how student understanding develops.
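
To make the ground-up machine learning strand concrete, the sketch below fits a small LDA topic model to a few invented discussion-forum posts using scikit-learn; the posts, the two-topic setting, and all parameters are illustrative assumptions, not the project's analysis pipeline:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Hypothetical stand-ins for student discussion-forum posts
    posts = [
        "I changed the loop variable so the simulation runs faster",
        "the predator population crashes when prey energy is low",
        "my conditional never fires because the variable is misspelled",
        "we graphed population data to compare runs of the model",
    ]

    vectorizer = CountVectorizer(stop_words="english")
    dtm = vectorizer.fit_transform(posts)  # document-term matrix
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)

    # Print the top terms that characterize each latent topic
    terms = vectorizer.get_feature_names_out()
    for i, weights in enumerate(lda.components_):
        top = [terms[j] for j in weights.argsort()[::-1][:4]]
        print(f"topic {i}: {', '.join(top)}")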

Results

Some analyses are complete, and others are ongoing. We will report findings from a completed research trial of the first unit, conducted in spring 2019, as well as emerging findings from an upcoming research trial of the second unit. Data collection for the second research trial will be complete by spring 2020.

Completed Analyses:
Using an external measure of computational thinking, we found a statistically significant overall learning gain from pre to post. For n = 391 students, the mean gain was 0.353 out of 10 possible points, with an effect size of 0.20 and p < .001. Learning gains were particularly pronounced for students in the lowest quartile of prior performance (effect size = 1.09) and for female students overall (effect size = 0.26). The measure (see Snow, Rutstein, Bienkowski, & Xu, 2017 for background on items) has demonstrated strong evidence of validity for use in middle school settings (see Witherspoon, Higashi, Schunn, Baehr, & Shoop, 2017 for the technical report on measurement validity evidence).
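
For readers who want to see the shape of this analysis, the sketch below computes a paired pre/post gain, an effect size, and a paired t-test on simulated scores; the simulated distributions loosely echo the reported statistics but are not the study data:

    import numpy as np
    from scipy import stats

    # Simulated pre/post CT scores on a 0-10 scale (not the study data)
    rng = np.random.default_rng(1)
    pre = np.clip(rng.normal(5.4, 1.9, size=391), 0, 10)
    post = np.clip(pre + rng.normal(0.35, 1.7, size=391), 0, 10)

    gain = post - pre
    t_stat, p_value = stats.ttest_rel(post, pre)  # paired t-test
    d = gain.mean() / gain.std(ddof=1)            # effect size for paired gains
    print(f"mean gain = {gain.mean():.3f}, d = {d:.2f}, p = {p_value:.4f}")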

We also conducted subgroup analyses to test our hypothesis that the learning experiences can contribute to improved outcomes for female students. Results suggest the units hold promise for broadening participation in STEM by narrowing a measured performance gap in CT over the course of instruction. Specifically, prior to instruction there was a statistically significant difference between gender groups as determined by one-way ANOVA (F(3, 387) = 4.547, p = .004). A Tukey post hoc test revealed that performance on the CT measure prior to instruction was statistically significantly lower for female students (5.07 ± 1.75 out of a possible 10 points, p = .015) than for male students (5.65 ± 1.95). At the conclusion of instruction, however, the difference in performance by gender was much smaller and no longer statistically significant, with female post-instruction scores averaging 5.52 ± 1.84 compared to male scores averaging 5.86 ± 1.93.
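
A minimal sketch of this style of subgroup comparison (one-way ANOVA followed by a Tukey post hoc test), run on simulated scores; for brevity it uses two groups with assumed sizes, whereas the reported F statistic reflects the study's actual grouping:

    import numpy as np
    from scipy import stats
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # Simulated pre-instruction CT scores; group sizes are assumptions
    rng = np.random.default_rng(2)
    female = rng.normal(5.07, 1.75, size=190)
    male = rng.normal(5.65, 1.95, size=201)

    f_stat, p_value = stats.f_oneway(female, male)  # one-way ANOVA
    print(f"F = {f_stat:.3f}, p = {p_value:.4f}")

    scores = np.concatenate([female, male])
    labels = ["female"] * len(female) + ["male"] * len(male)
    print(pairwise_tukeyhsd(scores, labels))        # pairwise post hoc comparison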

We also expect to report on analogous findings from our second Coding Science Internship unit, data collection for which will be complete in the spring of 2020.

Importance

The project continues to provide valuable insights into how computational tools can be leveraged to support deeper science learning and to provide student experiences that mirror the computational work of modern science. It extends prior work on incorporating coding and computational thinking into schools in three ways: an explicit focus on collaborative discourse and collaborative problem solving; just-in-time teacher learning via an educative curriculum to support system capacity and broader impact; and a research and development model that explicitly incorporates school-, district-, and state-policy-level stakeholders in the design process in order to build an understanding of how the intervention, and others like it, can be successfully and sustainably implemented. The project also advances research on how students' computer science knowledge and practices develop within the context of science learning experiences and how such experiences improve student dispositions toward STEM and CS-related occupations.

References

Anderson, N., Lankshear, C., Timms, C., & Courtney, L. (2008). ‘Because it’s boring, irrelevant and I don’t like computers’: Why high school girls avoid professionally-oriented ICT subjects. Computers & Education, 50(4), 1304-1318.
Applebee, A. N., Langer, J. A., Nystrand, M., & Gamoran, A. (2003). Discussion-based approaches to developing understanding: Classroom instruction and student performance in middle and high school English. American Educational Research Journal, 40(3), 685-730.
Bandura, A. (1986). The explanatory and predictive scope of self-efficacy theory. Journal of social and clinical psychology, 4(3), 359-373.
Barab, S. A., & Landa, A. (1997). Designing effective interdisciplinary anchors. Educational leadership, 54(6).
Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. The journal of the learning sciences, 13(1), 1-14.
Basu, S., Kinnebrew, J. S., Dickes, A., Farris, A., Sengupta, P., Winger, J., & Biswas, G. (2012, January). A science learning environment using a computational thinking approach. In 20th International Conference on Computers in Education, ICCE 2012.
Basu, S., Biswas, G., Sengupta, P., Dickes, A., Kinnebrew, J. S., & Clark, D. (2016). Identifying middle school students’ challenges in computational thinking-based science learning. Research and practice in technology enhanced learning, 11(1), 13.
Blikstein, P., & Wilensky, U. (2009). An atom is known by the company it keeps: A constructionist learning environment for materials science using agent-based modeling. International Journal of Computers for Mathematical Learning, 14(2), 81-119.
Blumenfeld, P., Fishman, B. J., Krajcik, J., Marx, R. W., & Soloway, E. (2000). Creating usable innovations in systemic reform: Scaling up technology-embedded project-based science in urban schools. Educational psychologist, 35(3), 149-164.
Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. The journal of the learning sciences, 2(2), 141-178.
Bruner, J. (1986). Actual minds, possible worlds. Cambridge, MA, US: Harvard University Press.
Buffum, P. S., Frankosky, M., Boyer, K. E., Wiebe, E., Mott, B., & Lester, J. (2015, August). Leveraging collaboration to improve gender equity in a game-based learning environment for middle school computer science. In 2015 Research in Equity and Sustained Participation in Engineering, Computing, and Technology (RESPECT) (pp. 1-8). IEEE.
Bureau of Labor Statistics, U.S. Department of Labor. Occupational Employment Statistics. Retrieved July 1, 2019, from www.bls.gov/oes/.
Carter, L. (2006). Why students with an apparent aptitude for computer science don't choose to major in computer science. ACM SIGCSE Bulletin, 38(1).
Cazden, C. B., & Beck, S. W. (2003). Classroom discourse. Handbook of discourse processes, 165-197.
Charleston, L. J. (2012). A qualitative investigation of African Americans' decision to pursue computing science degrees: Implications for cultivating career choice and aspiration. Journal of Diversity in Higher Education, 5(4), 222.
Chung, J., Cannady, M. A., Schunn, C., Dorph, R., & Bathgate, M. (2016). Measures technical brief: Engagement in science learning activities.
Cochran-Smith, M., & Lytle, S. L. (1999). Chapter 8: Relationships of knowledge and practice: Teacher learning in communities. Review of research in education, 24(1), 249-305.
Cognition and Technology Group at Vanderbilt (CTGV). (1990). Anchored instruction and its relationship to situated cognition. Educational Researcher, 19(6), 2-10.
College Board. (2017). Program Summary Report. Retrieved from https://secure-media.collegeboard.org/digitalServices/pdf/research/2016/Program-Summary-Report-2016.pdf
Collins, A. (1992). Toward a design science of education. In New directions in educational technology (pp. 15-22). Springer, Berlin, Heidelberg.
Côté, J. E., & Levine, C. G. (2014). Identity, formation, agency, and culture: A social psychological synthesis. Psychology Press.
Côté, J. E., & Schwartz, S. J. (2002). Comparing psychological and sociological approaches to identity: Identity status, identity capital, and the individualization process. Journal of adolescence, 25(6), 571-586.
Creswell, J. W., & Clark, V. L. P. (2017). Designing and conducting mixed methods research. Sage publications.
CSTA (2013). Bugs in the System: Computer science teacher certification in the U.S. New York, NY: Computer Science Teachers Association.
Dedoose Version 8.7.27, web application for managing, analyzing, and presenting qualitative and mixed method research data (2019). Los Angeles, CA: SocioCultural Research Consultants, LLC www.dedoose.com.
Denner, J., Werner, L., & Ortiz, E. (2012). Computer games created by middle school girls: Can they be used to measure understanding of computer science concepts?. Computers & Education, 58(1), 240-249.
Denning, P. J. (2017). Computational Thinking in Science. American Scientist, 105(1), 13.
Doyle, W., & Ponder, G. A. (1977). The practicality ethic in teacher decision-making. Interchange, 8(3), 1-12.
Driver, R., Leach, J., & Millar, R. (1996). Young people's images of science. McGraw-Hill Education (UK).
DuBow, W. & Pruitt, A.S. (2018). NCWIT Scorecard: The Status of Women in Technology. National Center for Women and Information Technology. Retrieved from https://www.ncwit.org/sites/default/files/resources/ncwit_executive_summary_scorecard_05132019.pdf. Boulder, CO: NCWIT.
Duschl, R. A., Schweingruber, H. A., & Shouse, A. W. (Eds.). (2007). Taking science to school: Learning and teaching science in grades K-8 (Vol. 49, No. 2, pp. 163-166). Washington, DC: National Academies Press.
Eagen, W. M., Ngwenyama, O., & Prescod, F. (2006, November). The design charrette in the classroom as a method for outcomes-based action learning in IS design. In Proceedings of the Information Systems Education Conference, ISECON (Vol. 23).
Fereday, J., & Muir-Cochrane, E. (2006). Demonstrating rigor using thematic analysis: A hybrid approach of inductive and deductive coding and theme development. International journal of qualitative methods, 5(1), 80-92.
Fortus, D., & Krajcik, J. (2012). Curriculum coherence and learning progressions. In Second international handbook of science education (pp. 783-798). Springer, Dordrecht.
Franklin, D., Conrad, P., Aldana, G., & Hough, S. (2011, March). Animal tlatoque: attracting middle school students to computing through culturally-relevant themes. In Proceedings of the 42nd ACM technical symposium on Computer science education (pp. 453-458). ACM.
Gal-Ezer, J., & Stephenson, C. (2014). A tale of two countries: Successes and challenges in K-12 computer science education in Israel and the United States. ACM Transactions on Computing Education (TOCE), 14(2), 8.
Goode, J. and Margolis, J. (2011). Exploring computer science: A case study of school reform. ACM Transactions on Computing Education 11(2).
González, N., Andrade, R., Civil, M., & Moll, L. (2001). Bridging funds of distributed knowledge: Creating zones of practices in mathematics. Journal of Education for students placed at risk, 6(1-2), 115-132.
González, N., Moll, L. C., & Amanti, C. (Eds.). (2006). Funds of knowledge: Theorizing practices in households, communities, and classrooms. Routledge.
Google (2015). Searching for computer science: Access and barriers in U.S. K-12 education. Retrieved from https://services.google.com/fh/files/misc/searching-for-computer-science_report.pdf.
Graham, S., & Latulipe, C. (2003). CS girls rock: Sparking interest in computer science and debunking the stereotypes. ACM SIGCSE Bulletin, 35(1), 322-326.
Greeno, J. G. (1998) The situativity of knowing, learning and research. American Psychologist, 53, 5-26.
Greenwald, E., & Krakowski, A. (2019a). Coding Science Internships: Enabling Broader Participation in Computer Science. Presented at the International Society for Technology in Education conference, Philadelphia, PA.
Greenwald, E., & Krakowski, A. (2019b). Coding Science Internships: Enabling Broader Participation in Computer Science Through Meaningful Integration with Core Academic Coursework. Unpublished manuscript, University of California, Berkeley.
Grover, S., & Pea, R. (2013). Computational thinking in K–12: A review of the state of the field. Educational Researcher, 42(1), 38-43.
Grover, S., Pea, R. & Cooper, S. (2014). Remedying Misperceptions of Computer Science among Middle School Students. In Proceedings of the 45th ACM technical symposium on Computer science education, 343-348.
Hanks, B., Fitzgerald, S., McCauley, R., Murphy, L., & Zander, C. (2011). Pair programming in education: a literature review. Computer Science Education, 21(2), 135-173.
Howard, G. S. (1980). Response-shift bias: A problem in evaluating interventions with pre/post self-reports. Evaluation Review, 4(1), 93-106.
Jona, K., Wilensky, U., Trouille, L., Horn, M. S., Orton, K., Weintrop, D., & Beheshti, E. (2014). Embedding computational thinking in science, technology, engineering, and math (CT-STEM). In future directions in computer science education summit meeting, Orlando, FL.
Kim, Y., & Searle, K. (2017). Empowering student voice through interactive design and digital making. Computers in the Schools, 34(3), 142-151.
Klatt, J., & Taylor-Powell, E. (2005). Synthesis of literature relative to the retrospective pretest design. Paper presented at the 2005 Joint CES/AEA Conference, Toronto, Canada.
Knuuttila, T., & Boon, M. (2011). How do models give us knowledge? The case of Carnot’s ideal heat engine. European journal for philosophy of science, 1(3), 309.
Kules, B. (2016). Computational thinking is critical thinking: Connecting to university discourse, goals, and learning outcomes. Proceedings of the Association for Information Science and Technology, 53(1), 1-6.
Kwon, K., & Cheon, J. (2019). Exploring Problem Decomposition and Program Development through Block-Based Programs. International Journal of Computer Science Education in Schools, 3(1), n1.
Ladson-Billings, G. (1995). Toward a theory of culturally relevant pedagogy. American educational research journal, 32(3), 465-491.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge university press.
Lee, I., Martin, F., & Apone, K. (2014). Integrating computational thinking across the K-8 curriculum. ACM Inroads, 5(4), 64-71.
Lee, I., Martin, F., Denner, J., Coulter, B., Allan, W., Erickson, J., Malyn-Smith, J., & Werner, L. (2011). Computational thinking for youth in practice. ACM Inroads, 2(1), 32-37.
Lehrer, R., & Schauble, L. (2006). Cultivating model-based reasoning in science education. Cambridge University Press.
Liebenberg, J., Mentz, E., & Breed, B. (2012). Pair programming and secondary school girls’ enjoyment of programming and the subject Information Technology (IT). Computer Science Education, 22(3), 219-236.
Lopez, F. A. (2017). Altering the trajectory of the self-fulfilling prophecy: Asset-based pedagogy and classroom dynamics. Journal of Teacher Education, 68(2), 193-212.
Lopez, S. J., & Louis, M. C. (2009). The principles of strengths-based education. Journal of College and Character, 10(4).
Louca, L. T., & Zacharia, Z. C. (2012). Modeling-based learning in science education: cognitive, metacognitive, social, material and epistemological contributions. Educational Review, 64(4), 471-492.
MacKinnon, D. P., Fairchild, A. J., & Fritz, M. S. (2007). Mediation analysis. Annu. Rev. Psychol., 58, 593-614.
Malone, K. L., Schunn, C. D., & Schuchardt, A. M. (2018). Improving Conceptual Understanding and Representation Skills Through Excel-Based Modeling. Journal of Science Education and Technology, 27(1), 30-44.
Margolis, J., Ryoo, J. J., Sandoval, C. D., Lee, C., Goode, J., & Chapman, G. (2012). Beyond access: Broadening participation in high school computer science. ACM Inroads, 3(4), 72.
Markus, H., & Nurius, P. (1986). Possible selves. American psychologist, 41(9), 954.
Martin, R. C. (2002). Agile software development: principles, patterns, and practices. Prentice Hall.
Messitidis, L. (2018). Designing Learning Environments for Cultural Inclusivity: Case Studies with three Instructional Designers and a Teacher Exploring Their Practises (Doctoral dissertation, Concordia University).
Moll, L. C., Amanti, C., Neff, D., & Gonzalez, N. (1992). Funds of knowledge for teaching: Using a qualitative approach to connect homes and classrooms. Theory into practice, 31(2), 132-141.
National Research Council. (2011). Learning science through computer games and simulations. National Academies Press.
National Science Board. 2018. Science and Engineering Indicators 2018. NSB-2018-1. Alexandria, VA: National Science Foundation. Available at https://www.nsf.gov/statistics/indicators/.
NGSS Lead States. (2013). Next generation science standards: For states, by states. National Academies Press.
Nystrand, M. (1997). Opening Dialogue: Understanding the Dynamics of Language and Learning in the English Classroom. Language and Literacy Series.
Osborne, J. (2014). Teaching scientific practices: Meeting the challenge of change. Journal of Science Teacher Education, 25(2), 177-196.
Paris, D. (2012). Culturally sustaining pedagogy: A needed change in stance, terminology, and practice. Educational researcher, 41(3), 93-97.
Paris, D., & Alim, H. S. (2014). What are we seeking to sustain through culturally sustaining pedagogy? A loving critique forward. Harvard Educational Review, 84(1), 85-100.
Passmore, C., Gouvea, J. S., & Giere, R. (2014). Models in science and in learning science: Focusing scientific practice on sense-making. In International handbook of research in history, philosophy and science teaching (pp. 1171-1202). Springer, Dordrecht.
Pea, R. D. (1993). Practices of distributed intelligence and designs for education. Distributed cognitions: Psychological and educational considerations, 11, 47-87.
Penuel, W. R., Roschelle, J., & Shechtman, N. (2007). Designing formative assessment software with teachers: An analysis of the co-design process. Research and practice in technology enhanced learning, 2(01), 51-74.
Project Lead the Way and Burning Glass Technologies (2019). The Power of Transportable Skills. Retrieved from https://www.pltw.org/the-power-of-transportable-skills.
Resnick, M. (2013). Learn to code, code to learn. EdSurge, May, 54.
Robins, A., Rountree, J., & Rountree, N. (2003). Learning and teaching programming: A review and discussion. Computer science education, 13(2), 137-172.
Rommes, E., Overbeek, G., Scholte, R., Engels, R., & De Kemp, R. (2007). ‘I'm not Interested in Computers’: Gender-based occupational choices of adolescents. Information, Community and Society, 10(3), 299-319.
Ryoo, J. J., Margolis, J., Lee, C. H., Sandoval, C. D., & Goode, J. (2013). Democratizing computer science knowledge: Transforming the face of computer science through public high school education. Learning, Media and Technology, 38(2), 161-181.
Schwartz, S. J., Côté, J. E., & Arnett, J. J. (2005). Identity and agency in emerging adulthood: Two developmental routes in the individualization process. Youth & Society, 37(2), 201-229.
Schwarz, C. V., & White, B. Y. (2005). Metamodeling knowledge: Developing students' understanding of scientific modeling. Cognition and instruction, 23(2), 165-205.
Scott, A., Koshy, S., Rao, M., Hinton, L., Flapan, J., Martin, A., & McAlear, F. (2019). Computer Science in California's Schools: An Analysis of Access, Enrollment, and Equity. CSforAll Kapor Center Report.
Sengupta, P., Kinnebrew, J. S., Basu, S., Biswas, G., & Clark, D. (2013). Integrating computational thinking with K-12 science education using agent-based computation: A theoretical framework. Education and Information Technologies, 18(2), 351-380.
Shrader, G., Williams, K., Lachance-Whitcomb, J., Finn, L. E., & Gomez, L. (2001, April). Participatory design of science curricula: The case for research for practice. In Annual Meeting of the American Educational Research Association, Seattle, WA.
Snow, E., Rutstein, D., Bienkowski, M., & Xu, Y. (2017, August). Principled assessment of student learning in high school computer science. In Proceedings of the 2017 ACM Conference on International Computing Education Research (pp. 209-216). ACM.
Songer, N. B., Lee, H. S., & McDonald, S. (2003). Research towards an expanded understanding of inquiry science beyond one idealized standard. Science Education, 87(4), 490-516.
Star, S. L., & Griesemer, J. R. (1989). Institutional ecology, translations' and boundary objects: Amateurs and professionals in Berkeley's Museum of Vertebrate Zoology, 1907-39. Social studies of science, 19(3), 387-420.
Strauss, K., Griffin, M. A., & Parker, S. K. (2012). Future work selves: How salient hoped-for identities motivate proactive career behaviors. Journal of applied psychology, 97(3), 580.
Sutton, S. E., & Kemp, S. P. (2002). Children as partners in neighborhood placemaking: lessons from intergenerational design charrettes. Journal of Environmental Psychology, 22(1-2), 171-189.
Sundeen, T. H., & Sundeen, D. M. (2013). Instructional technology for rural schools: Access and acquisition. Rural Special Education Quarterly, 32(2), 8-14.
Swanson, H., Anton, G., Bain, C., Horn, M., & Wilensky, U. (2019). Introducing and Assessing Computational Thinking in the Secondary Science Classroom. In Computational Thinking Education (pp. 99-117). Springer, Singapore.
Tan, L., Yuan, D., Krishna, G., & Zhou, Y. (2007, October). /* iComment: Bugs or bad comments? */ In ACM SIGOPS Operating Systems Review (Vol. 41, No. 6, pp. 145-158). ACM.
Tyler-Wood, T., Knezek, G., & Christensen, R. (2010). Instruments for assessing interest in STEM content and careers. Journal of Technology and Teacher Education, 18(2), 345-368.
Tytler, R., Symington, D., Kirkwood, V., & Malcolm, C. (2008). Engaging students in authentic science through school--community links: learning from the rural experience. Teaching Science: The Journal of the Australian Science Teachers Association, 54(3).
Verdin, D., Godwin, A., & Capobianco, B. (2016). Systematic review of the funds of knowledge framework in STEM education.
Victor, B. (2012, September). Learnable programming. Retrieved from worrydream.com/LearnableProgramming.
Vihavainen, A., Vikberg, T., Luukkainen, M., & Pärtel, M. (2013, July). Scaffolding students' learning using test my code. In Proceedings of the 18th ACM conference on Innovation and technology in computer science education (pp. 117-122). ACM.
Webb, D. C., Repenning, A., & Koh, K. H. (2012, February). Toward an emergent theory of broadening participation in computer science education. In Proceedings of the 43rd ACM technical symposium on Computer Science Education (pp. 173-178). ACM.
Weintrop, D., Beheshti, E., Horn, M., Orton, K., Jona, K., Trouille, L., & Wilensky, U. (2016). Defining computational thinking for mathematics and science classrooms. Journal of Science Education and Technology, 25(1), 127-147.
Wenger, E. (1998). Communities of practice: Learning as a social system. Systems thinker, 9(5), 2-3.
Werner, L. L., Hanks, B., & McDowell, C. (2004). Pair-programming helps female computer science students. Journal on Educational Resources in Computing (JERIC), 4(1), 4.
White, B. Y. (1993). ThinkerTools: Causal models, conceptual change, and science education. Cognition and instruction, 10(1), 1-100.
Wilensky, U., & Reisman, K. (2006). Thinking like a wolf, a sheep, or a firefly: Learning biology through constructing and testing computational theories—an embodied modeling approach. Cognition and instruction, 24(2), 171-209.
Wilson, B. C. (2002). A study of factors promoting success in computer science including gender differences. Computer Science Education, 12(1-2), 141-164.
Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33-35.
Wing, J. M. (2008). Computational thinking and thinking about computing. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 366(1881), 3717-3725.
Witherspoon, E. B., Higashi, R. M., Schunn, C. D., Baehr, E. C., & Shoop, R. (2017). Developing computational thinking through a virtual robotics programming curriculum. ACM Transactions on Computing Education (TOCE), 18(1), 4.
Witherspoon, E. B., Schunn, C. D., Higashi, R. M., & Shoop, R. (2018). Attending to structural programming features predicts differences in learning and motivation. Journal of Computer Assisted Learning, 34(2), 115-128.
Yadav, A., Hong, H., & Stephenson, C. (2016). Computational thinking for all: pedagogical approaches to embedding 21st century problem solving in K-12 classrooms. TechTrends, 60(6), 565-568.
Yaşar, Ş., Baker, D., Robinson‐Kurpius, S., Krause, S., & Roberts, C. (2006). Development of a survey to assess K‐12 teachers' perceptions of engineers and familiarity with teaching design, engineering, and technology. Journal of Engineering education, 95(3), 205-216.
Yoon, S., Evans, M. G., & Strobel, J. (2014). Validation of the Teaching Engineering Self‐Efficacy Scale for K‐12 teachers: A structural equation modeling approach. Journal of Engineering Education, 103(3), 463-485.
Zimmerman, H. T., & Weible, J. L. (2017). Learning in and about rural places: Connections and tensions between students’ everyday experiences and environmental quality issues in their community. Cultural Studies of Science Education, 12(1), 7-31.


Presenters

Dr. Eric Greenwald, Lawrence Hall of Science

Eric Greenwald is Director of Assessment and Analytics at the University of California's Lawrence Hall of Science. He leads several federally funded STEM education research projects, with a focus on the intersection of science and computational thinking. Previously, he was a policy analyst at SRI, where his work addressed STEM teaching and learning, assessment, and measurement development. He holds a PhD in Curriculum and Teacher Education from Stanford University, a master's degree in Science Education from Teachers College, Columbia University, and a BA in Chemistry from Indiana University. He taught math and science in public high schools for six years.

Dr. Ari Krakowski, Lawrence Hall of Science, UC Berkeley
