Document Type

Article

Publication Date

2019

Publication Title

Journal of Science Education and Technology

Publisher

Springer

Abstract

In recent decades, computational tools and methods have become pervasive in mathematical and scientific fields (National Research Council, 2010a). Tools such as mathematical and statistical models have expanded the range of phenomena that are explored and have become necessary for analyzing increasingly large data sets across disciplines (National Academy of Sciences, National Academy of Engineering, & Institute of Medicine, 2007). With these advances, entirely new fields such as computational statistics, neuroinformatics, and chemometrics have emerged. The varied applied uses of computational tools across these fields have shown that future scientists will not only need to know how to program, but also be knowledgeable about how information is stored and managed, the possibilities and limitations of computational simulations, and how to choose, use, and make sense of modeling tools (Foster, 2006).

As a result of these changes, science, technology, engineering, and mathematics (STEM) education communities have recognized the importance of integrating Computational Thinking (CT) into school curricula (National Research Council, 2012; NGSS Lead States, 2013), and there are several important efforts underway to more closely integrate CT skills and practices into mainstream science and mathematics classrooms, such as Bootstrap (Schanzer, Fisler, & Krishnamurthi, 2018; https://www.bootstrapworld.org), GUTS (Lee et al., 2011; https://teacherswithguts.org), and CT-STEM (Swanson, Anton, Bain, Horn, & Wilensky, In Press; https://ct-stem.northwestern.edu). However, while much of the research on CT and CT in STEM has focused on creating generally agreed-upon definitions and CT curricula (Shute, Sun, & Asbell-Clarke, 2017), few studies have empirically tested assessments or used contemporary learning sciences methods to do so (Grover & Pea, 2013). In this paper, we outline the assessment approach for a ten-day biology unit with computational thinking activities. We examine both high school students' pre-post responses and their responses to embedded assessments throughout the unit. We explain how we coded responses for CT-STEM discourse elements and then quantitatively measured the development of students' CT-STEM practices over time. We identify two groups of students: those who had positive gains on pre-post tests and those who had negative gains on pre-post tests, and we examine how each group's CT-STEM practices developed as they engaged with the curricular unit.

Comments

The published version of this article can be found here: https://link.springer.com/journal/10956

A subscription may be required to view this version.

