INTRODUCTION
Creative thinking is the kind of thinking that leads to the generation of valuable and original ideas [1]-[5]. All children are capable of engaging in creative thinking and of practicing creativity in everyday activities. Policymakers and educators consider creativity one of the critical skills for success in college, career, and life. Creative thinking can be taught and assessed at specific levels of proficiency through creative play activities and/or in conjunction with one or more academic domains. Yet there is a lack of valid, reliable, fair, and scalable assessment instruments that can inform the development of creative skills across formal and informal educational settings. Assessing levels of proficiency requires validating the assessment approach to creative thinking across a wide range of settings, cultures, and languages, which had not been attempted prior to the current Programme for International Student Assessment (PISA) 2022 effort [6], [7].
The PISA assessment examines students’ capacities to generate diverse and original ideas, and to evaluate and improve ideas, across a range of contexts or ‘domains’ [6]. The assessment includes four domains: written expression, visual expression, social problem solving, and scientific problem solving. In each of these domains, students engage with open tasks that admit multiple correct responses rather than a single one. Students are either asked to provide multiple, distinct responses, or to generate a response that is not conventional. These responses can take the form of a solution to a problem, a creative text, or a visual artifact. This in turn has demanded more complex scoring methods, based on rubrics and sample responses that were informed by the collection and analysis of responses from many students around the world [8].
Building on this pioneering work and extending it into formative Learning through Play settings, novel task models are being designed to support both the learning and the measurement of creative thinking skills with improved authenticity, engagement, and effectiveness. These task models focus on tools, functionalities, and environments that replicate the real-world activities and processes involved in creative thinking. They allow participants to exercise creative thinking skills in more authentic environments, in context, in concert, and in a coordinated way, to create novel ideas, solutions, and artifacts, and they are also intended to facilitate the learning and formative assessment of those skills. Formal, familiar assessment environments traditionally encourage students to seek the expected, correct solution by following a linear series of steps; the intention is that these new environments will instead encourage students to think more broadly and flexibly, outside the conventions of traditional expectations. They are meant to reflect the dynamic, non-linear, adaptable, and iterative ways in which the creative process unfolds in real-world experiences and across a range of domains [9], [10], and to enhance participants’ motivation, improving both the learning and the measurement of creative thinking skills [4] and the likelihood that those skills will transfer to real-world applications [11], [12]. Perhaps most importantly, these novel task environments expand the opportunities participants have to learn and demonstrate their creative thinking skills. Their engaging functionalities and tools open the door for participants who are more adept at “showing” than “telling,” in the visual sense. They also open the door for learners who are better served by “telling” through a contextualized dialog, in which the creative process provides the context, than by selecting a response in a decontextualized setting, in which the skill is explored in isolation, outside the process in which it will be applied.
ASSESSMENT OF CREATIVE THINKING SKILLS
The formative assessment of creative skills and its learning progressions were developed based on prior research conducted by the LEGO Foundation [13]-[16] and on key insights from PISA 2022 [6]-[8]. Researchers from The LEGO Foundation have co-authored white papers positing a creative process model [13], exploring playful learning pedagogies that promote the development of creative skills [14], and developing tools that promote agency in Learning through Play [15]. A team of renowned creativity researchers consulted on the creation of a simplified model of how learners Connect, Explore, and Transform during the creative process. Large-scale internal and external validation studies informed a framework for the quality of Learning through Play experiences, in which the depth of learner agency is evaluated for each of the five characteristics of Learning through Play (Joyful, Actively engaging, Iterative, Socially interactive, and Meaningful). The framework provides meaningful indicators that help facilitate agency as learners move through the Passive, Responding, Owning, Recognising, and Transferring stages [15]. Lastly, a spectrum of play facilitation aligns with creative ideation: ideas become more diverse and original as activities move away from explicit instruction and toward free play [16].
The formative assessment of creative skills and the underpinning learning progression are focused on developing the following skills:
- Creative process
- Generating original ideas
- Generating diverse ideas
- Evaluating and improving ideas
- Recognizing and transferring knowledge
The progressions are defined by goals, or learning objectives, within a benchmark. Originally, we developed the skills within two benchmarks (7- to 9-year-olds and 10- to 12-year-olds). In the second iteration of the learning progression, we merged the benchmarks, because the earlier benchmark established a floor, or foundational standard, for each creative thinking skill (e.g., creative process).
The learning progression was intentionally built to support learners in developing skills from concrete to more abstract ideas by revisiting ideas within and across activities and experiences. Standards were defined for each skill, and each standard was labeled using practices and processes previously defined and studied by The LEGO Foundation [14], [16]. For example, connecting, exploring, and transforming are experiences that are essential throughout the creative process [13]. Other practices, such as free play, owning, and guided discovery, are at the forefront of the Learning through Play process and tool, and they are tagged throughout the learning progression. The milestones are analogous to standards or performance level descriptors. The standards in each benchmark are similar but progressively build in complexity, allowing students to revisit a standard in a new context and at a new level of complexity as they gain experience and foundational concepts and skills.
PARTICIPANTS
Overall, 211 children ages 6-12 participated in the study, of whom 121 were younger than 9 years old. All participants engaged with the SuperSkills digital application, in which creative thinking assessment was authentically embedded into eight Learning through Play activities. Each activity posed a design challenge (e.g., Paper Planes, Obstacle Course, Flipbook Animations) in which participants built creations in their physical space and uploaded pictures of their creations/artifacts to the digital application.
Each activity included three levels. Participants started by following the instructions to design their first creation (Level 1). Next, they were asked to test or experiment with their creation and come up with ideas for improvement (Level 2). Finally, they were invited to rethink their design and come up with original ideas or solutions that few other children would think of (Level 3). Assessment questions were seamlessly embedded before, during, and after each activity to measure the development of creative thinking skills. It should be noted that selected activities invited participants to collaborate with their peers (e.g., family, friends, classmates).
ASSESSMENT INSTRUMENTS
While engaged with the Learning through Play activities, participants responded to the following assessment item types (a schematic scoring sketch follows the list):
Image/text: Participants upload a picture of their creations/artifacts and/or provide a description of them. These questions were scored by two trained research assistants using rubrics adapted from the PISA 2022 approach [8].
Slider: Participants rate an answer option on a scale by dragging a slider. These questions were logged automatically based on Likert scales.
Multiple choice: Participants select one or more than one option from multiple answer options. These questions were scored automatically based on predefined correct responses.
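For concreteness, the sketch below shows one way the three item types could be routed to their respective scoring mechanisms. This is a hypothetical illustration under our own assumptions: the class, function, and field names are ours, not the SuperSkills application’s.

```python
# A minimal sketch (not the study's actual pipeline) of routing the three
# item types to scorers. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Response:
    item_type: str   # "image_text", "slider", or "multiple_choice"
    payload: object  # rubric codes, a Likert value, or (selected, correct)

def score(response: Response) -> float:
    if response.item_type == "image_text":
        # Human-coded: two trained raters assign rubric credit (0, 1, or 2);
        # here we simply average the two codes.
        return sum(response.payload) / len(response.payload)
    if response.item_type == "slider":
        # Logged automatically as a Likert-scale value.
        return float(response.payload)
    if response.item_type == "multiple_choice":
        # Scored automatically against predefined correct options.
        selected, correct = response.payload
        return 1.0 if set(selected) == set(correct) else 0.0
    raise ValueError(f"Unknown item type: {response.item_type}")

print(score(Response("image_text", [2, 1])))               # 1.5
print(score(Response("multiple_choice", (["B"], ["B"]))))  # 1.0
```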
RELIABILITY, ITEM DIFFICULTY AND ITEM DISCRIMINATION
The Image/Text items required two independent coders to review and code each student’s responses at a credit level of 0, 1, or 2, which allowed inter-rater reliability and other metrics to be computed. Quadratic weighted kappa was used to measure inter-rater agreement for all human-scored items. The average quadratic weighted kappa across coder pairs was 0.92, indicating strong inter-rater reliability.
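For illustration, the agreement statistic can be computed with scikit-learn’s cohen_kappa_score using quadratic weights; the rubric codes (0, 1, 2) match those described above, but the example ratings are invented.

```python
# A minimal sketch of quadratic weighted kappa for one coder pair.
from sklearn.metrics import cohen_kappa_score

coder_a = [0, 1, 2, 2, 1, 0, 2, 1]  # hypothetical rubric codes, coder A
coder_b = [0, 1, 2, 1, 1, 0, 2, 2]  # hypothetical rubric codes, coder B

kappa = cohen_kappa_score(coder_a, coder_b, weights="quadratic")
print(f"quadratic weighted kappa = {kappa:.2f}")
```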
The reliability of the Slider questions was analyzed as measured by Cronbach’s α, with items analyzed separately by skill. Because α is a collective measure of a set of items (outcome agreement among items in the set), we use as an individual item measure the amount by which α decreases when the item is withheld from the set. High α values for each skill support overall reliability. The data consist of 2,066 responses from 211 participants.
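The α computation and the item-withholding diagnostic can be sketched as follows. The response matrix below is randomly generated and purely illustrative, so the resulting values carry no substantive meaning.

```python
# A sketch of Cronbach's alpha and the "alpha if item withheld" diagnostic.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array, rows = participants, columns = items."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
X = rng.integers(1, 6, size=(211, 6)).astype(float)  # invented Likert data

alpha = cronbach_alpha(X)
for j in range(X.shape[1]):
    reduced = np.delete(X, j, axis=1)                # withhold item j
    drop = alpha - cronbach_alpha(reduced)           # decrease in alpha
    print(f"item {j}: alpha changes by {drop:+.3f} if withheld")
```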
For performance assessment items based on artifacts submitted by students we used a 2-parameter Item Response Theory (2PL) model, trained on the set of 211 participants responding to 28 items covering 5 skills, with the median number of responses per item being 86. For greater interpretability we converted the outputs, discrimination a and difficulty b, from their traditional infinite ranges to the [0, 1] range by a := Φ(a), b := Φ(b), where Φ is the cumulative distribution function of the standard normal distribution. We imposed the item quality conditions a ≥ 0.15 and 0.05 ≤ b ≤ 0.95, which were satisfied by 16 out of 28 items. These items are generally easy but highly discriminating: the discrimination varied in the range [0.601, 1.000] and the difficulty in [0.053, 0.236].
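Fitting the 2PL model itself is omitted here; the sketch below illustrates only the post-fit transformation and quality filter just described. The raw parameter values are invented.

```python
# A sketch of mapping 2PL discrimination (a) and difficulty (b) through the
# standard normal CDF and applying the item-quality filter.
import numpy as np
from scipy.stats import norm

a_raw = np.array([1.8, 0.3, 2.5, -0.9])   # hypothetical 2PL discriminations
b_raw = np.array([-1.6, 0.2, -2.1, 3.4])  # hypothetical 2PL difficulties

a = norm.cdf(a_raw)  # a := Phi(a), now in [0, 1]
b = norm.cdf(b_raw)  # b := Phi(b), now in [0, 1]

keep = (a >= 0.15) & (b >= 0.05) & (b <= 0.95)  # item quality conditions
print("retained items:", np.flatnonzero(keep))
```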
RESULTS
Proficiency estimation is done using a proprietary multi-skill model based on exponential kernel regression of scores, which includes the fading of skills with time and the transfer of knowledge across interactions [17]. The model takes into account the multiple-skill alignment of items as well as their complexity and difficulty. As participants accumulate interaction events, the confidence of the proficiency estimates gradually increases. By convention, proficiency is reported on a 0-100 scale.
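Since the model itself is proprietary, the following is only a minimal sketch of the underlying idea: proficiency as an exponentially time-weighted average of past scores. It omits the multi-skill alignment, complexity, and knowledge-transfer components of the actual model [17], and the half_life parameter is an assumption of ours.

```python
# A minimal sketch of exponential fading of evidence, not the actual model.
import numpy as np

def proficiency(scores, times, now, half_life=7.0):
    """scores in [0, 1]; times/now in days; returns a 0-100 proficiency."""
    scores = np.asarray(scores, float)
    times = np.asarray(times, float)
    # Older evidence is down-weighted: weight halves every `half_life` days.
    w = 0.5 ** ((now - times) / half_life)
    return 100.0 * float(np.average(scores, weights=w))

print(proficiency([0.5, 0.7, 1.0], times=[0, 5, 9], now=10))
```

The exponential kernel makes recent performance dominate the estimate, which is one simple way to model the fading of skills with time.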
As a post-model visualization convention, we split the proficiency range into 5 bands: far below proficiency [0, 25], below proficiency (25, 50], approaching proficiency (50, 75], at proficiency (75, 90], and exceeding expectations (90, 100]. We estimated the distribution of participants across these bands and its development with an increasing number of events. Students are characterized by their proficiency on the “ancestor” skill that covers all 5 skills. We observe an overall trend of participants moving up from the “approaching proficiency” cohort to the “at proficiency” cohort. A small cohort of participants exceeding expectations is also beginning to emerge.
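The banding convention can be expressed directly; the boundaries below are copied from the text.

```python
# Mapping a 0-100 proficiency estimate to the five reporting bands.
def band(p: float) -> str:
    if p <= 25:
        return "far below proficiency"
    if p <= 50:
        return "below proficiency"
    if p <= 75:
        return "approaching proficiency"
    if p <= 90:
        return "at proficiency"
    return "exceeding expectations"

print(band(62))  # approaching proficiency
```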
For 3 of the 5 individual skills, including “Creative Process”, the trend is qualitatively similar to the overall picture. However, the skill “Generate original ideas” shows a static picture, as does the skill “Recognize and transfer”, which also has a significantly smaller number of participants with 15 interaction events.
This study provided preliminary evidence of the promise of a new formative assessment of creative thinking to drive the development of ‘hard-to-measure’ skills while children are engaged in Learning through Play activities.
The implications of this assessment framework and these findings are far-reaching: the learning progressions, together with the pedagogy-driven way we operationalized creative thinking skills, allow for more appropriate feedback for educators and learning designers as they refine learning experiences to deliver optimal outcomes.
[1] T. M. Amabile, M. G. Pratt, “The Dynamic Componential Model of Creativity and Innovation in Organizations: Making Progress, Making Meaning.” Research in Organizational Behavior, vol. 36, 2016, pp. 157–183.
[2] E. L. Grigorenko, L. Jarvin, M. Tan, R. J. Sternberg, “Something new in the garden: Assessing creativity in academic domains.” Psychology Science, vol. 36, no. 2, 2008, p. 295.
[3] J. C. Kaufman, J. Baer, “Beyond New and Appropriate: Who Decides What Is Creative?” Creativity Research Journal, vol. 24, no. 1, Jan. 2012, pp. 83–91.
[4] Y. V. Rosen, M. Tager, Computer-based performance assessment of creativity skills: a pilot study. Boston, MA: Pearson, 2013.
[5] R. J. Sternberg, T. I. Lubart, “An Investment Theory of Creativity and Its Development.” Human Development, vol. 34, no. 1, 1991, pp. 1–31.
[6] OECD, Draft framework for the assessment of creative thinking in PISA 2021. Paris, France: OECD Publishing, 2019.
[7] Y. V. Rosen, K. Stoeffler, V. Simmering, “Imagine: Design for Creative Thinking, Learning, and Assessment in Schools.” Journal of Intelligence, vol. 8, no. 2, 15 Apr. 2020, p. 16.
[8] Y. V. Rosen, K. Stoeffler, M. Lumb, C-Y. Huang, N-R. Huh, PISA 2022 Creative Thinking Field Trial Research Report. Iowa City, IA: ACT, 2021.
[9] M. D. Mumford, T. McIntosh. “Creative Thinking Processes: The Past and the Future.” The Journal of Creative Behavior, vol. 51, no. 4, Dec. 2017, pp. 317–322.
[10] M. Botella, T. Lubart, “From dynamic processes to a dynamic creative process.” in Dynamic Perspectives on Creativity: New Directions for Theory, Research, and Practice in Education. Springer, 2019, pp. 261-278.
[11] R. A. Beghetto, J. C. Kaufman, J. Baer, Teaching for Creativity in the Common Core Classroom. New York, NY: Teachers College, Columbia University, 2015.
[12] J. C. Kaufman, J. Baer, “Beyond New and Appropriate: Who Decides What Is Creative?” Creativity Research Journal, vol. 24, no. 1, Jan. 2012, pp. 83–91.
[13] LEGO Foundation. What We Mean by Creativity. Billund, Denmark: LEGO Foundation, 2012.
[14] J. Zosh, B. Hassinger-Das, M. Laurie, Learning through Play and the Development of Holistic Skills Across Childhood. Billund, Denmark: LEGO Foundation.
[15] L. van Beeck. Guidelines for general use of the Learning through Play Experience Tool. Melbourne, Australia: ACER, 2020.
[16] H. Jansen, A. Pyle, J. M. Zosh, H. B. Ebrahim, A. Z. Scherman, J. Reunamo, B. K. Hamre, Play facilitation: The science behind the art of engaging young children. Billund, Denmark: LEGO Foundation, 2019.
[17] I. Rushkin, Y. V. Rosen, Multidimensional Mastery Tracing for Learning. New York, NY: BrainPOP, 2021.