
Implementing the Iceberg Framework: Teaching Critical Evaluation of New Classroom Technologies

Colorado Convention Center, 108/10/12

Roundtable presentation
Listen and learn: Research paper

Presentations with similar research topics are grouped at roundtables, where hour-long discussions take place. Roundtables are intended to foster more collaborative discussion about research.


Presenters

Amy Mueller
Assistant Professor
University of Oklahoma
Amy Mueller is an assistant professor of learning technologies at the University of Oklahoma in the Instructional Leadership and Academic Curriculum program. Amy teaches courses on educational and instructional technologies to pre-service and practicing educators. Previously, Amy taught 4K for Head Start and worked as a K-5 technology teacher at a dual language immersion program in a diverse, public, urban Title I school. Amy’s research interests include culturally and linguistically responsive and sustaining education, Indigenous education, liberatory education, elementary education, games-based learning, maker education, STEM education, digital literacy, multiliteracies, design-based research, and codesign and community action research.

Session description

In this research study, we present the results of our effort to prepare pre-service teachers (PSTs) to think critically about the educational technology used in their classrooms. In this presentation, we will share the Technoskepticism Iceberg Framework (Authors, 2023) with educators and administrators to use in their own practice.

Framework

Critical thinking is a broad construct that includes skills and dispositions relevant to making sound decisions, including conducting rigorous analyses, maintaining an open mind, and employing metacognition and self-reflection (Dwyer et al., 2014; Facione, 1990; Halpern, 2014; Lai, 2011). It is a slower and more deliberate mode of thinking (Bonnefon, 2016; Stanovich, 2016; Stupple et al., 2017). As applied to educational technologies, critical thinking means avoiding getting swept up in rhetorical hype about educational “solutions” (Ciccone, 2022; Cuban & Jandric, 2015; Heath et al., 2022; Macgilchrist, 2021; Sims, 2017; Teräs et al., 2020). Rather, it means taking the time to examine the complex, unintended, and often undesired effects of technologies on schools, teachers, and students (Garcia & Nichols, 2021; Heath & Segal, 2021; Macgilchrist, 2019; Manolev et al., 2019; Selwyn, 2016; Selwyn et al., 2021; Williamson, 2017; Zembylas, 2021).
To support PSTs’ critical thinking, we used the Technoskepticism Iceberg (Authors, 2023) as a scaffolding tool (Figure 1). We define “technoskepticism” as a disposition toward and practice of asking

critical questions about technology (Authors, 2023). Our conceptualization is informed by a wide range of critical studies of technology, including scholarship from the academic fields of Technology Studies (e.g., Feenberg, 1999), critical design studies (e.g., Costanza-Chock, 2020; D’ignazio & Klein, 2020), and critical examinations of digital technologies (e.g., Benjamin, 2019; Broussard, 2023; Noble, 2018; O’Neil, 2017). As a scaffold, the Iceberg directs attention to aspects of technology that often go unnoticed. It prompts PSTs to examine educational technology along three dimensions: technical (its material structure and function), psycho-social (its effects on how people think and interact), and political (how decisions are made about it). Most crucially, it guides PSTs to think about educational technologies as entities connected to sociotechnical systems and values rather than “just tools.” In these ways, the Technoskepticism Iceberg Framework aims to help PSTs look beyond the imagined promises of educational technologies and critically examine their complex effects.


Methods

The present research is a transformative mixed methods study (Teddlie & Tashakkori, 2009) that is part of a larger design experiment (Cobb et al., 2003) to understand how PSTs critically examine educational technologies and how teacher education can develop critical practices. During Spring 2023, we developed and implemented a set of instructional activities in two educational technologies courses (taught by Authors 1 & 3). In those activities, PSTs (n = 63) used the Iceberg Framework to critically examine technologies that they will likely encounter in schools. Both courses examined Class Dojo and Teachers Pay Teachers; Codable Robots and Lumio were each examined by only one of the courses. Consistent with design experiment methodology, our goal here is not to document whether those learning activities “worked” but to develop our theoretical understanding of teachers’ learning and decision-making in ways that inform our subsequent teacher education work (Cobb et al., 2003). Two research questions guided our study:
1. To what extent did PSTs demonstrate technoskeptical thinking in their responses to the learning activities we developed?
2. To what extent did the learning activities and Iceberg framework serve as scaffolds for technoskeptical inquiry?


Results

Data Collection & Analysis
This study analyzes PSTs’ responses to the task of using the Iceberg to examine different educational technologies. Specifically, PSTs were asked to evaluate each technology (including its benefits and concerns) and to indicate how, and whether, they might use it in their classroom. Data were collected during the Spring 2023 semester; Table 1 summarizes the site and participant characteristics.

Table 1
Participant and Study Context Characteristics
Southwest Site
● Site details: Large public research university; “Teaching with Technology” course; course occurs one to two semesters before student teaching
● Total PSTs: 20 (19 undergraduate, 1 graduate)
● PST programs: Early Childhood Education, Elementary Education, Secondary Education
● Demographics: Majority of participants were white presenting, with a quarter of the students identifying as Native American; 6 male-identifying, 13 female-identifying, and 1 nonbinary-identifying students

Northeast Site
● Site details: Small teaching college; “Critical Media Literacy” course; course occurs two semesters before student teaching
● Total PSTs: 43 (all undergraduate)
● PST programs: Elementary Education
● Demographics: Majority of participants identified as White and female

We analyzed PSTs’ responses by characterizing common discourses within those responses, the alignment of those discourses with the Iceberg Framework, and the extent to which the responses demonstrated technoskeptical analysis. We transformed PSTs’ qualitative responses into quantitative categories (Teddlie & Tashakkori, 2009) to identify patterns and trends across the different educational technologies they examined.
To assess the degree of technoskepticism in PSTs’ responses, we utilized qualitative content analysis (Cho & Lee, 2014). We began with a priori categories based on our definition of technoskepticism and aligned with the Technoskepticism Iceberg. Our analysis was responsive to our data in that we used the a priori categories as a starting point; as we tested those categories against the data, we modified the number of categories and their descriptions to capture distinctions between PSTs’ responses (Schreier, 2012). To illuminate how students were engaging with the framework itself, student responses were also explored using open and then axial coding (Charmaz, 2006). During open coding, each segment of student text was coded by theme, such that every argument or idea within each student’s response received a code. These initial open codes were then grouped into thematic concepts, which were then aligned with the Iceberg Framework.
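As a rough illustration of this quantitizing step, the sketch below tallies a handful of hypothetical coded segments into theme frequencies by Iceberg layer and dimension, in the spirit of Table 3. The segments, theme labels, and data structure are illustrative assumptions, not the study’s actual coding pipeline.

```python
from collections import Counter

# Hypothetical coded segments produced during open/axial coding: each tuple is
# (Iceberg layer, Iceberg dimension, theme). These examples are illustrative
# placeholders, not the study's actual codes.
coded_segments = [
    ("Tools", "Psycho-Social", "Student Motivation"),
    ("Systems", "Psycho-Social", "Student Emotional Well-Being"),
    ("Systems", "Political", "Creator Pay"),
    ("Tools", "Technical", "Value for Cost"),
    ("Systems", "Psycho-Social", "Student Emotional Well-Being"),
]

# Tally how often each theme appears within each layer/dimension cell,
# mirroring the frequency counts reported in Table 3.
theme_counts = Counter(coded_segments)

for (layer, dimension, theme), count in sorted(theme_counts.items()):
    print(f"{layer} layer | {dimension} dimension | {theme}: {count}")
```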
Table 2 shows the categories that were developed to characterize the level of technoskepticism in each response. The four ordinal categories indicate an increasing level of technoskepticism, but we emphasize a crucial separation between the two lower and two higher categories. Responses categorized as 1 or 2 represent analyses that remained only at the “surface” layer of the Iceberg in that they approached the technologies as tools. Responses categorized as 3 or 4 included analyses that explored the deeper layers of the Iceberg (the entire response need not have shown this level of depth). The top two categories differ only in the extent to which the analysis explained the issues raised.

Table 2
Coding Framework Used to Analyze Degree of Technoskepticism in PSTs’ Responses

Rating 4
Description: The analysis centers on and describes unintended or unwanted effects of the technology. The analysis examines and explains how and why the technology will interact with existing relevant systems (school systems, economic systems, classroom social environments, etc.) to produce those unintended/unwanted consequences (and does not require the reader to infer those mechanisms).
Exemplar: ClassDojo rewards students with points for completing tasks programmed in by the teacher or parents, depending on if a student’s parent pays for the extra software. The whole class’s points are put on the board for everyone to see, and students watch themselves and their peers actively gain and lose points throughout the day. The negative consequences that this incurs heavily outweighs the benefits of easily motivating students to complete tasks. For one, publicly humiliating someone by taking away their points in front of the class is more likely to discourage their learning. Another negative consequence that may happen is “Pavloving” students using this program. With points extrinsically motivating the completion of tasks, a growth mindset cannot be fostered and significant learning can’t take place.

Rating 3
Description: The analysis goes beyond tools and identifies the technology’s unintended and/or unwanted effects. The concerns are not simply confined to how well the technology “works” in terms of its intended effects. However, the analysis does not fully explain how the technology will interact with systems and values to produce those effects.
Exemplar: I think ClassDojo is a great method of managing behavior for a younger set of students, but I don’t think it is something I would use in a high school setting. Implementing this service in a high school classroom could make students feel as if they are being treated younger than their age, and I want to be able to have a more trusting relationship with them. There also comes the issue of a “point hierarchy”, as points being visible to the entire class could encourage comparison and make some students feel bad about their classroom behavior.

Rating 2
Description: The analysis raises questions about the quality of the technology in relation to its intended effects (how well the technology “works”). However, the analysis regards the technology as a tool and does not consider any unintended and/or unwanted effects.
Exemplar: Thinking about my own classroom, I would like to use Teachers Pay Teachers. I think that it is a great resource to have and can help take both teaching and learning to the next level. This could also be a beneficial resource to use in my classroom because it can give me creative ideas that I may not have thought of before. However, we learned in class that anyone can sign up for this website and sell lesson plans. Having this in mind, I will need to be extra sure to deeply look at each of the lesson plans on the website and make sure that it looks reliable before bringing it into my classroom.

Rating 1
Description: The analysis uncritically describes the intended uses and benefits of the technology. The response might consider the monetary cost of the technology but treats it as a narrow issue of cost-versus-benefit.
Exemplar: Lumio is a website I would use for my future classroom. I don’t think I would utilize it for in-person classes but if I were to have to do distance learning, I could see myself using it. I’m usually really picky about the technology involved in my classroom as I plan on teaching preschool and do not like the idea of using technology in the classroom. However, for distance learning, I do like the idea of having games or videos I can upload of myself. I would probably use this site to upload a read-aloud or connect one from youtube.

When looking at student discourses for the two technologies examined in both classes, Teachers Pay Teachers and Class Dojo, it became clear that students were discussing topics across the Technoskepticism Iceberg Framework. However, as illustrated by Table 3, some layers and dimensions, particularly the psycho-social dimension, were discussed more than others.

Table 3
The Technoskepticism Iceberg Layers & Dimensions in Student Responses with Frequency

Dimensions:
● Technical Dimension: Focuses on the ways that technologies are structured in material terms and how they function.
● Psycho-Social Dimension: Focuses on the ways that technologies affect, and are affected by, how people think, act, and relate to one another.
● Political Dimension: Focuses on who makes decisions about how technologies are designed and deployed and how those decisions are made.

Tools Layer: Focuses on the visible, immediate, and obvious uses and effects of technology. Technologies are regarded as tools with well-defined uses and outcomes.
● Technical: Value for Cost (6)
● Psycho-Social: Communication Tool (7), Parent-Teacher Relationships (6), Student Motivation (5), Classroom Collaboration (1), Inspiration Resource (8)
● Political: Credibility (15), Bias in Design (1)

Systems Layer: Focuses on the ways that technologies are embedded in and interact with systems. The properties of those systems, including their biases, influence how technologies are created and used. Technologies, in turn, affect those systems in unanticipated and often subtle ways.
● Technical: Parental Usage & Access (3), System Access (3), Multilingual Accessibility (1), Concerns around Inclusion (5)
● Psycho-Social: Classroom Emotional Environment (7), Cultural Appropriateness (3), Reinforce Inequities (5), Parental Overinvolvement (10), Developmental Appropriateness (2), Bullying, Inter-Intrapersonal Competition (8), Quality of Materials (17), Student Emotional Well-Being (19), Classroom Hierarchy (3), Surveillance Culture (1), Educator Professionalism (3)
● Political: Security & Privacy (4), Marketing Ethics (3), Creator Pay (11)

Values Layer: Focuses on the values embedded in systems that interact with technology. The way we use, design, think about, and make decisions about technology is never value-neutral or “purely rational,” but reflects ideas about what constitutes a good life and the common good.
● Technical: Copyright & Plagiarism (7)
● Psycho-Social: Intrinsic v. Extrinsic Rewards (8)
● Political: Educator Workloads (7), Lack of Resources (7), Educator Pay & Expenses (3), Educator Burnout (2), Educator Well-Being (2)

Further, student discourse varied greatly between the different technologies being examined; however, themes of equity permeated both classes and technologies (Figure 2). For Teachers Pay Teachers, student discourse focused on the themes of educator well-being, quality and reliability, and economic considerations. For Class Dojo, student discourse revolved around parental involvement, classroom community, and student development. We posit these differences may be due to the varying nature of the technologies combined with the students’ experiences as PSTs.

Figure 2: Student discourses using the Iceberg Framework.
Additionally, once the PST response data were coded with the a priori categories, we looked for patterns across the different technologies that PSTs analyzed. We were interested in whether responses became more technoskeptical later in the course; however, we also suspected that some technologies lend themselves to technoskeptical inquiry more than others. We combined data across the two course contexts for Teachers Pay Teachers and Class Dojo because we detected no significant differences in the distribution of those responses between the sites (on a chi-squared test of independence, p = .859 for Teachers Pay Teachers and p = .071 for Class Dojo). Table 4 presents the resulting distributions.
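A minimal sketch of how such a site-by-site comparison could be run is below, assuming the counts are arranged as a 2 x 4 contingency table (sites by rating levels). The per-site counts shown are hypothetical placeholders, since only the pooled counts appear in Table 4, so the printed p-value will not reproduce the values reported above.

```python
from scipy.stats import chi2_contingency

# Hypothetical per-site counts of responses at each technoskepticism level (1-4)
# for one technology; rows are sites, columns are rating levels. Placeholder values only.
observed = [
    [4, 8, 6, 2],    # Southwest site (hypothetical)
    [7, 16, 11, 5],  # Northeast site (hypothetical)
]

# Chi-squared test of independence: are the rating distributions independent of site?
chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.3f}, p = {p_value:.3f}, dof = {dof}")
```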

Table 4
Distribution of Iceberg Responses at Each Technoskepticism Level by Counts (and %)

Technoskepticism Rating   Teachers Pay Teachers   Class Dojo   Codable Robots   Lumio
1 11 (19%) 9 (14%) 19 (42%) 9 (50%)
2 24 (41%) 7 (11%) 11 (24%) 6 (33%)
3 17 (29%) 26 (41%) 12 (27%) 2 (11%)
4 7 (12%) 21 (33%) 3 (7%) 1 (6%)

Table 4 is arranged such that the technologies examined earlier in the semester are on the left and those examined later in the semester are on the right. Although we hoped that students’ responses might become more technoskeptical from the beginning to the end of the course, this was not evident in our results. Instead, we found that the more influential factor was the technology under examination.
Class Dojo seemed to lend itself to technoskeptical inquiry. Most students identified unintended effects of Class Dojo (74% categorized as “3” or “4”) that would make them wary of using it in their classrooms without adaptation. In contrast, the other technologies did not elicit such inquiries nearly as often. For the case of Teachers Pay Teachers, PSTs often pointed out that the lessons on the site might not be of the highest quality (a concern categorized as “2”). But they rarely addressed deeper concerns about the platform itself and the effects it might have on instructional decisions.
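To retrace the arithmetic behind Table 4 and the “3 or 4” shares discussed here, the sketch below recomputes each technology’s percentage distribution from the reported counts; the dictionary layout and variable names are ours, introduced only for illustration.

```python
# Counts of responses at technoskepticism levels 1-4, copied from Table 4.
counts = {
    "Teachers Pay Teachers": [11, 24, 17, 7],
    "Class Dojo": [9, 7, 26, 21],
    "Codable Robots": [19, 11, 12, 3],
    "Lumio": [9, 6, 2, 1],
}

for tech, by_level in counts.items():
    total = sum(by_level)
    # Rounded per-level percentages, as reported in Table 4.
    pct = [round(100 * n / total) for n in by_level]
    # Share of responses that went below the "surface" layer (levels 3 and 4);
    # summing the rounded percentages gives, e.g., 41% + 33% = 74% for Class Dojo.
    deeper = pct[2] + pct[3]
    print(f"{tech}: levels 1-4 = {pct}%, level 3 or 4 = {deeper}%")
```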


Importance

Technoskepticism is not a “natural” way of thinking, especially for educational technologies (Krutka et al., 2019; Schroeder & Curcio, 2022). Overall, our findings indicate that the Iceberg Framework and our instructional activities supported some degree of technoskeptical thinking, though the degree depended greatly on the technology under study. One conjecture suggested by our results, and worthy of further research, is that technoskepticism occurs differently for distinct categories of educational technologies and that PSTs would benefit from scaffolds attuned to those categories. Overtly social technologies, like Class Dojo, may require less scaffolding in general because their unintended and unwanted consequences are easier for PSTs to imagine and describe, and PSTs’ corresponding discourse revolved around topics often discussed in teacher certification programs. Platforms like Teachers Pay Teachers are potentially more challenging to critique because technoskeptical thinking requires PSTs to examine the platform itself, whose unintended consequences are less immediately visible in the classroom.
More broadly, our findings indicate both the promise of our approach to promoting PSTs’ technoskeptical thinking and the need for targeted support and guidance. The Iceberg provides shared conceptual language and prompts for technoskeptical inquiry. Yet we also found that PSTs had difficulty applying specific Iceberg dimensions (e.g., differentiating between psycho-social and political aspects) and did not always move past the “Tools” layer. As we continue our overarching design experiment, we need to develop more detailed scaffolds to support PSTs’ technoskeptical inquiries, especially for technologies with less obvious unintended consequences. One way to do this is by providing PSTs with a more detailed description of the Iceberg, with example questions to use as a template for their inquiries. Another is to include more modeling of technoskeptical inquiry throughout the course. We will be implementing these approaches in the next phase of our work.
A lingering question is how PSTs’ technoskeptical thinking skills develop over time. In this study, we could not address this because our findings were heavily influenced by the nature of the technologies being examined. As we continue this line of work, we may also consider PSTs’ prior experiences with specific technologies and how this connects with their future teaching contexts (e.g., elementary versus secondary). Each of these factors is worthy of further exploration.



Session specifications

Topic:
Project-, problem- & challenge-based learning
Grade level:
Community college/university
Audience:
Teachers, Teacher education/higher ed faculty, Technology coordinators/facilitators
Attendee devices:
Devices not needed
Participant accounts, software and other materials:
I will bring my own laptop. Connection with a USB-C or HDMI cable will suffice.
Subject area:
Inservice teacher education, Preservice teacher education
ISTE Standards:
For Coaches:
Learning Designer
  • Collaborate with educators to design accessible and active digital learning environments that accommodate learner variability.
For Education Leaders:
Empowering Leader
  • Support educators in using technology to advance learning that meets the diverse learning, cultural, and social-emotional needs of individual students.
For Educators:
Leader
  • Advocate for equitable access to educational technology, digital content and learning opportunities to meet the diverse needs of all students.