Digital Tools to Promote Remote Lesson Study
Listen and learn: Research paper
Presentation 2 of 2
A Framework for Districts to Measure and Improve Their Instructional Technology Use
Dr. Gabriel Matney, Jeremy Nadler, Nancy Patterson, Dr. Joanna Weaver, Dr. Allison Goedde
The transition to remote teaching in 2020 paved the way for the “new normal” of digital instructional delivery and for continued improvement in teacher education. Bowling Green State University's lesson study research team used online tools to engage in the lesson study cycle. This presentation highlights the collaboration, its benefits, and its contributions to professional development practices.
Audience: Professional developers, Teacher education/higher ed faculty, Curriculum/district specialists
Attendee devices: Devices useful
Attendee device specification: Smartphone: Android, iOS, Windows; Laptop: Chromebook, Mac, PC; Tablet: Android, iOS, Windows
Participant accounts, software and other materials: Zoom
Subject area: Higher education, Inservice teacher education
ISTE Standards: For Educators:
Additional detail: Preservice teacher presenter
The notion of technological pedagogical content knowledge (TPACK) as an approach to transform teaching was strengthened by a number of studies using structural equation statistical models, which illustrated how different subdomains of TPACK influenced each other to different degrees, with second-level subdomains such as technological content knowledge (TCK) and technological pedagogical knowledge (TPK) playing a much larger role than technological knowledge (TK), content knowledge (CK), and pedagogical knowledge (PK) in most cases (Dong et al., 2015; Koh et al., 2013; Pamuk et al., 2015). In these studies, the lack of direct influence of TK and PCK on TPACK suggested that preparing teachers to integrate technology involves more than simply developing their isolated technology skills. Furthermore, teachers’ access to technology does not equate to effective teaching, nor does it ensure that teachers will utilize the technology to enhance teacher capacity or support student learning (Voithofer and Nelson, 2020).
The research explored above correlates directly with our research question: During remote instruction, in what ways do digital tools promote the lesson study (LS) process and teacher educator and teacher candidate learning?
In this study, we extend the work of LS by examining a research team’s perception of the LS process. We focused on the theme of High Leverage Practices (HLPs), specifically Leading a Discussion, when the collaborative research, the planning, and the implementation of instruction all took place remotely. The five authors of this manuscript participated in LS research and designed this study as a qualitative look at the required use of digital and virtual tools in the LS process. We are teacher educators (TEs) with different content expertise from a Midwestern university in the United States of America (USA) who came together as a community of practice (CoP) to support one another in developing HLPs in the new online virtual setting, specifically the HLP Leading a Discussion.
We have different backgrounds, come from different areas of the USA, and teach different content areas: Integrated Language Arts, Mathematics, Technology Education, Career Tech Workforce Education, and Integrated Social Studies (listed to align with author order). We all teach in the same college and department, and our teacher education class sizes range from 18 to 38 students, who are 77% white, 8% black, 1% Asian, and 14% other. We believe our diversity and expertise are beneficial for professional development around the difficult work of implementing HLPs in an online setting.

The researchers of this qualitative case study collected data on each facet of the LS process. In our context, this consisted of three sets of meeting notes from the study and planning phases, lesson plan artifacts from the planning phase, student products and digital field notes from the teaching phase, reflection notes from the individual and team reflections, and research notes from video analysis of the Zoom meetings across all four phases of the LS process. The digital field notes were taken synchronously during classroom discussions and breakout sessions. Reflective notes were taken in Google Docs after the conclusion of class. Video recordings were uploaded to GoReact for analysis, and additional field notes were taken during reflective debriefing among LS team members.
This qualitative case study (Stake, 1995; Yin, 2014) uses participatory action research (Kemmis and McTaggart, 2000) and data triangulation to verify the findings. Multiple sources of data were collected, and during the debriefing meetings and follow-up discussions, member checking occurred to verify the interpretation of the data. Furthermore, because the participants were the researchers, they were involved in all phases of this study, from the design of the project to its conclusion. Members cross-checked the themes and patterns from the data collected and used an inductive, open-coding approach drawn from grounded theory (Kolb, 2012; Strauss and Corbin, 2008). While coding the data, members used a systematic procedure (Hatch, 2002), modified for this study:
1. Researchers independently read all of the data, then co-identified frames of analysis related to the research question.
2. Themes were created based on semantic relationships within the frames of analysis.
3. Notable themes were identified and assigned a code.
4. Researchers reread the data, clarifying notable themes.
5. Researchers agreed upon which themes were supported by the data and identified examples of non-fit or counterarguments.
The findings were organized around the four themes that emerged from the data: (a) digital tools promote learning, (b) digital tools promote discussion, (c) digital tools limit instruction, and (d) digital tools expedite debriefing.
Digital Tools Promote Learning
Based on the data collected and analyzed, it was apparent that the technology used during this online LS experience impacted learning for both the candidates and the teacher educators, supporting elements of the TPACK model: pedagogical, technological, and content knowledge. For example, Nancy, the instructor and research member, stated, “[The students were] thinking about race and racism not technology--it is the technology that facilitated breakthroughs, and the discussion format and documents showed students how long [racism] was going on.” Furthermore, Nancy said that the source analysis tool chosen was used effectively by the teacher candidates. She stated that they enjoyed using source analysis, found it effective during discussion, and felt that the digital format was natural because they are native technology users. According to Nancy, the teacher candidates also learned how to use a Google Doc template to structure individual and/or group notes. Even when she initially misunderstood the implementation process, the candidates adopted the practice seamlessly.
Not only were the teacher candidates learning content through digital tools, but we also learned to facilitate the content through the integration of technological platforms during the online LS process. Ultimately, this experience provided an opportunity for us to model best practices in technology integration for teaching and learning.
Another example of facilitating an experience with TPACK was the integration of, and teacher candidate engagement through, Zoom chat. Candidates were encouraged to post comments and questions during instruction, enabling student voice. In addition to empowering teacher candidates through Zoom, Nancy learned how to use Padlet and said, “Padlet was seamless, and the number of students in each breakout room gave opportunity for individuals to contribute to the discussion.” She continued, “Through this process, I learned we didn’t have the right essential questions to create critical discussions focusing on the main topic. This is a positive because without the LS process through a digital platform, I wouldn’t have discovered that.” In other words, reflection on the essential question was prompted by the candidates’ Padlet postings; that instructional discovery would have been missed in a face-to-face context.
Allison said that she was able to use a Google Sheet effectively to identify and mark off the elements of the HLP Leading a Discussion that were covered during class. The Google Sheet was shared with LS team members and provided a mechanism for the observers to recognize and remind themselves of HLP practices that were potentially being observed by the team. These templates were accessible online and used synchronously with the lesson delivery. As a result, we learned how to use the online observation tools to provide more thorough, reflective feedback.
Digital Tools Promote Discussion
The digital tools used during the LS research lesson promoted learning and discussion among teacher candidates. Padlet was a platform that enabled them to record and share discussion highlights, as well as provided evidence of their productive discussions. All four observers mentioned the effective use of Padlet to report the highlights of group discussions. Joanna stated, “There was a student recorder in each group who noted key content and thoughts using the Google Doc template structured with the images to compare and contrast for discussion.” Gabriel, Allison, and Jeremy confirmed this, and Allison continued, “At the end of class, candidates were asked to submit a value statement in the Zoom chat space prior to leaving the session, and the recording and representing of key content of discussion was through the use of notes on a templated document preserved for post-class review by Nancy.”
Jeremy and Gabriel did an analysis and said, “They [teacher candidates] couldn’t stop talking and the images sparked conversation--The majority of the content being discussed was on task, and they kept noticing things [in the images], and the recorder typed and talked simultaneously.” One researcher commented of the groups they observed that “every single person contributed and what was discussed was in-depth and meaningful.”
Critical conversation seemed to ensue because of the technology. For example, Allison stated that “conversation has changed because of technology. In contrast, the students were able to enlarge the images, and they could look up the picture and know the context. If they had been face-to-face, they wouldn’t have zoomed in on the picture.” Nancy and Joanna also mentioned the students closely examining the pictures, finding policemen in the pictures, and noting the organization and disorganization of the protesting groups. Joanna stated, “This close analysis provided evidence of the candidates’ deeper understanding and learning that was taking place.”
In addition to observing teacher candidates’ analysis of the organization of the protest groups, Joanna mentioned the media platforms used by students that promoted discussion. She stated:
A variety of media platforms were provided to the students to respond to the essential question and compare and contrast images in lessons 2 and 3. Candidates were able to discuss how the images were in response to limited rights or supporting rights. The candidates went into their breakout groups, responded to each other with prompts candidates needed to address, and then put it on the Padlet template. During this time, they could enlarge the photos, notice the nuances within the images, and ask questions of each other for clarification.
Jeremy confirmed Joanna’s statement: “Zoom (synchronously) was used for all aspects of the discussion, and the Canvas modules were accessed throughout the morning. An open-ended question was posed prior to each of the two breakout sessions, and posted in the chat, effectively framing the discussions.”
Digital Tools Limit Instruction
Time spent figuring out the technology was an issue during the lesson. Joanna stated, “Initially it took about 6 minutes to get the technology all set up, and then someone kept getting kicked out, so there was another minute and a half gone. Once the group got going, it was amazing. But it took at least 10 minutes to get going.” Gabriel agreed, saying, “Some of the time was lost in setting up as both the teacher and students had to facilitate technology to get to the material and begin thinking and analyzing the images.” The issues the teacher candidates faced here were technology-related, and the researchers agreed that in a face-to-face setting with print materials, the groups would have engaged with the work more quickly.
Even when some groups were on task with the content of the discussion, it was difficult for them to reach a deeper level, possibly due to their comfort level in their group or online. Allison mentioned: “Getting them to a deeper level was not observed. I don’t know if we should structure subgroups to break it out timewise; they just looked at the images--compare and contrast the demographics of demonstrators and maybe we need to prompt them to go deeper--high school level you would have to prompt them.” Gabriel added, “The students were also asked to comment about them, but that fell really flat,” and Joanna responded, “In my second group, the female student remained silent until the end while the males dominated the discussion. She clearly didn’t seem comfortable responding in this group. My first group had a different female student who could not get into the group due to technology issues, which prevented her from contributing to the group discussion.”
While the members reflected on the discussions, Allison observed, “Candidates struggled with image context, identifying details of historical significance that related to the time period, and they struggled with connecting past image/present image with the emotions of today's protests. They did not appear to be able to discuss how they feel. They had an off-task discussion of COVID spread among the student population, with concerns that peers are not taking responsibility to quarantine and communicate about health concerns.” Jeremy added, “Group six struggled to engage in meaningful discussion, evidenced by long periods of silence. If not for direct engagement from the recorder [a teacher candidate in their group] by posing questions and probing responses, little if any discussion likely would’ve occurred.”
Therefore, some of the discussions were hindered by technology issues. Jeremy stated, “Some students didn’t have the ability to zoom in--some couldn’t see the images. We could have a conversation about equity and equality because they don’t have the same technology.” Joanna reiterated his point:
The areas of struggle included a candidate who kept getting thrown out of the Zoom meeting and couldn't engage in the discussion until the end of the group session. That did detract from the group conversation as the candidates shared their condolences with the
Digital Tools Expedite Debriefing
Based on the data we collected online through observation notes, checklists, and reflections, the debriefing allowed an expedited analysis because all researchers had access to all the synchronous materials immediately upon completion of the lesson. In addition, a deeper analysis of the lesson, instruction, and student learning took place because researchers could revisit the lesson and breakout room discussions through the video platform, GoReact. Furthermore, the online format expedited the debriefing because the researchers could connect synchronously via Zoom and analyze asynchronously, rather than having to find a time and place to meet in person to discuss and reflect in a more traditional setting.
During step 4 of the LS, the debriefing session resulted in an online wishlist of revisions to the lesson, and additional questions developed among our team. Our LS team's reflection was pre-organized around four questions related to the LS research question, created in Google Docs. Following individual reflection in this template, we discussed our reflections together. Next, we allowed authentic sharing through a stream of consciousness, with deep analysis of the LS process and of what we were experiencing and observing during the lesson. For example, we asked questions and followed with possible solutions: Based on the students who had the presence of mind to do so, how do we facilitate all students’ application of the necessary technology skills to access the images central to the discussion, with the expectation that they will zoom in on the details?
After reflecting on the question, we responded that we need to be sure that all students can manipulate the pictures to make them larger and that the links are more accessible instead of having to locate them. Nancy stated that one possible solution might be to collate the images into one digital asset that may be acquired from within the Canvas course shell. Students need to be able to see all the images and the analysis in one place in order to corroborate. Allison added that by doing this, we can encourage more students to engage in the use of technology, which will further promote discussion and learning. This modification may also help students focus on discussing their individual thoughts on the topic rather than recording what they perceived to be the “right answer” or “what the instructor wants.” Jeremy added that perhaps selecting and sequencing some groups to share thoughts about particular images would help to facilitate a larger group discussion. Additional suggestions were shared, including that in the preparation portion of the unit, we could tell students to use empathy, that all have a voice, that all should feel safe, and that there are no right or wrong answers.
Conversely, Nancy added that maybe we cannot expect students to be discussion makers but expect them to dig deeper at another time. Another possibility offered by Gabriel was that maybe we needed to provide more time for analysis, but that created another question: How much time do we give students in total to accomplish the analysis? What is clear is that students need additional time to think about and grapple with the images. Through this debriefing and analysis, we processed our next steps and continued revising the lesson.
The findings and debriefing provided an answer to the research question. Digital tools used to implement the online lesson study process were effective and changed how online instructional planning can be researched, analyzed, and written collaboratively by an LS research team focused on student learning. The online process promoted collaborative lesson study and learning. The study affirmed research showing that LS reduces teacher isolation, builds a collaborative community of teachers who strive for positive student outcomes, and increases instructional motivation (Chang, 2009; Lewis, 2002; Perry and Lewis, 2008; Lewis and Perry, 2015; Stokes et al., 2019; Uchiyama and Radin, 2009), and extended that research by aligning with and including the remote, synchronous LS process using best practices of online learning.
No matter the distance, educators across schools, universities, or districts can integrate online LS into remote teacher education programs and remote courses while engaging students in learning using additional digital tools that promote engagement, peer interaction, and student voice. There were technical and timing issues along the way, but the synchronous observation process worked well. The instructor could move the researchers in and out of the breakout rooms being observed for student learning, and the links to access documents were successfully retrieved by the research team and teacher candidates.
We contend that a remote, synchronous instructional delivery format of LS has the potential to augment the impacts of a traditional, face-to-face high-impact practice. Ultimately, evidence gathered in this study supports the notion that, through the integration of digital tools in the virtual lesson study process, there was a high degree of rigor and relevance during a period in which our face-to-face interactions were limited due to the coronavirus pandemic.
References

Chang, M. L. (2009). An appraisal perspective of teacher burnout: Examining the emotional work of teachers. Educational Psychology Review, Vol. 21 No. 3, pp. 193–218.
Cooper, S., Wilkerson, T., Eddy, C., Kamen, M., Marble, S., Junk, D., and Sawyer, C. (2011). Lesson study among mathematics educators: Professional collaboration enabled through a virtual faculty learning community. Learning Communities Journal, Vol. 3, pp. 21–40.
Corbin, J., and Strauss, A. (1990). Grounded theory research: Procedures, canons, and evaluative criteria. Qualitative Sociology, Vol. 13 No. 1, pp. 3–21.
Dong, Y., Chai, C. S., Sang, G.-Y., Koh, J. H. L., and Tsai, C. C. (2015). Exploring the profiles and interplays of pre-service and in-service teachers’ technological pedagogical content knowledge (TPACK) in China. Educational Technology and Society, Vol. 18 No. 1, pp. 158–169.
Foulger, T. S., Graziano, K. J., Schmidt-Crawford, D., and Slykhuis, D. A. (2017). Teacher educator technology competencies. Journal of Technology and Teacher Education, Vol. 25 No. 4, pp. 413–448.
Hatch, J. A. (2002). Doing qualitative research in educational settings. Albany: State University of New York Press.
Henning, J. E. (2005). Leading discussions: Opening up the conversation. College Teaching, Vol. 53 No. 3, pp. 90–94. DOI: 10.3200/CTCH.53.3.90-94
Hoadley, C. (2014). What is a community of practice and how can we support it? In D. Jonassen and S. Land (Eds.), Theoretical foundations of learning environments, pp. 287–300. New York: Routledge.
Kemmis, S., and McTaggart, R. (2000). Participatory action research. In N. K. Denzin and Y. S. Lincoln (Eds.), Handbook of qualitative research (2nd ed., pp. 567–605). Thousand Oaks, CA: Sage.
Koh, J., Chai, C., and Tsai, C.-C. (2013). Examining practicing teachers’ perceptions of technological pedagogical content knowledge (TPACK) pathways: A structural equation modeling approach. Instructional Science, Vol. 41 No. 4, pp. 793–809. https://doi.org/10.1007/s11251-012-9249-y
Kolb, S. (2012). Grounded theory and the constant comparative method: Valid research strategies for educators. Journal of Emerging Trends in Educational Research and Policy Studies, Vol. 3 No. 1, pp. 83–86.
Lave, J., and Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press.
Lewis, C. (2002). Lesson study: A handbook of teacher-led instructional change. Philadelphia: Research for Better Schools.
Lewis, C., and Hurd, J. (2011). Lesson study step by step: How teacher learning communities improve instruction. Portsmouth, NH: Heinemann.
Lewis, C., and Perry, R. (2015). A randomized trial of lesson study with mathematical resource kits: Analysis of impact on teachers’ beliefs and learning community. In E. Cai and Middleton (Eds.), Design, results, and implications of large-scale studies in mathematics education, pp. 133–155. New York: Springer.
McDonald, J., and Cater-Steel, A. (Eds.). (2017). Implementing communities of practice in higher education: Dreamers and schemers. Singapore: Springer.
Pamuk, S., Ergun, M., Cakir, R., Yilmaz, H. B., and Ayas, C. (2015). Exploring relationships among TPACK components and development of the TPACK instrument. Education and Information Technologies, Vol. 20 No. 2, pp. 241–263. https://doi.org/10.1007/s10639-013-9278-4
Perry, R., and Lewis, C. (2008). What is successful adaptation of lesson study in the US? Journal of Educational Change, Vol. 10 No. 4, pp. 365–391.
Prestera, G. E., and Moller, L. A. (2001). Exploiting opportunities for knowledge-building in asynchronous distance learning environments. Quarterly Review of Distance Education, Vol. 2 No. 2, pp. 93–104.
Shrivastava, P. (1999). Management classes as online learning communities. Journal of Management Education, Vol. 23 No. 3, pp. 691–702.
Soto, M., Gupta, D., Dick, L., and Appelgate, M. (2019). Bridging distances: Professional development for higher education faculty through technology facilitated lesson study. Journal of University Teaching and Learning Practice, Vol. 16 No. 3.
Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage.
Stokes, L. R. E., Suh, J. M., and Curby, T. W. (2019). Examining the nature of teacher support during different iterations and modalities of lesson study implementation. Professional Development in Education. DOI: 10.1080/19415257.2019.1634623
Strauss, A., and Corbin, J. (2008). Basics of qualitative research: Grounded theory procedures and techniques (3rd ed.). Newbury Park, CA: Sage.
Uchiyama, K. P., and Radin, J. L. (2009). Curriculum mapping in higher education: A vehicle for collaboration. Innovative Higher Education, Vol. 33 No. 4, pp. 271–280.
Voithofer, R., and Nelson, M. J. (2020). Teacher educator technology integration preparation practices around TPACK in the United States. Journal of Teacher Education, Preprints.
Wenger, E., McDermott, R. A., and Snyder, W. (2002). Cultivating communities of practice: A guide to managing knowledge. Boston, MA: Harvard Business Press.
Yin, R. K. (2014). Case study research (5th ed.). Thousand Oaks, CA: Sage.