Evaluation of Edtech for Struggling and Non-Proficient Readers in Middle School
Listen and learn: Research paper
Friday, December 4, 11:30 am–12:15 pm PST (Pacific Standard Time)
Presentation 3 of 3
Dr. Lisa Hurwitz and Paul Macaruso
We present two studies evaluating the Lexia PowerUp Literacy educational technology program. PowerUp was designed to support nonproficient and struggling readers at the secondary level. In both studies, PowerUp contributed to significant gains on standardized ELA assessments.
Audience: Chief technology officers/superintendents/school board members, curriculum/district specialists, principals/head teachers
Attendee devices: Devices not needed
Topic: Distance, online & blended learning
Subject area: Language arts
ISTE Standards: For Educators
Additional detail: Session recorded for video-on-demand
Disclosure: The submitter of this session has been supported by a company whose product is being included in the session.
Influencer disclosure: This session includes a presenter who indicated a “material connection” to a brand, including a personal, family, employment, or financial relationship. See individual speaker menu for disclosure information.
By the end of 8th grade, over 60% of students in the United States fail to meet reading proficiency standards set forth by the US Department of Education and Institute of Education Sciences (NAEP, 2018). These English Language Arts (ELA) gaps have negative downstream effects on students’ performance across content areas (including social studies, science, etc.), which require students to read informational texts such as textbooks and produce written essays (Schiefele et al., 2012).
The Simple View of Reading (Gough & Tunmer, 1986) served as a theoretical framework for the current studies. This theory proposes that reading difficulties result from deficits in decoding (the ability to efficiently map letters onto sounds) and/or language comprehension (e.g., academic vocabulary, background knowledge, grammatical awareness, listening skills). Experts have identified a need to better support students in gaining these skills, particularly language comprehension (Hogan et al., 2014).
Past research has shown that well-designed interventions using educational technology can help struggling readers advance their literacy skills (e.g., Lenhard et al., 2013; Macaruso & Rodman, 2009; Potocki et al.; for review, see Cheung & Slavin, 2012). Moreover, such technology can be highly appealing to students (Chen, 1980) and teachers (Kim et al., 2006) alike. However, not all educational technology products succeed in their mission to teach struggling readers (Strong et al., 2011). For technology to be effective, it must not only engage students but also build on established learning science theories (Hirsh-Pasek et al., 2015).
To that end, PowerUp was created. PowerUp is intended to strengthen core English language arts instruction for a wide range of struggling and non-proficient readers in grades 6 and above. Aligned with the Simple View of Reading, PowerUp promotes a) decoding through a Word Study strand; b) word function and sentence structure through a Grammar strand; and c) listening and reading comprehension of increasingly complex texts through a Comprehension strand.
An auto-placement tool determines the appropriate starting level within each of these strands, and the online program continuously adapts based on student responses, adding extra scaffolding and explicit instruction when necessary (as recommended by Roberts et al., 2016). As such, the program provides a form of personalized learning and facilitates differentiated instruction. Based on their placement within the program, PowerUp recommends how many minutes students should complete within each instructional strand each week and allows students to track their own progress toward meeting these goals. Students can choose which strand to engage in each session. Student choice and game-like elements (such as winning “streaks”, polls, and timed tasks) foster autonomy, competence, and relatedness (Ryan, Rigby, & Przybylski, 2006), which motivate and engage struggling and non-proficient adolescent readers. PowerUp also recommends offline lessons and activities for teachers to deliver, further supporting students’ instruction with in-person social scaffolding (Hirsh-Pasek et al., 2015). The present studies aimed to assess PowerUp’s effectiveness.
Study 1 took place at a predominantly White middle school in rural Ohio. This is the only middle school in a small district with two elementary schools and one high school. The district receives approximately $300,000 in Title I funding.
There were 33 students in 8th grade included in the present analyses based on meeting the following criteria: they scored in the non-proficient range on the Ohio State ELA test at the end of 7th grade (Spring 2017), and they completed the Ohio State ELA test at the end of 8th grade (Spring 2018).
All students had access to PowerUp throughout the 2017-2018 school year. They used PowerUp for an average of 11.30 weeks (SD = 4.65).
The Ohio State ELA test for both 7th and 8th grade students assessed proficiency in the following domains: Reading for Information, Reading for Literature, and Writing.
Study 2 took place in a mid-sized suburban school district enrolling 4,582 K-12 students. The state had identified 62% of all students in this district as being economically disadvantaged, and all K-8 schools within the district receive school-wide Title I support. At the middle school level, 70% of students were reading below grade level at the start of the research study according to district administrator reports.
The district nominated 10 teachers from their two middle schools to participate in the study. Each teacher instructed a homeroom class for students identified as in need of Tier 2 reading support. Each class contained a mix of students in grades 6-8.
The analyses in Study 2 were based on 155 students who completed the district’s literacy progress monitoring assessment in the Fall and Spring of the study year. Students in this study were racially-ethnically diverse. Forty-eight percent were identified as Black, 44% White, 3% Latinx, and 5% multi-ethnic.
After students had taken their fall ELA progress monitoring assessment, 7 teachers with 105 students were randomly assigned to use PowerUp, and 3 teachers with 50 students were assigned to a control group that continued to deliver instruction through the traditional curriculum. The district requested that more students be placed in PowerUp classes than in traditional classes; it had historically struggled to accelerate students’ reading proficiency and was eager to try PowerUp with as many students as possible.
On average, students in the PowerUp treatment group used the program for 16.60 weeks (SD = 3.33) during the 2018-2019 school year.
Literacy achievement was tested in the fall and spring with the STAR Reading™ assessment (STAR), a computer-adaptive test that students typically complete in about 15 minutes. It measures achievement in word knowledge and skills, comprehension strategies and constructing meaning, analyzing literary text, understanding authors’ craft, and analyzing argument and evaluating text.
We compared students’ scaled scores on the state ELA test before and after they gained access to PowerUp. Scores increased significantly, t(32) = 2.20, p = .035, partial η² = .123. Eleven of the 33 students improved enough to score within the test’s “proficient” range by the end of the year.
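As a sanity check on the reported inferential statistics (not part of the original analyses), the two-tailed p-value implied by t(32) = 2.20 can be recovered numerically from the Student-t density. A minimal sketch using only the Python standard library:

```python
import math

def t_two_tailed_p(t_stat: float, df: int, grid: int = 20_000, upper: float = 60.0) -> float:
    """Two-tailed p-value for a t statistic, via trapezoidal
    integration of the Student-t density over the upper tail."""
    # Normalizing constant of the t density with `df` degrees of freedom
    c = math.gamma((df + 1) / 2) / (math.sqrt(df * math.pi) * math.gamma(df / 2))
    pdf = lambda x: c * (1.0 + x * x / df) ** (-(df + 1) / 2)
    a = abs(t_stat)
    h = (upper - a) / grid
    tail = h * (pdf(a) / 2 + sum(pdf(a + i * h) for i in range(1, grid)) + pdf(upper) / 2)
    return 2.0 * tail  # the density is symmetric, so double the one-sided tail

# Reported Study 1 result: t(32) = 2.20
print(t_two_tailed_p(2.20, 32))  # close to the reported p = .035
```

The `upper = 60.0` cutoff is a convenience assumption; the t density with 32 degrees of freedom is vanishingly small beyond it, so the truncated integral is accurate to far more digits than needed here.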
In the Fall, students in the PowerUp and traditional curriculum classes earned similar scores on STAR (M = 460.03 for PowerUp students and M = 456.79 for traditional curriculum students). In the Spring, however, PowerUp students scored about 40 points higher on STAR than students in the control group (M = 485.51 for PowerUp students and M = 445.22 for traditional curriculum students), Cohen’s d = .27. That is equivalent to a roughly 10-percentile-point difference between the PowerUp and control conditions (What Works Clearinghouse, 2014). Results from a multilevel model indicate that this effect was statistically significant after accounting for the nested structure of the dataset (students nested within classes) and controlling for a series of dichotomized, grand-mean-centered demographic covariates, γ₀₁ = 64.28, SE = 26.05, p = .039.
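The effect size and the percentile claim can be tied together. Assuming Cohen’s d was computed as the spring mean difference divided by a pooled standard deviation (an assumption; the original analysis details are not given here), and applying the What Works Clearinghouse improvement-index convention (normal CDF of d, minus .50), the “roughly 10 percentile” figure follows:

```python
from statistics import NormalDist

# Reported Study 2 spring means (STAR scaled scores)
mean_powerup, mean_control = 485.51, 445.22
d = 0.27  # reported Cohen's d

# Implied pooled SD, under the assumption d = mean difference / pooled SD
pooled_sd = (mean_powerup - mean_control) / d  # roughly 149 scaled-score points

# WWC improvement index: expected percentile rank of an average treatment
# student within the control distribution, minus the 50th percentile
improvement_pct = (NormalDist().cdf(d) - 0.5) * 100
print(round(improvement_pct, 1))  # 10.6 percentile points, i.e. "roughly 10"
```

This is a back-of-the-envelope consistency check, not a reproduction of the multilevel model reported above.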
The combined results of these studies suggest that a technology-based intervention can support the literacy growth of middle school readers. In particular, the findings of Study 2, a randomized controlled study in which struggling and non-proficient readers in grades 6 through 8 made significantly greater gains on a standardized ELA assessment than students receiving traditional instruction, provide strong evidence for this claim. In addition, outcomes in Study 2 show that readers from diverse racial-ethnic backgrounds benefit from the intervention program. These findings support the view that educational technology programs can be effective, provided that the technology is based on strong underlying learning science theory (Hirsh-Pasek et al., 2015).
Future research should continue assessing the effectiveness of PowerUp in different contexts and try to disentangle the program elements most directly linked to effects. For example, it is unclear from the present research whether the program’s built-in game-like features and/or the administration of offline lessons contributed to students’ success. Given the encouraging results presented herein and the importance of equipping students with strong literacy skills, such future research is warranted.
Chen, M. (1980). Television, science, and children: Formative evaluation for 3-2-1 Contact. Journal of Educational Technology Systems, 9, 261-276.
Cheung, A. C., & Slavin, R. E. (2012). How features of educational technology applications affect student reading outcomes: A meta-analysis. Educational Research Review, 7, 198-215.
Fearn, L., & Farnan, N. (2007). When is a verb? Using functional grammar to teach writing. Journal of Basic Writing, 63-87.
Gough, P. B., & Tunmer, W. E. (1986). Decoding, reading, and reading disability. Remedial and Special Education, 7, 6-10.
Hirsh-Pasek, K., Zosh, J. M., Golinkoff, R. M., Gray, J. H., Robb, M. B., & Kaufman, J. (2015). Putting education in “educational” apps: Lessons from the science of learning. Psychological Science in the Public Interest, 16, 3-34. doi:10.1177/1529100615569721
Hogan, T.P., Adlof, S.M., & Alonzo, C. (2014). On the importance of listening comprehension. International Journal of Speech-Language Pathology, 16, 199–207.
Kim, A. H., Vaughn, S., Klingner, J. K., Woodruff, A. L., Klein Reutebuch, C., & Kouzekanani, K. (2006). Improving the reading comprehension of middle school students with disabilities through computer-assisted collaborative strategic reading. Remedial and Special Education, 27, 235-249.
Lenhard, W., Baier, H., Endlich, D., Schneider, W., & Hoffmann, J. (2013). Rethinking strategy instruction: Direct reading strategy instruction versus computer-based guided practice. Journal of Research in Reading, 36, 223-240.
Macaruso, P., & Rodman, A. (2009). Benefits of computer-assisted instruction for struggling readers in middle school. European Journal of Special Needs Education, 24, 103-113.
National Assessment of Educational Progress (NAEP). (2018). NAEP Reading Report Card. Retrieved from https://www.nationsreportcard.gov/reading_2017/#nation/scores?grade=8
Roberts, J. D., Chung, G. K., & Parks, C. B. (2016). Supporting children’s progress through the PBS KIDS learning analytics platform. Journal of Children and Media, 10, 257-266.
Ryan, R. M., Rigby, C. S., & Przybylski, A. (2006). The motivational pull of video games: A self-determination theory approach. Motivation and Emotion, 30, 344-360.
Schiefele, U., Schaffner, E., Möller, J., & Wigfield, A. (2012). Dimensions of reading motivation and their relation to reading behavior and competence. Reading Research Quarterly, 47, 427-463. doi:10.1002/RRQ.030
Strong, G. K., Torgerson, C. J., Torgerson, D., & Hulme, C. (2011). A systematic meta‐analytic review of evidence for the effectiveness of the Fast ForWord language intervention program. Journal of Child Psychology and Psychiatry, 52, 224-235.
What Works Clearinghouse. (2014). Procedures and standards handbook (3rd ed.). Washington, DC: What Works Clearinghouse, Institute of Education Sciences, US Department of Education.