Increasing Engagement Through Virtual Reality & UDL for Students With Learning Disabilities
Participate and share: Poster
Dr. Jessica Hunt, Rohana Swihart, Michelle Patterson, Tahnee Wilder, Dr. Matthew Marino
Universal Design for Learning can enhance executive function performance for students with learning disabilities, including setting goals, planning and developing strategies, organizing information and resources, and monitoring progress. We will showcase practical strategies for enhancing executive function while providing multiple means of engagement, representation, and expression.
Audience: Teachers, Curriculum/district specialists, Technology coordinators/facilitators
Attendee devices: Devices not needed
Participant accounts, software and other materials: Presentation information will be linked at http://www.modelmemath.org/
Topic: Games for learning & gamification
Subject area: Math, Special education
ISTE Standards: For Educators:
Additional detail: Graduate student
Universal Design for Learning is a framework for the design and implementation of efficacious instructional materials. The instructor identifies learner variability before class begins and proactively circumvents barriers that inhibit learning (see Figure 2). Instruction is guided by three principles: (a) multiple means of engagement (i.e., considering how to engage students in multiple ways), (b) multiple means of representation (i.e., providing content in multiple formats), and (c) multiple means of action and expression (i.e., providing opportunities for students to demonstrate their understanding in multiple ways to members of the students' community, who can provide meaningful feedback about the most appropriate course of action).
High-Quality Research Design
We will employ a pretest > intervention > posttest design for a nine-week pilot test of the curriculum during year three of the project. Classrooms will be randomly assigned to either the intervention or the control group using matched pairs based on classroom mean scores from students' prior-year state mathematics test scores. Students will be matched with peers within their state. Matching ensures students are comparable across intervention conditions on relevant characteristics.
For our analyses, we will compare group differences in engagement, conceptual understanding, and STEM interest at pre-, mid-, and posttest, which will require a repeated measures MANOVA with a between-within subjects interaction. Based on Cohen's13 recommendation for a medium effect size of f = .25, with α and power set at .05 and .80, respectively, and oversampling by 20%, we will recruit at least 136 students (68 per state; 34 per condition in each state).
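As a rough illustration of the sample-size arithmetic, the two-group comparison can be sketched with a normal approximation. This is a simplified stand-in for the full repeated measures MANOVA power analysis (for two groups, Cohen's f = .25 corresponds to d = .50); it will not exactly reproduce the 136-student target, which also reflects the repeated measures design and per-state stratification.

```python
import math
from statistics import NormalDist

def n_per_group(d: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate per-group n for a two-group mean comparison (normal approximation)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-tailed critical value
    z_beta = z.inv_cdf(power)           # quantile for the desired power
    return math.ceil(2 * ((z_alpha + z_beta) / d) ** 2)

# Cohen's f = .25 corresponds to d = 2f = .50 when comparing two groups
n = n_per_group(d=0.50)
recruit = math.ceil(n * 1.2)  # add 20% oversampling per group
```

Under this approximation, 63 students per group (126 total) are needed before the 20% oversampling.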
Testing will occur with approximately 136 students in grades 4-6 across two locations (North Carolina and Florida). Test sites will include students with LD, defined as "a disorder in one or more of the basic psychological processes involved in understanding or in using language, spoken or written, that may manifest itself in the imperfect ability to listen, think, speak, read, write, spell, or to do mathematical calculations, including conditions such as perceptual disabilities, brain injury, minimal brain dysfunction, dyslexia, and developmental aphasia. Specific learning disability does not include learning problems that are primarily the result of visual, hearing, or motor disabilities, of intellectual disability, of emotional disturbance, or of environmental, cultural, or economic disadvantage" (IDEA, Sec. 300.8(c)(10)). The intervention will occur in tier 2 classroom settings, which commonly include 5-10 students with one teacher. The classrooms will include English language learners and students from diverse backgrounds.
We will utilize a combination of measures to address the research questions. Each measure is described and justified below.
Engagement survey. The Engagement in Science Learning Activities12 survey was designed for 10- to 14-year-old students to complete immediately after an in-class or informal activity. It measures a student's cognitive, behavioral, and affective engagement. Participants respond on a Likert-type scale ranging from 1 (YES!) to 4 (NO!). "Because no particular assumptions are made about a task structure other than there is a particular task that should have been completed" (p. 1), valid inferences can be made regarding overall mathematics engagement during an activity using responses across all of the items. Both Cronbach's α and the polychoric α coefficients yielded acceptable reliability when using all eight scale items (0.80 and 0.85, respectively). Exploratory factor analysis from an SEM bifactor model yielded a satisfactory fit (CFI = 0.992; TLI = 0.982; RMSEA = 0.069). These analyses suggest that average scores from the survey can be treated as continuous dependent or independent variables for t-tests, ANOVAs, and regression-type analyses.
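For transparency, internal consistency for the eight engagement items can be recomputed directly from raw responses. The sketch below applies the standard Cronbach's α formula to hypothetical 1-4 Likert data (the response values are invented for illustration only).

```python
def cronbach_alpha(items):
    """Cronbach's alpha from per-item response lists (one list per scale item)."""
    k = len(items)
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(resp) for resp in zip(*items)]  # each respondent's total score
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical 1-4 Likert responses from four students on three items
example = [[1, 2, 3, 4], [2, 2, 3, 4], [1, 3, 3, 4]]
alpha = cronbach_alpha(example)
```

In practice all eight items would be passed in; values of roughly 0.80 or above are conventionally treated as acceptable.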
Conceptual understanding. We will triangulate three measures to address research question 2. Conceptual understanding will be measured before, midway, and after the intervention using the Test of Fraction Schemes, a clinical interview, and problem solutions generated during the games.
Curriculum-based measure. The Fraction Schemes test is a proximal measure of the effects of the intervention on students' fraction conceptions. It consists of 12 items, four for each of the following fraction concepts: the unit fraction concept, the partitive fraction concept, and the splitting operation that underlies the recursive fraction and iterative fraction concepts. Internal consistency reliability for the test was reported as 0.70 for the unit fraction concept and splitting. Criterion-related validity against an associated clinical interview score was reported with coefficients of 0.52 (p < 0.01) for splitting and 0.58 (p < 0.01) for the unit fraction concept. Following Wilkins et al., test scores will be supplemented by student interviews with an additional six items to assess the nature of students' pre-post fraction concepts.
STEM and ICT interests. Because the ultimate purpose of the intervention is to increase students' STEM and ICT interests, we will use the Upper Elementary School (4-5) and Middle/High School (6-12) Student Attitudes Toward STEM (S-STEM) surveys (Friday Institute for Educational Innovation19), supplemented by semi-structured focus group interviews with a subsample of 40 students (i.e., 20 ModelME and 20 BAU), to measure changes in students' self-reported STEM interests before and after the intervention. The S-STEM was developed as part of an NSF-funded research program and measures students' confidence and self-efficacy in STEM subjects, 21st century learning skills, and interest in STEM careers. It contains 56 items across six constructs: math attitudes (8 items), science attitudes (9 items), engineering and technology attitudes (9 items), 21st century learning attitudes (11 items), interest in STEM career areas (12 items), and "About You" (7 items measuring short-term expectations for course success and exposure to STEM careers). Responses use a five-point Likert scale ranging from "strongly disagree" (1) to "strongly agree" (5); higher scores reflect more positive attitudes. Cronbach's α for the S-STEM ranged from 0.84 to 0.86 for the grade 4-5 subscales and 0.89 to 0.91 for the middle/high school subscales.
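Scoring the S-STEM reduces to averaging Likert responses within each construct. The sketch below is a hypothetical scoring routine; the item-index ranges match the construct sizes listed above but are illustrative, not the instrument's actual item order.

```python
# Hypothetical item-index map for the six S-STEM constructs.
# Range sizes (8, 9, 9, 11, 12, 7) match the construct sizes; the
# ordering is an assumption for illustration.
SUBSCALES = {
    "math_attitudes": range(0, 8),
    "science_attitudes": range(8, 17),
    "engineering_technology": range(17, 26),
    "c21_learning": range(26, 37),
    "stem_career_interest": range(37, 49),
    "about_you": range(49, 56),
}

def score_sstem(responses):
    """responses: 56 Likert values (1-5); returns the mean score per construct."""
    assert len(responses) == 56, "expected all 56 items"
    return {name: sum(responses[i] for i in idx) / len(idx)
            for name, idx in SUBSCALES.items()}
```

A student who answered "neither agree nor disagree" (3) throughout would receive 3.0 on every construct.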
Gameplay analytics. Click trails will be used to determine in-game engagement. Based on Dr. Marino's prior research, we will treat 60-second game intervals without activity as the indicator of disengagement. We recognize this time period may need to be adjusted given the change from science to mathematics content.
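In code, the 60-second inactivity rule amounts to scanning a sorted click trail for gaps larger than a threshold. A minimal sketch (timestamps in seconds; the threshold is a parameter so it can be retuned for mathematics gameplay):

```python
def inactivity_gaps(click_times, threshold=60.0):
    """Return (start, end) spans where consecutive clicks are more than
    `threshold` seconds apart.

    click_times: sorted timestamps (in seconds) of one student's in-game clicks.
    """
    gaps = []
    for prev, cur in zip(click_times, click_times[1:]):
        if cur - prev > threshold:  # no activity for longer than the threshold
            gaps.append((prev, cur))
    return gaps
```

For example, clicks at 0, 10, 95, and 100 seconds produce one 85-second inactivity span between 10 and 95.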
An identical procedure will be utilized at each research site. An implementation guide will be developed for the research team and teachers who will implement the simulation software with students.
ModelME Game-enhanced Curriculum
Research team members will participate in an implementation fidelity training. An interview protocol will be followed for all interviews. Students will play through the simulated careers in their tier 2 classrooms. A random username/number will be generated for each student. Teachers will enter demographic information about the students and then remove the students' actual names so no personally identifiable information is collected. Variables of interest include: (1) gender, (2) English language learner status (coded yes/no), (3) preference toward simulations (coded by the number of hours the student plays video games each week), (4) race, and (5) ethnicity.
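De-identification of this kind can be sketched with Python's `secrets` module: each student receives a random study ID, and the name-to-ID key is discarded once demographics are entered. The ID format below is an assumption for illustration.

```python
import secrets

def assign_study_ids(student_names):
    """Map each student to a random study ID (hypothetical 'S' + 6 digits format).

    The resulting name-to-ID mapping would be destroyed after teachers enter
    demographics, so no personally identifiable information is retained.
    """
    used = set()
    mapping = {}
    for name in student_names:
        sid = f"S{secrets.randbelow(10**6):06d}"
        while sid in used:  # regenerate on the (rare) collision
            sid = f"S{secrets.randbelow(10**6):06d}"
        used.add(sid)
        mapping[name] = sid
    return mapping
```

Because the IDs are random rather than derived from names, they cannot be reversed to recover a student's identity.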
Assessment procedure. Consenting students will take the measures of engagement, conceptual understanding, and STEM/ICT interests. Testing will occur outside the students' classrooms in a convenient location (e.g., the school media center). A member of the research team will administer and score all pretest measures. Measures will be given to students as a group; the engagement and interest surveys will be administered in one 50-minute class period, and the fractions tests will be administered in three 50-minute class periods. Testing days will be consecutive. Research team members will be trained to tell students that they will answer questions about their perspectives on and interest in STEM/ICT and about their fraction knowledge, and to do their best. The research team will give no other direction. Items will be read aloud to students who require assistance. Posttest procedures will mirror pretest procedures.
Instruction procedure-Intervention. The intervention will run for nine weeks across the research sites for the experimental group. Nine weeks is considered the minimum best-practice quality indicator for technology-based interventions.
Teacher training and fidelity measures. After the beta build, we will recruit teachers who deliver tier 2 instruction and provide professional development related to the curriculum during 3-hour sessions over 4 days. Day 1 will open with the purpose of the study, the logic model, and the target population. Teachers will become familiar with the design of the gaming environment, its features, and the overall learning trajectory that grounds the task sequence and challenges. Over the next 2 days, each task type and its supported student actions will serve as structural guides designed to help teachers deepen their understanding of how students might engage in the actions necessary for cognitive advancement. Video from previous studies and product field testing will be used to illustrate cognitive prompting tailored to student actions across the tasks. Teachers will work with each other to discern the details and nuances apparent in students' activity and the prompts used by the researcher to support noticing and reflecting actions. Teachers will engage in similar analysis of student conversations during social rehearsal and the talk moves used by the teacher to support student interaction.
On the 4th day, teachers will practice using the instruction protocol through role playing. Teachers will form small groups with peers who will act as students, hypothetically engage in the tasks, and provide mock student actions and game prompts from the career coach. Teachers will continue to observe details and nuances apparent in students' activity and prompts. They will then engage the group in social rehearsal and use talk moves taught during the training to facilitate student conversation. They will also practice evaluating whether or not the gaming system should move an intervention group forward to the next task type or continue with the current challenge, based on mock student gameplay and post-challenge briefs. We will videotape this day to support initial fidelity checks.
Researchers will use the previous 2 days of hypothetical engagement, student actions and explanations, and group social rehearsals to check initial fidelity. The checklist will be based on an existing checklist developed during PI Hunt’s CAREER program and will be refined for ModelME. The checklist currently contains 10 items based on adherence to the steps of the instruction protocol. As the program is implemented, a researcher at each site will use the checklist during 4 random fidelity checks on each interventionist (2 for each condition) and measure their integrity to the intervention.
We anticipate approximately 1.5 weeks of gameplay for each of the game levels. Participating teachers will form small groups of 5-10 students at each site. Instruction will occur in the same classroom that was designated for the pretest. Each session will last 30 minutes. Procedures used by each teacher will be guided by the instruction protocol and training manual described earlier.
We will conduct two sets of data analyses: (1) assessing pre-, mid-, and posttest changes in self-reported engagement, conceptual understanding, and self-reported STEM and ICT interest; and (2) assessing between- and within-student fluctuations in time-stamped behaviors over the learning sessions.
To investigate changes in students' reported engagement, conceptual understanding, and reported STEM and ICT interest from pretest, to mid-study, to posttest between conditions, we will conduct repeated measures MANOVAs with time as the within-subjects factor, condition (ModelME vs. BAU) as the between-subjects factor, and the time-by-condition interaction on engagement scores, conceptual understanding scores, and STEM/ICT interest scores. Because we will use a matched design, there should be no significant differences in demographic variables; if differences emerge, we will use those variables as covariates in our analyses (i.e., repeated measures MANCOVAs). We will also run linear regressions (single and multiple) to determine whether engagement, conceptual understanding, and STEM and ICT interest at pretest predict these measures at mid-study or posttest, and whether measures at mid-study predict measures at posttest.
We will examine the main effect of time (pre vs. mid vs. post), the main effect of condition, and the time-by-condition interaction.
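As a descriptive precursor to these models, condition-by-time cell means can be tabulated from long-format data. The sketch below assumes records keyed by condition, time, and score; the MANOVAs themselves would be run in dedicated statistical software.

```python
from collections import defaultdict

def cell_means(records):
    """Mean outcome per (condition, time) cell from long-format records.

    records: iterable of dicts with keys 'condition' ('ModelME'/'BAU'),
    'time' ('pre'/'mid'/'post'), and 'score'. Field names are illustrative.
    """
    sums = defaultdict(lambda: [0.0, 0])  # key -> [running sum, count]
    for r in records:
        key = (r["condition"], r["time"])
        sums[key][0] += r["score"]
        sums[key][1] += 1
    return {k: s / n for k, (s, n) in sums.items()}
```

Plotting these six cell means (two conditions by three time points) gives a first visual check of the hypothesized time-by-condition interaction.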
To investigate the trace data (i.e., click trails during the game) and how
they fluctuate across sessions or games, we will conduct multilevel modeling and educational data mining analyses. As the study includes a nested design (students within tiers within schools), we can investigate the between- and within-student, tier,
or school variance in time spent engaging in game activities or game levels, and their association with our posttest measures. The trace data will generate a stacked dataset with multiple rows of data per participant, where one row can represent a game session or game level, as opposed to aggregating all this information into a single row (i.e., mean game session duration). In addition, we will use sequential pattern mining to examine whether we can detect sequences of in-game actions that are common across participants, and then determine the frequencies of those sequential patterns per student and compare them across conditions.
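A minimal stand-in for the sequential pattern mining step is to count contiguous action n-grams that recur across students; full differential sequence mining adds statistical comparison of pattern frequencies across conditions. The data layout below is illustrative.

```python
from collections import Counter

def common_subsequences(action_logs, length=3, min_students=2):
    """Find contiguous action sequences of `length` shared across students.

    action_logs: dict of student_id -> ordered list of in-game actions.
    Returns each pattern observed in at least `min_students` students, with
    its student-level support count.
    """
    support = Counter()
    for actions in action_logs.values():
        # A set ensures each pattern counts at most once per student.
        ngrams = {tuple(actions[i:i + length])
                  for i in range(len(actions) - length + 1)}
        support.update(ngrams)
    return {p: c for p, c in support.items() if c >= min_students}
```

For instance, if two students' logs both contain the run "open task, partition, iterate", that three-action pattern is returned with a support of 2.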
During years 1 and 2, the project team will iteratively build and test the curriculum using the Lean Startup model. A pilot study conducted in tier 2 intervention classes during year 2 will lead to a final revision of
the curriculum and gold build of the game. A nine-week study of ModelME will be completed in year 3. The study will utilize a quasi-experimental design to compare students across business-as-usual (BAU) and ModelME conditions to address the following research questions:
RQ1. Are there statistically significant differences in levels of engagement when students are compared between the treatment (ModelME) and control (BAU) conditions?
RQ2. Are there statistically significant differences in pretest and posttest conceptual understanding of fractions, as measured by game performance and traditional assessments, between students in the treatment (ModelME) and control (BAU) conditions? If so, what is the nature of the differences?
RQ3. Are there statistically significant differences in pretest and posttest career interests in STEM or ICT between students in the treatment (ModelME) and control (BAU) conditions?
The project team hypothesizes that students in the ModelME intervention will demonstrate higher levels of engagement, exhibit greater conceptual understanding, and report higher interest in STEM and ICT careers than students in the BAU condition after nine weeks of the curriculum. Many sub-questions may emerge from this initial inquiry, so the team will proactively archive all gameplay, curriculum, and assessment data for further analysis in future research. Main products from this award include the development and iterative refinement of the game-based curriculum, pilot testing with students and their teachers, and ultimately a viable, web-based software and curriculum package. Students with LD will benefit from gameplay that is accessible and supportive of their unique learning needs. Teachers will benefit from a proactive curriculum that addresses diversity in the student population. Targeting students in tier 2 interventions will maximize the potential for the curriculum to have broader impacts with students with disabilities and other traditionally marginalized populations.
The 2017 National Assessment of Educational Progress highlights stagnant and declining performance in mathematics, which is foundational to STEM and ICT, in both 4th (17% proficient) and 8th (9% proficient) grades. Disparities are even more severe in critical areas such as fractions. Conceptual knowledge in fractions mediates fraction performance between students with and without specific learning disabilities (LD) such as dyslexia, dysgraphia, and dyscalculia. In fact, researchers report that 4th to 6th graders with LD begin their study of fractions with diminished engagement and conceptual understanding compared to their peers and show significantly less improvement in solving problems and applying computational procedures over time. Stagnant improvement in fractions limits students' access to STEM and ICT careers more generally.
Model Mathematics Education (ModelME) has the potential to significantly enhance engagement and conceptual understanding of fractions for students by providing a novel, innovative, and integrated curriculum in a virtual learning environment. ModelME will be developed using the Universal Design for Learning (UDL) framework to maximize accessibility and engagement by providing conceptual understanding challenges rooted in authentic STEM and ICT careers. Executive function scaffolds, cognitive tutoring, and authentic formative and summative assessments will be included within the user interface. Competitive gameplay combined with collaborative problem-solving and reflection activities will strategically link to National Council of Teachers of Mathematics (NCTM) standards.
1. Alafari, E., Aldridge, J. M., & Fraser, B. J. (2012). Effectiveness of using games in tertiary-level mathematics classrooms. International Journal of Science and Mathematics Education, 10, 1369-1392.
2. Baker, R. S., & Inventado, P. S. (2014). Educational data mining and learning analytics. In J. A. Larusson & B. White (Eds.), Learning Analytics (pp. 61-75). New York, NY: Springer.
3. Balu, R., Zhu, P., Doolittle, F., Schiller, E., Jenkins, J., & Gersten, R. (2015). Evaluation of Response to Intervention (RtI) practices for elementary school reading. Report. Retrieved from https://www.mdrc.org/publication/evaluation-response-intervention-rti-practices-elementary-school-reading
4. Basham, J. D. & Marino, M. T. (2013). Understanding STEM education and supporting students through Universal Design for Learning. Teaching Exceptional Children. 45(4), 8-15.
5. Beserra, V., Nussbaum, M., Zeni, R., Rodriguez, W., & Wurman, G. (2014). Practicing arithmetic using educational video games with an interpersonal computer. Educational Technology and Society, 17(3), 343–358.
6. Bittinger, J. (2018). STEM pipeline for students with disabilities: From high school to intentions to major in STEM. Doctoral Dissertations. 1313.
7. Boaler, J. (2016). Mathematical mindsets: Unleashing students' potential through creative math, inspiring messages and innovative teaching. New York, NY: Jossey-Bass.
8. Borg, W. R., & Gall, M. D. (1989). Educational research. An introduction (5th ed.). White Plains, NY: Longman.
9. Bottge, B. A., Cohen, A. S., & Choi, H. J. (2018). Comparisons of mathematics intervention effects in resource and inclusive classrooms. Exceptional Children, 84(2), 197-212.
10. Byun, J. & Joung, E. (2018). Digital game learning for K-12 mathematics education: A meta-analysis. School Science and Mathematics, 118, 113-126.
11. CAST (2018). Universal Design for Learning Guidelines version 2.2. Retrieved from http://udlguidelines.cast.org
12. Chung, J., Cannady, M. A., Schunn, C., Dorph, R., & Bathgate, M., (2016) Measures Technical Brief: Engagement in Science Learning Activities. Retrieved from: http://www.activationlab.org/wp-content/uploads/2016/02/Engagement-Report-3.1-20160331.pdf
13. Cohen, J. (1992). A power primer. Psychological Bulletin, 112, 155-159.
14. Coy, K., Marino, M. T., & Serianni, B. (2014). Using Universal Design for Learning in synchronous online instruction. Journal of Special Education Technology, 29(1), 63-74.
15. Diamond, A. (2013). Executive functions. Annu. Rev. Psychol. 64, 135–168. doi: 10.1146/annurev-psych-113011-143750
16. Dossel, S. (2016). Maths anxiety. Australian Mathematics Teacher, 72(3), 40-44.
17. Dunn, C., Shannon, D., McCullough, B., Jenda, O., & Quazi, M. (2018). An innovative postsecondary education program in STEM (Practice Brief). Journal of Postsecondary Education and Disability, 31(1), 91-101.
18. Empson, S. B., & Levi, L. (2011). Extending Children’s Mathematics: Fractions and Decimals. Portsmouth, NH: Heinemann.
19. Friday Institute for Educational Innovation (2012). Student Attitudes toward STEM Survey-Upper Elementary School Students, Raleigh, NC: Author.
20. Fuchs, L. S., Malone, A. S., Schumacher, R. F., Namkung, J., & Wang, A. (2017). Fraction intervention for students with mathematics difficulties: Lessons learned from five randomized controlled trials. Journal of Learning Disabilities, 50(6), 631-639.
21. Gersten, R., Chard, D. J., Jayanthi, M., Baker, S. S., Murphy, P., & Flojo, J. (2009). Mathematics instruction for students with learning disabilities: A meta-analysis of instructional components. Review of Educational Research, 79(3), 1202-1242.
22. Gersten, R. & Edyburn, D. (2007). Defining quality indicators in special education technology research. Journal of Special Education Technology, Retrieved from https://doi.org/10.1177/016264340702200302
23. Green, A. & Sanderson, D. (2018). The roots of STEM achievement: An analysis of persistence and attainment in STEM majors. American Economist, 63(1), 79-94. doi.org/10.1177/0569434517721770
24. Gregg, N., Wolfe, G., Jones, S., Todd, R., Moon, N., & Langston, C. (2016). STEM e-mentoring and community college students with disabilities. Journal of Postsecondary Education and Disability, 29(1), 47-63.
25. Hecht, S. A., & Vagi, K. J. (2010). Sources of group and individual differences in emerging fraction skills. Journal of Educational Psychology, 102(4), 843.
26. Herbel-Eisenmann, B. A., Steele, M. D., & Cirillo, M. (2013). (Developing) teacher discourse moves: A framework for professional development. Mathematics Teacher Educator, 1(2), 181-196.
27. Hord, C., Tzur, R., Xin, Y. P., Si, L., Kenney, R. H., & Woodward, J. (2016). Overcoming a 4th grader’s challenges with working-memory via constructivist-based pedagogy and strategic scaffolds: Tia’s solutions to challenging multiplicative tasks. Journal of Mathematical Behavior, 44, 13-33.
28. Hsuan-Chen, W., White, S., Rees, G., & Burgess, P.U. (2018). Executive function in high-functioning autism: Decision-making consistency as a characteristic gambling behavior. Cortex, 107, 21-36.
29. Hunt, J. H. (2014). Effects of a supplemental intervention focused in equivalency concepts for students with varying abilities. Remedial and Special Education, 35(3), 135-144.
30. Hunt, J. H. & Empson, S. (2015). Exploratory study of informal strategies for equal sharing problems of students with learning disability. Learning Disabilities Quarterly, 38(4), 208-220.
31. Hunt, J. H., MacDonald, B., Lambert, R., Sugita, T., & Silva, J. (2018). Think-Pair-Show-Share to Increase Classroom Discourse. Teaching Children Mathematics, 25(2), 78-84.
32. Hunt, J. H., MacDonald, B. L., & Silva, J. (2019). Gina’s mathematics: Thinking, tricks, or “teaching”? Journal of Mathematical Behavior, 100707.
33. Hunt, J. H. & Silva, J. (in press). Emma’s negotiation of number: Implicit intensive intervention. Journal for Research in Mathematics Education.
34. Hunt, J. H., Silva, J., & Lambert (in press). Empowering students with specific learning disabilities: Jim’s concept of units fractions. Journal of Mathematical Behavior.
35. Hunt., J.H., Silva, J., & Welch-Ptak, J. (revisions). Reasoning & sense making to support conceptions of fractions for students with learning disabilities. Cognition and Instruction.
36. Hunt, J. H. & Tzur, R. (2017). Where is difference? Processes of mathematical remediation through a constructivist lens. Journal of Mathematical Behavior, 48, 62-76.
37. Hunt, J. H., Tzur, R., & Westenskow, A. (2016). Evolution of unit fraction conceptions in two fifth-graders with a learning disability: An exploratory study. Mathematical Thinking and Learning, 18(3), 182-208.
38. Hunt, J. H., Westenskow, A., Silva, J., & Welch-Ptak, J. (2016). Levels of participatory conception of fractional quantity along a purposefully sequenced series of equal sharing tasks: Stu's trajectory. Journal of Mathematical Behavior, 41, 45-67.
39. Israel, M., Marino, M., Delisio, L., & Serianni, B. (2014). Supporting content learning through technology for K-12 students with disabilities (Document No. IC-10). Retrieved from University of Florida, Collaboration for Effective Educator, Development, Accountability, and Reform Center (CEEDAR). Website: http://ceedar.education.ufl.edu/tools/innovation-configurations/
40. Ke, F., & Abras, T. (2013). Games for engaged learning of middle school children with special learning needs. British Journal of Educational Technology, 44(2), 225–242.
41. Kinnebrew, J. S., Loretz, K. M., & Biswas, G. (2013). A contextualized, differential sequence mining method to derive students’ learning behavior patterns. Journal of Educational Data Mining, 5, 190–219.
42. Lambert, R., & Tan, P. (2017). Conceptualizations of students with and without disabilities as mathematical problem solvers in educational research: A critical review. Education Sciences, 7(2), 51; https://doi.org/10.3390/educsci7020051.
43. Leroy, N., & Bressoux, P. (2016). Does amotivation matter more than motivation in predicting mathematics learning gains? A longitudinal study of sixth-grade students in France. Contemporary Educational Psychology, 44, 41-53. https://doi.org/10.1016/j.cedpsych.2016.02.001.
44. Lewis, K. E. (2014). Difference not deficit: Reconceptualizing mathematical learning disabilities. Journal for Research in Mathematics Education, 45(3), 351-396.
45. Lin, C. H., Liu, E. Z., Chen, Y. L., Liou, P. Y., Chang, M., & Wu, C. H. et al. (2013). Game-based remedial instruction in mastery learning for upper-primary school students. Educational Technology and Society, 16(2), 271-281.
46. Marino, M. T. (2009). Understanding how adolescents with reading difficulties utilize technology-based tools. Exceptionality, 17(2), 88-102.
47. Marino, M. T., Becht, K., Vasquez III, E., Gallup, J., Basham, J. D., & Gallegos, B. (2014). Enhancing secondary science content accessibility with video games. Teaching Exceptional Children, 47(1), 27-34. doi: 10.1177/0040059914542762
48. Marino, M. T., Black, A., Hayes, M., & Beecher, C. C. (2010). An analysis of factors that affect struggling readers’ comprehension during a technology-enhanced STEM astronomy curriculum. Journal of Special Education Technology, 25(3), 35-48.
49. Marino, M. T., Coyne, M. D., & Dunn, M. W. (2010). Technology-based curricula: How altered readability levels affect struggling readers’ passage comprehension. Journal of Computing in Mathematics and Science Teaching, 29(1), 31-49.
50. Marino, M. T., Gotch, C., Israel, M., Vasquez, E. III, Basham, J. D., & Becht, K. M. (2014). UDL in the middle school science classroom: Can video games and alternative text heighten engagement and learning for students with learning disabilities? Learning Disability Quarterly, 37, 87-99.
51. Marino, M. T., Israel, M., Beecher, C. C., & Basham, J. D. (2013). Students' and teachers' perceptions of using videogames to enhance science instruction. Journal of Science Education and Technology, 22, 667-680.
52. McDaniel, S., Albrittan, K., & Roach, A. (2013). Highlighting the need for further response to intervention research in general education. Retrieved from https://files.eric.ed.gov/fulltext/EJ1064666.pdf.
53. Miyake, A., & Friedman, N. P. (2012). The nature and organization of individual differences in executive functions: Four general conclusions. Current Directions in Psychological Science, 21(1), 8-14. doi: 10.1177/0963721411429458.
54. National Academies of Sciences, Engineering, and Medicine (2018). Aging and Disability: Beyond Stereotypes to Inclusion: Proceedings of a Workshop. Washington, DC: The National Academies Press. doi.org/10.17226/25029.
55. National Education Technology Plan (2016). Future ready learning: Reimagining the role of technology in education. Retrieved from https://tech.ed.gov/files/2015/12/NETP16.pdf
56. National Education Technology Plan Update (2017). Reimagining the role of technology in education. Retrieved from https://tech.ed.gov/files/2017/01/NETP17.pdf
Rohana is a Ph.D. scholar at UCF and a TELEPORTS Scholar funded by the U.S. Department of Education Office of Special Education Programs. She partners with technology companies to conceptualize and develop innovative products that improve STEM college and career readiness for students with disabilities. After leaving the STEM industry, Rohana became a secondary special education teacher in 2015, focusing on STEM integration in a rural Title 1 school district. She models self-determination, self-efficacy, and grit in overcoming adversity. Her background spans large-scale construction, STEM business administration, technical writing, and transition services for students with disabilities.