
Online Graduate Student Surveys: Engagement, Technology and Instructor Behaviors

Listen and learn: Research paper
Roundtable presentation


Thursday, December 3, 1:45–2:30 pm PST (Pacific Standard Time)
Presentation 1 of 3
Other presentations:
English Learners and Technology: What We Know, and How to Do Better
Knowledge Building Framework for Enhancing Online Discussions Among In-Service Teachers

Dr. Lonna Rocha  
Christopher Niileksela  
Bruce Frey  
Thomas DeLuca  
Monica Simonsen  
Steven Lee  
Andrea Garcia  
Jeffrey Hoover  

This session presents an evaluation of a university's online graduate programs within a school of education. Drawing on surveys of current students, the findings describe students' engagement with a variety of activities, rate the usefulness of technology tools, and show a correspondence between what students value in online instructors and what instructors actually do.

Audience: Teachers, Teacher education/higher ed faculty
Attendee devices: Devices useful
Attendee device specification: Smartphone: Windows, Android, iOS
Laptop: Chromebook, Mac, PC
Tablet: Android, iOS, Windows
Participant accounts, software and other materials: Kahoot!
Topic: Distance, online & blended learning
Grade level: Community college/university
Subject area: Higher education, Inservice teacher education
ISTE Standards: For Educators:
Leader
  • Shape, advance and accelerate a shared vision for empowered learning with technology by engaging with education stakeholders.
For Education Leaders:
Visionary Planner
  • Communicate effectively with stakeholders to gather input on the plan, celebrate successes and engage in a continuous improvement cycle.
Systems Designer
  • Establish partnerships that support the strategic vision, achieve learning priorities and improve operations.
Additional detail: Session recorded for video-on-demand

Proposal summary

Framework

The Sloan Consortium (now the Online Learning Consortium [OLC]) “is the leading professional organization devoted to advancing quality online learning by providing professional development, instruction, best practice publications and guidance to educators, online learning professionals and organizations around the world” (http://onlinelearningconsortium.org/about/).
The OLC has developed a quality framework for creating and evaluating online programs, organized into five pillars:
  1. Learning effectiveness: students receive a high-quality online education.
  2. Cost effectiveness and institutional commitment (now called Scale): the program offers the best value to learners while achieving capacity enrollment for the institution.
  3. Access: all qualified and motivated students can complete programs of their choice.
  4. Faculty satisfaction: instructors find online teaching rewarding and beneficial.
  5. Student satisfaction: students are satisfied with course rigor, interaction with instructors and peers, and support services.
These pillars aligned closely with the themes we obtained through interviews with stakeholders, and the evaluation team combined those themes with the OLC pillars when developing the evaluation plan and logic model for the current evaluation.

Methods

Surveys of online students had been collected in the past, and faculty feedback suggested a need to gather new information that could inform course design and instruction. With that focus, the evaluation team created the student survey in conversation with online instructors. Using Qualtrics, an online survey platform, the team sent the survey to 608 students reported as currently enrolled in 2019. Of those, 101 (15.0%) started the survey and completed at least one scale, and 72 (11.8%) completed all scales. The student engagement scale measured how engaged students were with course material and activities. Two complementary scales were developed to capture how students perceive online instructor behavior, and two further scales measured how useful and engaging the technology was. Data are reported primarily through descriptive statistics; because the survey included some open-ended questions, the research also includes some qualitative analysis.
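
For readers who want to see the response-rate arithmetic and descriptive summaries in concrete form, here is a minimal Python sketch. The file name and column prefix are hypothetical, since the actual Qualtrics export format is not described here.

import pandas as pd

INVITED = 608  # students invited to the survey

# Hypothetical Qualtrics export; real column names will differ.
responses = pd.read_csv("survey_export.csv")

started = len(responses)                 # respondents who started the survey
finished = responses.dropna().shape[0]   # respondents with every scale complete

print(f"Started:  {started} ({started / INVITED:.1%} of {INVITED} invited)")
print(f"Finished: {finished} ({finished / INVITED:.1%} of {INVITED} invited)")

# Descriptive statistics for the Likert-type engagement items, assumed here
# to be numeric columns prefixed "engage_".
engagement = responses.filter(like="engage_")
print(engagement.describe().loc[["mean", "std"]])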

Results

On surveys administered during the summer of 2019, current students reported strong engagement in their courses and said they apply what they learn in class in the field. They reported strong correspondence between what they value in online instructors and what instructors actually do in their courses. Students rated course design positively and appeared satisfied with the technology used in courses. They reported that some technology tools worked well for learning (e.g., emails, PowerPoints, readings), while others were lacking. Students also expressed a desire for a firmly established community within the program (e.g., connecting with others and the university community through the orientation, and wishing the programs facilitated more community).
In response to open-ended questions, students described technology challenges related to accessing course materials, submitting work, or interacting with the various technology tools used in a course. Not all students indicated they participated in an orientation, but half of those who did found it beneficial for learning about the learning management system, their online program, and the university community. Students described a good online student as someone who manages time well, is motivated, interacts positively with instructors and peers, and stays engaged in the course. They described a good online instructor as someone who is willing to work with students, is available and responsive, communicates and provides feedback promptly, and outlines clear course expectations.
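
The value-versus-practice correspondence reported above comes from pairing "what I value in an instructor" items with "what my instructor actually does" items on the two complementary scales. The following Python sketch illustrates that comparison under stated assumptions: the behavior names and toy ratings are invented for illustration, not taken from the study's data or analysis.

import pandas as pd

# Toy ratings on a 1-5 scale; real data would come from the survey export,
# and the behavior names below are hypothetical.
data = pd.DataFrame({
    "value_feedback":  [5, 4, 5, 4],
    "actual_feedback": [4, 4, 5, 4],
    "value_presence":  [5, 5, 4, 5],
    "actual_presence": [5, 4, 4, 5],
})

# For each behavior, compare the mean "valued" rating with the mean
# "actual" rating; a small gap suggests correspondence.
for behavior in ["feedback", "presence"]:
    valued = data[f"value_{behavior}"].mean()
    actual = data[f"actual_{behavior}"].mean()
    print(f"{behavior:<10} valued={valued:.2f} actual={actual:.2f} "
          f"gap={valued - actual:+.2f}")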

Importance

This research aims to understand online students' experiences with the goal of informing university instructor practice and support. Having a sense of how online students experience and perceive their courses may help instructors as they develop learning opportunities, engage with students, and respond to questions and submitted work. Institutions of higher education may also benefit as they work to support online instructors in their work with online students.

References

Adelman, H. & Taylor, L. (2000). Moving prevention from the fringes into the fabric of school improvement. Journal of Educational and Psychological Consultation, 11, 7-36.
Barnette, J. (2018). Instructor presence in online education: An analysis of student perceptions and performance (Doctoral dissertation). Retrieved from https://scholarworks.iu.edu/dspace/bitstream/handle/2022/22578/Barnette%20dissertation%20Final.pdf?sequence=1&isAllowed=y
Bolliger, D. U., Inan, F. A., & Wasilik, O. (2014). Development and validation of the Online Instructor Satisfaction Measure (OISM). Educational Technology & Society, 17(2), 183-195.
Brooks, E., & Morse, R. (2014). Methodology: Best online graduate education programs rankings. Retrieved September 18, 2019, from US News & World Report website: https://www.usnews.com/education/online-education/articles/education-methodology
Dixson, M. D. (2015). Measuring student engagement in the online course: The online student engagement scale (OSE). Online Learning, 19(4), 1-15.
Gunawardena, C. N., & Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer-mediated conferencing environment. American Journal of Distance Education, 11(3), 8-26.
Haythornthwaite, C. (2002). Building social networks via computer networks: Creating and sustaining distributed learning communities. In K. A. Renninger & W. Shumar (Eds.), Building Virtual Communities: Learning and Change in Cyberspace. Cambridge: Cambridge University Press.
Ivankova, N., & Stick, S. (2007). Students’ persistence in a distributed doctoral program in educational leadership in higher education: A mixed methods study. Research in Higher Education, 48(1), 93-135.
James, S., Swan, K., & Daston, C. (2016). Retention, progression and the taking of online courses. Online Learning. Retrieved from http://onlinelearningconsortium.org/read/online-learning-journal/
Jiang, M., & Ting, E. (2000). A study of factors influencing students’ perceived learning in a web-based course environment. International Journal of Educational Telecommunications 6(4), 317–338.
Joint Committee on Standards for Educational Evaluation (Ed.). (1994). The program evaluation standards (2nd ed.). Newbury Park, CA: Sage.
Lan, J. (2001). Web-based instruction for education faculty: A needs assessment. Journal of Research on Computing in Education, 33, 385-400.
Lee, J., Song, H., & Hong, A. (2019). Exploring factors and indicators for measuring students’ sustainable engagement in e-learning. Sustainability, 11, 985.
Lee, S. W., Lohmeier, J. H., Frey, B. B., & Tollefson, N. (2004, November). Facilitating university research through program evaluation: The Research Evaluation Model (REM). Paper presented at the annual meeting of the American Evaluation Association, Atlanta, GA.
Levin, H. M., Belfield, C., Hollands, F., Bowden, A. B., Cheng, H., Shand, R., Pan, Y., & Hanisch-Cerda, B. (2012). Cost-effectiveness analysis of interventions that improve high school completion. Teachers College, Columbia University.
Martin, F., Wang, C., & Sadaf, A. (2018). Student perception of helpfulness of facilitation strategies that enhance instructor presence, connectedness, engagement and learning in online courses. The Internet and Higher Education, 37, 52-65.
Muilenburg, L.Y., & Berge, Z. (2005). Student barriers to online learning: A factor analytic study. Distance Education, 26, 29-48.
Richardson, J., & Swan, K. (2003). An examination of social presence in online learning: Students’ perceived learning and satisfaction. Paper presented at the annual meeting of the American Educational Research Association, Seattle, WA.
Rovai, A. P. (2002). A preliminary look at structural differences in sense of classroom community between higher education traditional and ALN courses. Journal of Asynchronous Learning Networks, 6(1), 29-38.
Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage Publications.
Stufflebeam, D. (2001). Evaluation models. In J. C. Greene & G. T. Henry (Eds.), New Directions for Evaluation. San Francisco, CA: Jossey-Bass.
Swan, K. (2003). Learning effectiveness: What the research tells us. In J. Bourne & J. C. Moore (Eds.), Elements of quality online education: Practice and direction (pp. 13-45). Needham, MA: Sloan-C.


Presenters

Dr. Lonna Rocha, The University of Kansas
