Hearing Students’ Voices: Understanding Student Perspectives of Online Learning
The College at Brockport
Across the nation, the demand for online teaching continues to increase steadily, according to data from the U.S. Department of Education, National Center for Education Statistics (NCES, 2018). Currently, about one in six college students is enrolled in a 100% online program, and more than 33% of college students are enrolled in at least one online class (NCES, n.d.). Additionally, a great proportion of students take classes that include online components. Our public comprehensive college, located in the northeastern United States, is no exception. Even though many faculty and students profess to prefer face-to-face classes, the campus is abuzz with talk of new online programs and initiatives aimed at ensuring greater student access as well as boosting enrollment. Indeed, improving and expanding online learning is a focal point on campus, central to both the college President’s mission and the vision statement of the Chancellor of the State University of New York, under which our college is governed.
Despite the development of educational technology and the increasing demand in higher education for online and blended or hybrid (that is, partially face-to-face and partially online) teaching, Jaschik and Lederman (2014) report that faculty members still retain longstanding concerns regarding the role of technology in and outside of the classroom. Cohan (2019) compares online education to McDonaldization, which is “cheap, easy, fast, canned, formulaic and alienating” (p. 1). To address these concerns, faculty members at our college came together to form the “Online Refresh Faculty Learning Community” (FLC) to explore ways to improve and better support online teaching and learning. These grassroots Faculty Learning Communities were established of the faculty, by the faculty, and for the faculty to provide professional development opportunities (Cox, 2004; Zhang, LeSavoy, Lieberman, & Barrett, 2014).
An FLC is composed of six to fifteen faculty and professional staff across different disciplines who build a genuine community, make a year-long commitment, and engage in active and collaborative professional development conceived as learning (Cox, 2004; Zhang, LeSavoy, Lieberman, & Barrett, 2014; Zhang & Pearlman, 2018). At the research site, 11 faculty and staff members from seven disciplines who were interested in better facilitating students’ online learning experiences formed a year-long Online Refresh FLC. It offered a safe, collaborative, and interdisciplinary platform for the participants to regularly exchange ideas and share experiences, pilot different technology tools and teaching strategies, reflect on and improve their teaching practices, and support each other in providing quality learning experiences (Zhang, LeSavoy, Lieberman, & Barrett, 2014).
When consulting the literature for this study, three main themes emerged: (1) the flexibility and convenience of online courses in scheduling and attendance, (2) the importance of instructor presence, and (3) an overall preference for face-to-face classes when feasible. We review each of these themes below.
Flexibility. Many of the articles identified for this literature review discussed flexibility or convenience as a major advantage in selecting online courses (Bonnici, Maatta, Klose, Julien, & Bajjaly, 2016; Hass & Joseph, 2018; Jayaratne & Moore, 2017; Kim, Welch, & Nam, 2016; Phillips, Schumacher, & Arif, 2016; Wolfe et al., 2016). These studies examined online education in a variety of disciplines, such as library and information science, economics, agriculture and life sciences, and pharmacy, and with a variety of populations, including undergraduate and graduate students. The predominant study design was the survey, and the analyses were mixed methods in nature. This allowed specific sub-analyses to determine whether there were statistically significant demographic differences among members of a given population.
Instructor Presence. Many recent articles discussed instructor presence. Simply put, Kim, Welch, and Nam (2016) found that “interactions between the instructor and students are essential to online learning” (p. 187). This is also supported by a study by Bhagat, Wu, and Chang (2016) that used confirmatory factor analysis on an instrument measuring student perceptions of online learning. They found that instructor characteristics, such as the level of instructor presence in the learning management system, and social presence, such as the level of student interaction, were two factors that influenced student perceptions of online learning. Cole (2016) also found that instructor communication in a course is “the most significant predictor of student satisfaction with online courses” (p. 619), while Richardson, Besser, Koehler, Lim, and Straight (2016) found that feedback through the learning management system was one of the most important ways instructors communicated with students. Lastly, Diep, Zhu, Struyven, and Black (2017) studied blended approaches to online education and were more specific, stating that “instructor expertise is the most significant factor” in student satisfaction with online courses (p. 485). While this last assertion leaves some room for debate, it is clear that instructors are essential to an online course.
Preference for Face-to-Face Classes. In multiple studies published in the last three years, students either directly stated a preference for face-to-face learning over asynchronous online courses or cited more advantages of face-to-face classes when compared to online courses (Bonnici et al., 2016; Hass & Joseph, 2018; Kim et al., 2016). Surprisingly, though, the asynchronous modality was preferred when it came to tutorial support in lieu of face-to-face interaction (Richardson, 2016). The key to understanding this, however, may lie in Cole et al.’s (2017) examination of media richness theory and the discovery that professors tried to replicate face-to-face classes in the online environment instead of designing their classes to capitalize on the benefits of the online environment. That is, online students in Cole et al.’s study were happy to have fewer channels of information in the online environment. Other reasons for preferring face-to-face classes were indicative of a trade-off between flexibility and perceived usefulness. In the end, flexibility seems to have won out: students wanted online courses, but they wanted courses that were engaging and had a strong instructor presence with a convivial communication style.
Theoretical Framework
In exploring online pedagogy, we wonder: what methods and instructional approaches best support learning in an online environment? Here we turn to the work of Picciano (2017), who argues that online education “has evolved as a subset of learning in general rather than a subset of distance learning” (p. 187). Picciano critiques, reviews, and builds upon a multitude of theories and models of learning in order to create a dynamic multimodal model. Indeed, Picciano’s fluid model of online learning expands upon previous theories of learning. It incorporates elements of the following: social constructivism, where learning takes place through interactions with others; connectivism, where the tools of the digital age can expand our learning; behaviorism, where rewards are given to promote desired learning; and cognitivism, where learning takes place through a series of processes in the brain (e.g., information processing). Based on these theories, Picciano’s (2017) model includes the following elements: (1) content; (2) socioemotional aspects; (3) self-paced/independent work; (4) dialectic/questioning; (5) reflection; (6) collaboration; and (7) evaluation/assessment. These elements vary in emphasis depending on the context of the online course, for example, whether the course is blended, fully online, or self-paced. Picciano’s multimodal model provides a framework for us to consider the design of our courses from an instructor’s perspective, informed by theories of learning. Picciano writes of his model:
Behaviorists will find elements of self-study and independent learning in adaptive software. Cognitivists might appreciate reflection and dialectic questioning as important elements of the model. Social constructivists will welcome the emphasis on community and interaction throughout the model. Connectivists might value the collaboration and the possibility of student-generated content (Picciano, 2017, p. 186).
Due to the ever-growing popularity of online instruction worldwide and the mounting pressure at our home campus, as instructors, we have engaged in training in online teaching, explored theories of learning, as well as studied models of online education. However, our students’ perspectives of these online and blended learning environments remain unclear, yet should be central when considering course design in online and blended learning environments. Much of the research regarding online learning has been conducted through quantitative research, thus limiting the depth of student voices. By engaging in both quantitative and qualitative research, we can better understand student experiences, enabling instructors to create high quality courses.
Purpose of the Study
Therefore, the purpose of this research is to learn more about our students’ perceptions, attitudes, and experiences in online environments to better inform our instructional practices and meet the varied needs of our students. Interestingly, research on online learning suggests the instructor has a great deal of influence on students’ views of an online course (Bhagat, Wu, & Chang, 2016). As instructors in a variety of disciplines with the ability to affect our students’ experiences and online learning outcomes, we desired to hear from students, in their own words, about their online instructional experiences: what insights and advice would our students provide to online instructors? Student feedback on online learning can inform our thinking and planning, enabling us to create the most meaningful online environments possible. With this goal in mind, we designed a study guided by the following research questions:
- What are students’ perceptions and attitudes about online learning?
- What are students’ experiences in various online learning environments?
This study used mixed methods to investigate participants’ perspectives of blended and online learning. We first report the analyses and results of quantitative data. Afterward, the analyses and results of qualitative data are presented.
Setting and Participants
This study was conducted at a public comprehensive university located in the northeastern United States. During the 2018-2019 academic year, there were over 500 sections of fully online courses with 8,429 enrollments, or 11.05% of the total 76,265 enrollments. There were also 167 sections of blended courses with 2,871 enrollments. We obtained approval from the Institutional Review Board (IRB) and invited participants to complete an online survey through Qualtrics, a web-based survey platform. A list of 1,000 students was randomly selected from those registered at the college in the Spring 2018 semester, and each student received an electronic link to the voluntary survey and consent form via three emails between April and May of that semester. Randomly sampled individuals came from both graduate and undergraduate programs and had taken an online or blended course over the span of their academic career before the Spring 2018 semester. Of the total population surveyed, 105 students participated in the study (10.5%) by submitting the survey voluntarily and anonymously via Qualtrics.
Survey Instrument Development and Research Procedure
The survey instrument was composed of 13 questions created with input from the Online Refresh FLC team, comprising 11 faculty and staff members from seven disciplines with a variety of online and blended teaching experience. The survey included (1) four demographic questions, (2) eight open-ended questions, and (3) one question using a 5-point Likert scale to rate comfort level using technology (extremely uncomfortable = 1, uncomfortable = 2, neutral = 3, comfortable = 4, and extremely comfortable = 5). We added open-ended questions to increase the breadth of responses (Fontana & Frey, 2005; Zhang, LeSavoy, Lieberman, & Barrett, 2014; Zhang, Nam, & Pelttari, 2016). Table 1 lists the eight short essay questions included in the survey to gain more insight into the participants’ perspectives of online learning.
In this mixed methods study, the research team needed to establish interrater reliability. For the quantitative data, the researchers used a coding sheet in Excel and then transferred the data to SPSS for analysis. Interrater reliability was calculated by dividing the number of agreements by the total number of agreements and disagreements and multiplying by 100. The interrater reliability across all coding categories was 100%.
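The percent-agreement calculation described above can be sketched in a few lines. The coder labels below are hypothetical and purely illustrative; they are not the study’s actual codes.

```python
def percent_agreement(coder_a, coder_b):
    """Interrater reliability as percent agreement:
    agreements / (agreements + disagreements) * 100."""
    if len(coder_a) != len(coder_b):
        raise ValueError("Coders must rate the same number of items.")
    agreements = sum(a == b for a, b in zip(coder_a, coder_b))
    return agreements / len(coder_a) * 100

# Hypothetical example: two coders categorize six survey responses.
coder_a = ["flexibility", "presence", "presence", "technology", "social", "social"]
coder_b = ["flexibility", "presence", "presence", "technology", "social", "social"]
print(percent_agreement(coder_a, coder_b))  # 100.0
```

With one disagreement out of six decisions, the same function would return roughly 83.3%.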
Data Analyses
Qualitative data were analyzed using grounded theory procedures, including the constant comparative method of analysis, to conceptualize the data (Strauss & Corbin, 1990). Open coding was used to highlight concepts that emerged from the data, while in vivo coding enabled the researchers to give participants a voice by directly using phrases from participant surveys, as appropriate, to represent student perceptions of online instruction (Plano Clark & Creswell, 2015). Following the initial coding, research team members worked together to refine the codes and determine categories (Saldaña, 2009).
We applied descriptive statistics, calculating both frequencies and percentages, to analyze the participants’ demographic characteristics and their comfort level using technology on the 5-point Likert scale (extremely uncomfortable = 1, uncomfortable = 2, neutral = 3, comfortable = 4, and extremely comfortable = 5). One-way ANOVAs were used to investigate whether the comfort levels of college students using technology differed by age group, by the number of online courses the participants had taken, and by their preferred course modality. Correlation analysis was also used to determine whether the comfort levels of college students using technology were correlated with their age.
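As a sketch of the one-way ANOVA used here, the following pure-Python function computes the F statistic from scratch. The comfort ratings are invented solely to illustrate the calculation; the study’s actual analyses were run in SPSS.

```python
def one_way_anova_f(*groups):
    """One-way ANOVA F statistic:
    F = mean square between groups / mean square within groups."""
    all_values = [x for g in groups for x in g]
    grand_mean = sum(all_values) / len(all_values)
    # Between-group sum of squares (df = number of groups - 1)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (df = N - number of groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_values) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical 1-5 comfort ratings for three age brackets (illustrative only).
f_stat = one_way_anova_f([5, 4, 5, 4, 5, 4], [4, 3, 4, 4, 3, 4], [3, 3, 2, 3, 4, 3])
print(round(f_stat, 2))
```

The resulting F value would then be compared against the F distribution with the stated degrees of freedom to obtain a p value.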
Descriptive statistics and demographics. Participants’ demographic data were analyzed using descriptive statistics, calculating both the frequency and percentage of participants’ demographic characteristics. A total of 105 students at the research site participated in this survey voluntarily and anonymously during the Spring 2018 semester, for a return rate of 10.5%. Ninety-seven participants answered the question about their age (92.4%). Following the definition of traditional undergraduate students (United States Census Bureau, n.d.), 72 participants, ranging in age from 18 to 24 years, were considered traditional college students (74.2%), and 25 respondents who reported being older than 24 years of age were considered non-traditional college students (25.8%). The mean age of the participants was 25.0 years (SD = 8.0), ranging from 18 to 51. The mean number of online courses the participants had taken was 3.7 (SD = 3.5), ranging from one to twenty. Table 2 contains the frequency and percentage of participants by demographic characteristics.
Comfort using technology and preferred course modality. Participants were asked to rate their comfort level using technology on the 5-point Likert scale (extremely uncomfortable = 1 through extremely comfortable = 5). A total of 49 participants answered this question (46.7%). The mean rating was 4.45, with responses ranging from 2 to 5. Among the 97 participants who reported their preferred course modality (92.4%), 54 preferred in-person courses (55.7%), followed by 26 who preferred online courses (26.8%); blended courses were the least favored (n = 17, 17.5%). Table 3 reports the frequency and percentage of participants’ comfort level using technology and their preferred course modality.
Inferential analyses. The authors used a one-way ANOVA to investigate whether there was any significant difference in the mean comfort levels of college students using technology among different age groups. A significant difference was found among college students of different ages (F = 5.19, p = .004), with younger participants indicating more comfort using technology than older participants. A one-way ANOVA was also used to investigate whether the mean comfort levels using technology differed with the number of online courses taken. A significant difference was found across different numbers of online courses taken (F = 6.85, p = .001): the more online courses the participants had taken, the more comfortable they felt using technology. No significant difference in mean comfort levels using technology was found by students’ preferred course modality (p > .05). Table 4 describes the one-way ANOVA analyses of the comfort level using technology reported by the participants.
Furthermore, correlation analysis was used to determine whether the comfort levels of college students using technology were correlated with their age. A correlation was found between the comfort levels of college students using technology and their age (r = -0.47, p = .001): participants’ age correlated negatively with their comfort level using technology. No correlation was found between comfort levels using technology and the number of online courses taken (p > .05). Similarly, no correlation was found between comfort levels using technology and preferred course modality (p > .05). Table 5 reports the correlation analysis between comfort level using technology and participants’ age.
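The coefficient reported above is a Pearson correlation, which can be computed directly as follows. The (age, comfort) pairs are fabricated for illustration and do not reproduce the study’s data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired samples."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    ss_x = sum((a - mean_x) ** 2 for a in x)
    ss_y = sum((b - mean_y) ** 2 for b in y)
    return cov / math.sqrt(ss_x * ss_y)

# Hypothetical (age, comfort) pairs sketching a negative association:
# comfort with technology declines as age increases.
ages = [18, 20, 22, 25, 30, 40, 51]
comfort = [5, 5, 4, 4, 4, 3, 2]
print(round(pearson_r(ages, comfort), 2))
```

A value near -1 indicates a strong negative linear association; the study’s observed r = -0.47 indicates a moderate one.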
Online surveys enabled participants to share their learning experiences anonymously. In analyzing the data, we have identified four factors related to student perceptions of online courses: (1) set-up of the course; (2) learner characteristics and sense of course learning; (3) social interactions; and (4) issues with technology. The themes and subthemes are illustrated in Table 6.
“It’s Only as Good as the Professor”: Course Set-Up in Online Learning Environments. Many students indicated that online courses have tremendous potential. However, the quality of an online course is highly dependent upon or “only as good as” the professor designing and teaching the class. It seems for some, “A well designed online class with a ‘present’ professor can be very close to the experience of campus classes.”
Through analysis of the student survey data, five subthemes emerged regarding the ways in which the instructor’s course set-up shaped students’ perceptions of the effectiveness and efficiency of online learning environments. First, effective communication was deemed crucial to the success of an online course. Most salient was the idea that professors should answer their emails and respond quickly to student inquiries, echoing the findings of Cole (2016). Second, course organization emerged as a theme: students reported clear due dates and understandable instructions on assignments as important components of course organization. Third, students desired early access to materials so that they might progress at their own speed through asynchronous activities. Fourth, students criticized the lack of variety in activities; one student specifically pleaded for instructors to “Make it interesting. Not just commenting on people's commenting.” Lastly, students perceived the use of video as a welcome addition to the course that may add a personal connection to the instructor.
“It Isn’t for Everybody”: Considering Learner Characteristics and a Sense of Learning in the Course. Online learning may be viewed by students as “perfect” in terms of convenience for those with busy lifestyles. Students rave about the flexibility — “the courses can be accessed from anywhere.” In addition to the flexibility of the asynchronous environment, the concept of equity emerged as a finding. Students often used language relating to how online classes could be beneficial for others. Specific examples include “If you don't have a babysitter or you have a cold, you can still get your work done.” Another student shared, “If you are very busy, a commuter, a working parent, have a job, etc., online classes would be very beneficial to you.”
However, there is also a perception that while wonderful in its flexibility, online learning works best for certain people: those who demonstrate strong self-regulatory behaviors. Indeed, personal attributes were seen as a determining factor in course success. For example, the ideas that “one must have great self-determination” and that “you need to be very disciplined with your work” were common responses. Many students perceive online classes as requiring a “massive amount of organization and self-motivation.” Students emphasize the ability to be a “self-learner” as well as the ability to manage time wisely, as online learning certainly “isn’t for everybody.”
Interestingly, some students believe self-regulatory behaviors were actually learned during their online courses. “Benefits of learning online include self-advocacy if you want to do well, time management, (and) self-sufficiency,” wrote one student. Another student wrote, “It teaches you to constantly stay on your game with work, and it helps you if you have to teach yourself information.” Additionally, a third student reflected upon learning time management, stating “It’s on you,” and “Your professor is not babysitting you, (which) teaches you to be accountable,” as a potential beneficial learning outcome of an online course.
“There is a Feeling of Solitude”: Examining Perceptions of Social Interactions. In our study, 55.7% of respondents preferred face-to-face classes over blended or online courses. Student feedback points to human contact as a factor in the preference for face-to-face classes. As one student reflected on the drawbacks of online learning, “I like having a person there in front of me physically explaining things and being able to ask questions.” Similarly, other students emphasized, “I enjoy my professors knowing me,” and “I like going to classes and having personal relationships with my teachers.” It seems that without face-to-face classes, students may feel isolated and disengaged. For instance, one student suggested, “Without meeting in person, there is a feeling of solitude and little collaboration,” while another responded, “If you are not in a physical class, then chances are you will not make friends with your online peers.” The perception that an online class “takes away the ability to meet people” was a negative aspect of online learning for some students. Clearly, the social aspect of face-to-face classes is something many students value.
“I’m Not Really Computer Literate”: Exploring the Role of Technology. We, as instructors, sometimes assume that our students are digital natives and that technology isn’t a factor in our courses. However, for some students, issues with technology can impede online learning. Many students surveyed reported frustration with learning online applications, frustration with the lack of reliability of the internet, and some students even perceived a greater potential for course problems. According to one student, “With everything being electronic, there are more ways to mess up and not submit your assignment right.” While another student wrote, “I'm not really computer literate, which made my online courses challenging.” What does this mean for us as instructors who teach courses online? How might we mitigate these concerns?
The purpose of this research was to examine students’ perceptions of and attitudes toward online learning, as well as their experiences in various online learning environments. Understanding students’ experiences and perceptions can facilitate better course design for future courses, which can result in better student experiences and may even improve course retention. The results of the current study indicate that students’ comfort level using technology was associated with the number of online courses they had completed and with their age. Students who had participated in more online courses were more comfortable, but it is likely that students who were less comfortable in online classes self-selected out of enrolling in future online courses. Unfortunately, as instructors we cannot control students’ previous experiences, which will likely factor into whether they will even consider enrolling in another online course. Younger students, often referred to as digital natives, were, not surprisingly, more comfortable with the technology used in online courses.
The responses to the open-ended questions revealed four themes: (1) course set-up, (2) learner characteristics, (3) social interactions, and (4) technology. These themes map onto the elements of Picciano’s (2017) model for learning. Much was revealed regarding course set-up, including the subthemes of (1) effective communication, (2) course organization, (3) early access to materials, (4) creativity of assignments, and (5) welcome videos. These subthemes accentuate the strong influence the instructor has on students’ views of an online class, as reported by Bhagat, Wu, and Chang (2016). They relate particularly to Picciano’s elements of content and evaluation. Picciano states that the way in which content is delivered is the primary driver of instruction. Online learning can provide an avenue to deliver content with enhanced visualization (Mayer, 2009) through recorded lectures, videos, simulations, and other media within learning management systems. Picciano’s element of evaluation/assessment is also highly dependent upon effective course set-up and allows for a wide variety of mechanisms to examine students’ learning effectively, from the traditional exam to student-driven videos and podcasts.
The results of the survey revealed that students are very aware of the importance of learner characteristics when choosing course modalities. Some students expressed the need for strong self-regulatory behaviors, while others described how acquiring improved self-regulatory behaviors can be a positive benefit of taking online courses. The independent nature of a student can greatly affect their capability to perform well in an online class, but such a class may also drive them to a new level of independence. Students were also very concerned with the social aspect of online learning, which is an element in Picciano’s model. Students expressed fears that the lack of instructor and peer contact would lead to a sense of isolation. It is for this reason that it is particularly critical for instructors to build a sense of community into their classrooms while also being as ‘present’ as possible by responding quickly to student concerns and including welcome and instructional videos. An effective online teacher should also organize opportunities for students to engage with one another through well-organized discussion boards and collaborative learning activities such as VoiceThread group assignments, research projects, videos, or wikis.
Limitations
The sample size of this study was relatively small, with a low response rate. The low response rate was likely due to the timing of the survey distribution, near the end of the semester, when many students may feel too busy and overwhelmed to complete a survey. We did distribute the survey three times and feel that the information provided still represents a meaningful sample. Respondents also included a higher proportion of non-traditional students, which may indicate the type of student more likely to complete a voluntary survey.
Future Directions
The results of this study indicate a need to further examine perceptions of online learning and teaching. As many students indicated the importance of having solid self-regulatory behaviors, it may be beneficial to examine this further using self-determination theory (Deci & Ryan, 1985). Further research should also examine faculty perspectives on the learning environments of their students.
Conclusion
Picciano’s (2017) model can help instructors develop meaningful online courses that maximize learning while minimizing many of the negatives and concerns identified in the results of this study, such as isolation, limited computer literacy, and poor self-regulatory behaviors. Instructors should design their courses to promote learning using a variety of means, including some or all of the following: text, video, and/or audio media; collaborative and student-generated content; dialectic questioning; evaluation; and reflection, to maximize the student experience in an online course.
References
Bhagat, K., Wu, L., & Chang, C. (2019). The impact of personality on students' perceptions towards online learning. Australasian Journal of Educational Technology, 35(4), 98-108.
Cohan, D. J. (2019). The trouble with online education. Retrieved from https://www.psychologytoday.com/us/blog/social-lights/201907/the-trouble-online-education.
Cox, M. D. (2004). Introduction to faculty learning communities. New Directions for Teaching and Learning, 97, 5-23.
Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behaviour. New York: Plenum.
Fontana, A., & Frey, J. H. (2005). Interviewing: The art of science. In Denzin, N. K., & Lincoln, Y. S. (Eds.), Handbook of qualitative research (pp. 361-375). Thousand Oaks, CA: Sage.
Jaschik, S., & Lederman, D. (2014). The 2014 inside higher ed survey of faculty attitudes on technology. Retrieved from https://www.insidehighered.com/system/files/media/IHE-FacTechSurvey2014%20final.pdf.
Mayer, R.E. (2009). Multimedia learning (2nd edition). New York: Cambridge University Press.
Picciano, A.G. (2017). Theories and frameworks for online education: Seeking an integrated model. Online Learning, 21(3), 166-190. doi: 10.24059/olj.v21i3.1225
Plano Clark, V., & Creswell, J. (2015). Understanding research: A consumer’s guide (2nd ed.). Boston, MA: Pearson.
Saldaña, J. (2009). The coding manual for qualitative researchers. Thousand Oaks, CA: Sage.
Strauss, A. L., & Corbin, J. M. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park, CA: Sage.
U.S. Department of Education, National Center for Education Statistics. (2018). Distance learning. Retrieved from https://nces.ed.gov/fastfacts/display.asp?id=80
U.S. Department of Education, National Center for Education Statistics. (n.d.) National undergraduates/definitions and data. Retrieved from https://nces.ed.gov/pubs/web/97578e.asp
Zhang, J., LeSavoy, B., Lieberman, L., & Barrett, L. (2014). Faculty learning community (FLC) on student leadership: Applying student voices to leadership development. Mountain Rise: The International Journal of the Scholarship of Teaching and Learning, 8(2), 1-18. doi: http://dx.doi.org/10.1234/mr.v8i2.250.
Zhang, J., Nam, Y., & Pelttari, C. (2016). Perspectives on educative teacher performance assessment (edTPA) from teacher candidates and college supervisors. Korean Journal of Teacher Education, 32(3), 29-58. doi: http://dx.doi.org/10.14333/KJTE.2016.32.3.29.
Zhang, J., & Pearlman, A. M. G. (2018). Expanding access to international education through technology enhanced collaborative online international learning (COIL) courses. The International Journal of Technology in Teaching and Learning, 14(1), 1-11.