Development of the Student Expectations of Online Learning Survey (SEOLS): A Pilot Study


Sandra M. Harris
Walden University
sandra.harris@waldenu.edu


Yvonne I. Larrier
Indiana University South Bend
ylarrier@iusb.edu

Marianne Castano-Bishop
Indiana University South Bend
cbishopm@iusb.edu

Abstract

The problem of attrition in online learning has drawn attention from distance education administrators and chief academic officers of higher education institutions. Many studies have addressed factors related to student attrition, persistence, and retention in online courses. However, few studies have examined how student expectations influence student retention and persistence in online learning. There is a need for a systematic method of addressing the relationship between student expectations and persistence in online education. This study investigated the reliability of the Student Expectations of Online Learning Survey (SEOLS) as a tool for assessing student expectations for elements of online courses. The pilot version of the survey contained 44 items distributed among six scales. The pilot study consisted of 17 students enrolled in online courses of a master’s level counseling program at a mid-sized Midwestern university in the United States. Results revealed acceptable to excellent reliability indices for the scales, ranging from α = .64 to α = .95. Data from the pilot study indicated that the SEOLS can be used to reliably assess student expectations of the online learning environment. The authors discuss uses of the instrument and implications for future research.

Introduction

Since the proliferation of desktop computers in the mid-1990s, the number of online courses offered in higher education has continued to rise. According to Allen and Seaman (2010), the number of enrollments in online courses rose from just over 1.6 million in fall 2002 (9.6% of total college enrollments) to over 4.6 million in fall 2008 (25.3% of total college enrollments). From fall 2003 to fall 2008, the average annual rate of growth in online enrollments was 16.7%, compared to an annual average growth rate of 1.5% for traditional courses (Allen & Seaman, 2010).

As the number of students taking online courses increases, distance education administrators and universities are challenged by lower retention rates in online courses compared to traditional courses (DiRamio & Wolverton, 2006; Liu, Gomes, Khan, & Yen, 2007). Research indicates that attrition rates for undergraduate online courses in the US range between 20% and 50% (Frankola, 2001). Attrition rates in online courses have been cited as 10-20% higher than in traditional face-to-face courses (Rovai, 2007). Recent work by Patterson and McFadden (2009) indicated that dropout rates in online courses may be six to seven times higher than in traditional courses. When a sample of chief academic officers (CAOs) was asked whether retention is harder in online courses than in traditional courses, the number of CAOs who responded yes was twice the number who responded no (28 percent vs. 13 percent).

A number of studies have addressed the internal and external factors that affect retention in online courses (Berge & Huang, 2004; Martinez, 2003; Swan, 2001). Few if any studies have focused on how student expectations affect student retention in online courses. Expectations are fundamental drivers of human behavior: the degree to which an individual’s expectations are met in a given situation affects that person’s choice of subsequent behavior. Therefore, the purpose of this pilot study was to assess the reliability of the Student Expectations of Online Learning Survey (SEOLS), which assesses students’ expectations of various elements of the online learning environment.

Literature Review

Theoretical Foundation

Expectancy theory is the theoretical foundation for this study. Expectancy theory provides a framework for explaining how future actions are predicated on the degree to which expected outcomes are met (Isaac, Zerbe, & Pitt, 2001). In the context of retention in online education, expectancy theory posits that the degree to which a student’s expectations are met in online courses influences whether the student will persist in taking online courses. When student expectations are consistent with their course experiences, students are more likely to persist in the online learning environment. Knowledge and understanding of student expectations, and of how those expectations affect student performance and persistence, is the first step in developing programs that help students form realistic expectations for online courses.

Factors Related to Retention in Online Courses

The low retention rate in online courses has sparked a great deal of research aimed at identifying the factors that contribute to student retention (Berge & Huang, 2004; Martinez, 2003; Swan, 2001).  A review of the literature revealed that the following six themes are related to student expectations of online courses:  proficiency in using technology, expectations of the course instructor, expectations of the course content, expectations for social interaction, expectations for course organization, and other personal variables. A brief summary of literature related to each of the six themes is presented below.

Proficiency with technology. Problems with technology are a primary reason that students drop out of online courses (Frankola, 2001). Research shows that novice online learners frequently underestimate the amount of technical skill needed to be successful in online courses (Carr, 2000; Fozdar, Kumar, & Kannan, 2006). Basic computer skills, such as use of word processing software, familiarity with email and the internet, and proficiency with the course delivery platform, are needed for a successful online learning experience. Other research (Nichols, 2010) shows that the availability of support services to assist novice learners in developing basic computer skills affects retention in online courses. Lack of support services is frequently cited as a reason why students withdraw from online courses (Fozdar et al., 2006; McGivney, 2004; Nichols, 2010).

Expectations of the course instructor. One panel of 20 experts identified expectations regarding overall instructional quality as a factor that affects student retention (Heyman, 2010). Instructional quality refers to issues such as the frequency of student-instructor interaction (Herbert, 2006), instructor presence in the course room, and instructor response time to student needs (Artino, 2008; Ni & Aust, 2008). Other research found that early introductions, prompt response to assignments, and frequent communication with the instructor were related to retention in online courses (Herbert, 2006; Nistor & Neubauer, 2010).

Expectations of course content. Additional research has indicated that student expectations regarding course demands and course content impact retention in online courses. Many novice online learners believe that online courses, compared to traditional courses, are easier (Nash, 2005) and require less time commitment (Pierrakeas, Xenos, Panagiotakopoulos, & Vergidis, 2004). Students who expect online courses to be easier are more likely to drop out of or fail their online courses when those courses turn out to be more difficult than anticipated (Nash, 2005). The relevancy of the course content is also a factor that affects student persistence in online courses. Students tend to be more active in courses when they find the material interesting or relevant to their daily lives (McGivney, 2004). Other research (Fisher & Baird, 2005) has shown that courses containing a large amount of inaccurate or irrelevant material detract from student learning and negatively affect course retention.

Expectations for social interactions. Past research has revealed a positive relationship between social interaction and student retention in online courses (Gaille, 2005). Distance learning students frequently report feelings of isolation as a prime reason for dropping out of distance learning courses and programs (Nash, 2005). Feeling connected and having a sense of affiliation are essential components of effective online learning environments (Rovai, 2003). According to Gaide (2004), social networks formed by students in online courses provide students with additional support and encouragement from their peers, especially when students do not receive such support and encouragement in their home or work environments.

Expectations regarding the course delivery system. Bocchi, Eastman, and Swift (2004) found that the course delivery system is an important element in the online learning experience. Students frequently report poor course design as a reason for dropping out of online courses (Frankola, 2001). The overall arrangement and organization of the course room can also facilitate or inhibit student learning. Nichols (2010) found that ease of course navigation and easy access to course content were directly related to student retention.

Personal variables. A number of personal variables affect student retention in online courses. For instance, a study by Qureshi, Morton, and Antonsz (2002) showed that nontraditional-age students with family and work responsibilities tend to be especially attracted to web-based courses. Nontraditional students also typically work full-time jobs (Taniguchi & Kaufman, 2007). Patterson and McFadden (2009) found that age was positively related to persistence in online courses. The competing demands of work, family, and coursework frequently present nontraditional-age students with time management challenges as they attempt to balance the forces competing for their time and energy (Frankola, 2001). A study by Park and Hee Jun (2009) revealed that student perceptions of support received from family and the work organization were related to student persistence in online courses. When students lack such support, it becomes even more important for them to develop supportive peer and mentoring relationships in the online environment. In addition, nontraditional students may face electronic and technological challenges due to less exposure to technology than their traditional-age counterparts (Rodriguez, Green, & Ree, 2003).

Method

This study employed a quantitative, non-experimental, exploratory research design. Leedy and Ormrod (2001) assert that quantitative research is applied in order to explain, authenticate, or validate relationships. The research was non-experimental because it was not possible to implement the random assignment of participants required in a true experimental design (Trochim, 2007). A purposive sampling scheme was used to recruit participants for the study. This sampling procedure is used when a researcher has a specific purpose for the research and is interested in specific groups (Trochim, 2007). We were specifically interested in developing an instrument that measures student expectations of online courses. The participants were enrolled in online courses at the time of the study, and we expected that they would therefore have developed expectations for the online learning environment. The use of purposive sampling was thus appropriate for this research.

Participants

The sample for this study consisted of 17 pre-service school and clinical mental health counseling students. Participants were registered in a master’s level counseling program at a mid-sized university located in the Midwestern region of the United States. An invitation letter along with a link to the survey was emailed to 43 counselor education graduate students during spring 2009 and spring 2010. The students were selected because they were enrolled in two online courses being offered at the university. Only 17 of the invited students completed the survey, for a response rate of 40%. The sample consisted of 82% females (n = 14) and 18% males (n = 3). The survey collected data for three age ranges: 50% of respondents were in the 21-29 age group, 11% were in the 30-39 age group, and 39% were in the 40-and-older age group.

Instrument Development

The SEOLS is a literature-based, researcher-developed instrument that assesses student expectations of the online learning environment. The instrument contains 44 items, each rated on a 5-point Likert-type scale (1 = strongly disagree to 5 = strongly agree). The items assigned to each scale are summated to yield a total scale score.
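For readers who wish to reproduce the scoring step, a minimal sketch is shown below. The data layout and item names are hypothetical, since the article does not specify a file format; only the summation rule comes from the text above.

```python
import pandas as pd

# Hypothetical layout: one row per respondent, one column per SEOLS item,
# each coded 1-5 (1 = strongly disagree ... 5 = strongly agree).
responses = pd.DataFrame({
    "tech_1": [5, 4, 5],
    "tech_2": [4, 4, 5],
    "tech_3": [5, 5, 5],
})

# Items assigned to a scale are summed to yield the total scale score.
tech_items = ["tech_1", "tech_2", "tech_3"]
responses["proficiency_total"] = responses[tech_items].sum(axis=1)
print(responses["proficiency_total"].tolist())  # [14, 13, 15]
```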

Items on the SEOLS pertain to variables that previous researchers (Berge & Huang, 2004; Simpson, 2004) have identified as having an impact on student retention in online courses. A summary of that research is presented in the literature review section of this paper. The six scales of the SEOLS were derived from the six major themes found in the literature (proficiency in using technology, expectations of the course instructor, expectations of the course content, expectations for social interaction, expectations for course organization, and other personal variables). Several statements were developed to address components of each theme.

Face and content validity of the SEOLS. Face validity refers to the degree to which items on an instrument appear to measure a given construct of interest (Kaplan & Saccuzzo, 2009). The face validity of the SEOLS was judged by two groups: a panel of students who had experience with online learning and a panel of faculty who had experience with teaching online. Each panel met as a separate focus group to discuss the structure and content of items in the survey. Both groups provided useful comments on the wording and restructuring of items to improve the overall readability of the SEOLS.

Content validity refers to expert judgment regarding how well items on an instrument sample the area or domain of interest (Kaplan & Saccuzzo, 2009). The panel of faculty who had experience with teaching online indicated that the SEOLS had adequate content validity in that it contained a representative sample of items that have an impact on student expectations of the online learning environment. Feedback from the two focus groups resulted in a 44-item survey that contained six scales. A description of each scale is presented in the results section of this article. A copy of the original instrument is presented in Appendix A.

Reliability of the SEOLS. Survey research requires that evidence be provided regarding the reliability of an instrument (Kaplan & Saccuzzo, 2009). Reliability refers to the homogeneity of an instrument, or how well items on an instrument measure a unitary construct (Gregory, 2007). Reliability is frequently assessed with Cronbach’s alpha (Cohen & Cohen, 1988; Trochim, 2007) because it is an “estimate of the major source of measurement error, sets the upper limits of reliability, [and] provides the most stable estimate of reliability” (Westhuis & Thyer, 1989, p. 157). The significance of each obtained alpha was judged against the value of α = .70 because previous researchers (Kaplan & Saccuzzo, 2009; Mertler & Vannatta, 2005) suggest that values of .70 or greater indicate an internally consistent scale.
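Cronbach’s alpha can be computed directly from a respondents-by-items score matrix using the standard variance-based formula. The sketch below is a generic illustration with toy data, not the authors’ SPSS procedure.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Toy data: 5 respondents x 3 items on a 1-5 Likert scale
scores = np.array([[5, 4, 5],
                   [4, 4, 4],
                   [2, 3, 2],
                   [5, 5, 4],
                   [3, 3, 3]])
print(round(cronbach_alpha(scores), 2))  # ~0.93
```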

An item analysis was also performed to assess the internal consistency of single items as they relate to the homogeneity of the scales to which the items were assigned (Thorndike, 1957). The item analysis was conducted by inspecting the item-total correlations for each item in a scale. Items with item-total correlations of .25 or higher were retained on the survey. This value was chosen because it represents the critical value of r with alpha set at .01 and df = 100 (Ary, Jacobs, & Razavieh, 1996).
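The item analysis can be approximated by correlating each item with the sum of the remaining items in its scale. The article does not state whether corrected or uncorrected item-total correlations were used; the corrected form is shown here as an assumption, with the same toy data as above.

```python
import numpy as np

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Correlation of each item with the sum of the other items in its scale."""
    k = items.shape[1]
    r = np.empty(k)
    for j in range(k):
        rest = np.delete(items, j, axis=1).sum(axis=1)  # sum of remaining items
        r[j] = np.corrcoef(items[:, j], rest)[0, 1]
    return r

scores = np.array([[5, 4, 5], [4, 4, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3]])
r_it = corrected_item_total(scores)
flagged = np.where(r_it < 0.25)[0]  # items falling below the retention threshold
print(np.round(r_it, 2), flagged)
```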

Results and Discussion

Results from the reliability analysis revealed that the scales had adequate to excellent internal consistency. Table 1 presents a summary of the results. The alpha coefficients ranged from a low of .64 to a high of .95. The F-test represents the comparison of the obtained coefficient alpha against the test value of .70. Results indicated that five of the six scales had alphas that significantly exceeded the test value of .70; those five scales were considered highly reliable. A summary of the results from the reliability and item analyses for each scale is presented below. Appendix A contains a copy of the original version of the SEOLS.

Table 1
Psychometric Properties for Scales of the SEOLS

Scale                                    n     α   95% CI          F    df1   df2    Sig
                                                   Lower  Upper
Proficiency with technology              7   .95    .90    .98   6.12    16    96   .000
Expectations of the course instructor    9   .92    .84    .97   3.68    16   128   .000
Expectations of the course content       7   .90    .80    .96   9.93    15    96   .001
Expectations for social interaction      5   .86    .72    .94   2.15    16    64   .016
Expectations for course design           4   .95    .89    .98   5.50    16    48   .000
Personal variables                       4   .64    .25    .86    .84    16    48   .638

Note. n = number of items in each scale; F tests the obtained alpha against the criterion value of .70.
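The F statistics, p-values, and confidence intervals in Table 1 are consistent with a Feldt-type test of an observed alpha against the fixed criterion of .70, where F = (1 − α₀)/(1 − α̂) with df1 = n − 1 and df2 = (n − 1)(k − 1). The sketch below reproduces the Table 1 row for the social interaction scale under that assumption; the exact procedure the authors used is not documented in the article.

```python
from scipy.stats import f

def alpha_vs_criterion(alpha_hat, n_respondents, k_items, alpha0=0.70, level=0.95):
    """Feldt-type test of H0: alpha = alpha0, with a confidence interval for alpha."""
    df1 = n_respondents - 1
    df2 = df1 * (k_items - 1)
    F = (1 - alpha0) / (1 - alpha_hat)
    p = f.sf(F, df1, df2)                             # one-tailed p-value
    q = (1 + level) / 2
    lower = 1 - (1 - alpha_hat) * f.ppf(q, df1, df2)  # CI lower bound
    upper = 1 - (1 - alpha_hat) / f.ppf(q, df2, df1)  # CI upper bound
    return F, p, (lower, upper)

# Social interaction scale from Table 1: alpha = .86, n = 17 respondents, k = 5 items
F, p, ci = alpha_vs_criterion(0.86, 17, 5)
print(round(F, 2), round(p, 3), tuple(round(b, 2) for b in ci))  # ~2.14, ~.016, (.72, .94)
```

The small differences from the tabled F values (e.g., 2.14 vs. 2.15) reflect rounding of the reported alphas.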

Proficiency with technology. This scale initially contained 14 items that addressed proficiency in basic computer skills such as using email, the internet, and word processing software. The initial reliability analysis generated an alpha coefficient of .64. A review of the item analysis revealed several poorly performing items, and an evaluation of those items resulted in seven items being removed from the scale. Item 8 was separated into four individual items that were used to create the Proficiency with the Course Delivery System scale, which assesses student proficiency in using basic features of the online courseroom. Items 9-14 were removed because further analysis revealed that they did not pertain to proficiency with basic computer skills. Removing these seven items resulted in a 7-item scale and increased the obtained coefficient alpha to .95.
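Poorly performing items of the kind described above are commonly identified by recomputing alpha with each item removed in turn. The sketch below illustrates that step with toy data; it is a generic alpha-if-item-deleted routine, not the authors’ exact output.

```python
import numpy as np

def cronbach_alpha(items):
    k = items.shape[1]
    return (k / (k - 1)) * (1 - items.var(axis=0, ddof=1).sum()
                            / items.sum(axis=1).var(ddof=1))

def alpha_if_deleted(items):
    """Alpha recomputed with each item removed in turn."""
    return np.array([cronbach_alpha(np.delete(items, j, axis=1))
                     for j in range(items.shape[1])])

# Toy data: the fourth item runs against the others, so dropping it raises alpha.
scores = np.array([[5, 4, 5, 1],
                   [4, 4, 4, 5],
                   [2, 3, 2, 4],
                   [5, 5, 4, 2],
                   [3, 3, 3, 5]])
print(np.round(alpha_if_deleted(scores), 2))  # a large rise flags a poor item
```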

The obtained spread of scores for this scale was 28-35, with M = 32.82 and SD = 2.86. The minimum possible score was 7 and the maximum was 35. The data indicated that this sample of participants had a fairly high level of proficiency with technology. Many variables could account for this finding, such as the number of online courses taken, the age of the participants, or the participants’ familiarity with computer technology outside of the classroom.

Expectations for the course instructor. This 9-item scale assessed student expectations regarding instructor performance and communication in the online courseroom. The obtained coefficient alpha was .92. The item analysis revealed that the item-total correlations for all items exceeded the minimum test value of .25. Several items (Items 16, 17, 18, and 22) were reworded to improve the consistency of wording across the scale.

The obtained spread of scores for this scale was 36-45, with M = 41.61 and SD = 3.52. The minimum possible score was 9 and the maximum was 45. The data indicated that this sample of participants had high expectations for the course instructor. The participants’ overall maturity as students or their experience in online courses may have influenced their responses to items on this scale.

Expectations of course content. This 7-item scale addressed student expectations regarding course content, such as the rigor of the course, opportunities for interaction, and the application of course material to real life. The obtained alpha for the scale was .90. The item analysis revealed that the item-total correlations for all items exceeded the minimum test value of .25. No modifications were made to this scale.

The obtained spread of scores for this scale was 22-35, with M = 30.67 and SD = 3.72. The minimum possible score was 7 and the maximum was 35. The data indicated that this sample of participants had fairly high expectations for the course content. This finding may be attributable to the fact that the participants were graduate students seeking advanced credentials who therefore expected the course to provide information that was meaningful and relevant to their profession.

Expectations for social interaction. This 5-item scale addressed student expectations for engaging in social interactions with others in the course. The obtained alpha for the scale was .86. The item analysis revealed one low-performing item, which was reworded to specifically address social interactions in the course room.

The obtained spread of scores for this scale was 12-20, with M = 20.53 and SD = 3.28. The minimum possible score was 5 and the maximum was 25. The data indicated that this sample of participants had high expectations for social interactions in the course room. The participants’ overall maturity as students or their experience in online courses may have influenced their responses to items on this scale.

Expectations for course design. This 4-item scale addressed design elements of the course such as the labeling of discussion forums and topics, the location of course materials, and the clarity of course instructions. The obtained alpha for the scale was .95. The item analysis revealed that the item-total correlations for all items exceeded the minimum test value of .25. The items were nevertheless reworded to better reflect student expectations for course design and organization.

The obtained spread of scores for this scale was 8-20, with M = 15.65 and SD = 3.90. The minimum possible score was 4 and the maximum was 20. The data indicated that this sample of participants had high expectations for course design and organization. The participants’ overall maturity as students or their experience with the design and organization of the learning management system used in their program may have influenced their responses to items on this scale.

Personal variables. This scale originally contained five items that addressed how personal variables such as support from family and friends, the home environment, and time management affect student performance in online courses. The initial obtained alpha for the scale was .44. The item analysis revealed that Item 41 was negatively correlated with the other items in the scale. Removing Item 41 resulted in an alpha of .64. Although this value represented only marginally acceptable reliability, the authors chose to reword the items to improve the consistency of the wording. One question was divided into two separate items. The modifications resulted in a revised scale consisting of six items.

The obtained spread of scores for this scale was 8-20, with M = 15.94 and SD = 2.35. The minimum possible score was 4 and the maximum was 20. The data indicated that this sample of participants had high expectations regarding the role of personal variables in their online course experiences. The participants’ overall maturity as students or their personal life circumstances may have influenced their responses to items on this scale.

Discussion and Conclusion

Results from the pilot study revealed that while the initial version of the SEOLS collected reliable data regarding student expectations of the online learning environment, there was room for improving the questionnaire. Results from the pilot study were used to revise the instrument; the specific revisions were noted in the results section of this article. Several items were deleted from the SEOLS, and several items were reworded to improve the consistency of the wording throughout the survey. The major revision to the survey was the addition of a 4-item scale that addresses student proficiency with the course delivery system. The revised SEOLS consists of 43 items distributed across seven scales. Appendix B contains a copy of the revised instrument.

The consistently high average scores across the six scales indicate that the participants may have had high expectations for their online learning experience. A number of variables could have contributed to the high scores; level of experience in online courses, age of the participants, or educational level may have shaped the students’ expectations of their online experience. The small sample size precluded any meaningful comparative analysis on the demographic variables of the participants.

Implications for Future Research and Practice

Additional studies should use the SEOLS in conjunction with other measures to study the relationships among individual expectations, course performance, course completion, retention, and degree completion in online programs. Future studies could also investigate how other variables, such as age, experience in online courses, and degree being sought, affect expectations of the online learning environment. Data from the SEOLS, combined with other relevant information, could also be used to identify students who may be at risk of dropping out due to unrealistic expectations of online learning.

Distance education administrators, course designers, and course instructors could use information from the SEOLS to develop strategies for addressing student expectations of online courses. If distance education administrators understand how students’ expectations of online courses shape their persistence in their coursework, administrators would have a conceptual basis for developing intervention and remediation programs that take those expectations into account. Administrators could require faculty and staff to include in orientation sessions information regarding the need for students to examine their personal beliefs and expectations about online learning. Academic advisors should be encouraged to communicate to students how personal expectations of online learning influence performance in online courses; advisors could help students develop realistic expectations of the demands and requirements of online courses. Course instructors could develop and implement course discussions that address individual expectations of online learning and encourage students to talk about how their expectations influence their performance. More experienced online learners could provide support, encouragement, and suggestions to novice learners on how to align personal expectations with the actual demands of online learning.

There were several limitations of this study that could be improved upon with subsequent research. The first limitation of the study was the small sample size. Additional studies with larger samples are needed to further assess the psychometric properties of the SEOLS.  A larger sample of responses is also needed to assess the construct validity of the SEOLS through the use of advanced statistical procedures such as structural equation modeling.

Another limitation of the study was the restricted range of demographic characteristics of the participants. The sample consisted of graduate students from one geographic area of the US. Additional studies are needed to extend the generalizability of the SEOLS to a broader sample of online learners, including undergraduate students and students from different geographic areas of the US. Other studies could compare students taking face-to-face courses with students taking online courses to determine whether student expectations differ between online and traditional classroom-based courses. Future studies could also assess whether personal expectations have a differential effect in online courses compared to classroom-based courses.

Institutions of higher education are increasingly being tasked by stakeholders such as the federal government, governing boards, and other legislative bodies to demonstrate their effectiveness at graduating students in a timely and efficient manner (Patterson & McFadden, 2009). Findings from this study contribute to the existing literature by providing distance educators with an additional tool for investigating variables related to retention in online courses and programs. We provided evidence of a reliable instrument, the SEOLS, which can be used to quantitatively assess the relevance of expectancy theory to student retention in online courses. It is reasonable to assume that if students’ expectations regarding online education are met, those students are more likely to persist in and graduate from online programs.


References

Allen, I. E., & Seaman, J. (2010). Learning on demand: Online education in the United States, 2009. Needham, MA: Sloan Center for Online Education.

Artino, A. R. (2008). Promoting academic motivation and self-regulation: Practical guidelines for online instructors. TechTrends, 52(3), 37-45.

Ary, D., Jacobs, L. C., & Razavieh, A. (1996). Introduction to research in education (5th Ed.) Fort Worth:  Harcourt Brace.

Berge, Z., & Huang, Y. (2004). A model for sustainable student retention: A holistic perspective on the student dropout problem with special attention to e-learning. DEOSNEWS, 13(5).

Bocchi, J., Eastman, J., & Swift, C. (2004). Retaining the online learner: Profile of students in an online MBA program and implications for teaching them. Journal of Education for Business, 79(4), 246-253.

Carr, S. (2000, February 11). As distance education comes of age, the challenge is keeping the students. The Chronicle of Higher Education. Retrieved June 11, 2009, from http://chronicle.com/free/v46/i23/23a00101.htm

Cohen, J., & Cohen, P. (1988). Applied multiple regression/correlation analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum.

DiRamio, D., & Wolverton, M. (2006). Integrating learning communities and distance education: Possibility or pipedream? Innovative Higher Education, 32(2), 99-113.

Fisher, M., & Baird, D. E. (2005). Online learning design that fosters student support, self-regulation, and retention. Campus-Wide Information Systems, 22(2), 88-107.

Fozdar, B. I., Kumar, L. S., & Kannan, S. (2006, December). A survey of a study on the reasons responsible for student dropout from the bachelor of science programme at Indira Gandhi National Open University. The International Review of Research in Open and Distance Learning, 7(3). Retrieved June 2, 2009, from http://www.irrodl.org/index.php/irrodl/article/view/291/747

Frankola, K. (2001). Why online learners drop out. Workforce, 80(10), 53-59.

Gaide, S. (2004, October 15). Best practices for helping students complete online degree programs. Distance Education Report, 8(20), 8.

Gaille, K. (2005). Student attrition before and after modifications in distance course delivery. Studies in Learning, Evaluation, Innovation, and Development, 2(3), 69-76.

Gregory, R. J. (2007). Psychological testing: History, principles, and applications. (5th ed.). Boston, MA: Pearson/Allyn-Bacon.

Herbert, M. (2006). Staying the course: A study in online student satisfaction and retention. Online Journal of Distance Learning Administration, 9(4). Retrieved September 15, 2011, from State University of West Georgia, Distance Education Center Website: http://www.westga.edu/%7Edistance/ojdla/spring61/miller61.htm

Heyman, E. (2010). Overcoming student retention issues in higher education online programs. Online Journal of Distance Learning Administration, 8(4). Retrieved September 15, 2011, from State University of West Georgia, Distance Education Center Website: http://www.westga.edu/%7Edistance/ojdla/spring61/miller61.htm

Isaac, R. G., Zerbe, W. J., & Pitt, D. C. (2001). Leadership and motivation: The effective application of expectancy theory. Journal of Managerial Issues, 8(2), 212-226.

Kaplan, R. M., & Saccuzzo, D. P. (2009). Psychological testing: Principles, applications, and issues (7th ed.). Belmont, CA: Wadsworth.

Leedy, P. D., & Ormrod, J. E. (2001). Practical research: Planning and design (7th ed.). Upper Saddle River, NJ: Merrill Prentice Hall.

Liu, S., Gomes, J., Khan, B., & Yen, C. (2007). Toward a learner-oriented community college online course dropout framework. International Journal on E-Learning, 6(4), 519-542.

Martinez, M. (2003).  High attrition rates in e-learning: Challenges, predictors and solutions.  The eLearning Developers Journal. Retrieved from http://www.elearningguild.com.

Mertler, C. A., & Vannatta, R. A. (2005). Advanced and multivariate statistical methods (3rd ed.). Glendale, CA: Pyrczak Publishing.

McGivney, V. (2004, February). Understanding persistence in adult learning. Open Learning, 19(1), 34-46.

Nash, R. D. (2005). Course completion rates among distance learners: Identifying possible methods to improve retention. Online Journal of Distance Learning Administration. Retrieved May 21, 2009, from http://www.westga.edu/~distance/ojdla/winter84/nash84.htm

Ni, S.-F., & Aust, R. (2008). Examining teacher verbal immediacy and sense of classroom community in online classes. International Journal on E-Learning, 7(3), 477-498.

Nichols, M. (2010). Student perceptions of support services and the influence of targeted interventions on retention in distance education. Distance Education, 31(1), 93-133.

Nistor, N., & Neubauer, K. (2010). From participation to dropout: Quantitative participation patterns in online university courses. Computers & Education, 55(2), 663-672.

O'Lawrence, H. (2007). An overview of the influences of distance learning on adult learners. Journal of Education and Human Development [On-line], 1(1). Retrieved on August 5, 2008, from http://www.scientificjournals.org/journals2007/articles/1041.htm

Park, J., & Hee Jun, C. (2009). Factors influencing adult learners' decision to drop out or persist in online learning. Journal of Educational Technology & Society, 12(4), 207-217.

Patterson, B., & McFadden, C. (2009). Attrition in online and campus degree programs. Online Journal of Distance Learning Administration, 12(2).

Pierrakeas, C., Xenos, M., Panagiotakopoulos, C., & Vergidis, D. (2004, August). A comparative study of dropout rates and causes for two different distance education courses. International Review of Research in Open and Distance Learning, 5(2).

Qureshi, E., Morton, L. L., & Antonsz, E. (2002). An interesting profile: University students who take distance education courses show weaker motivation than on-campus students. Online Journal of Distance Learning Administration, 5(4).

Rodriguez, R. E., Green, M. T., & Ree, M. J. (2003). Leading Generation X: Do the old rules apply? Journal of Leadership and Organizational Studies, 9, 67-75.

Rovai, A. P. (2003). In search of higher persistence rates in distance education online programs. The Internet and Higher Education, 6, 1-16.

Swan, K. (2001). Virtual interaction: Design factors affecting student satisfaction and perceived learning in asynchronous online courses. Distance Education 22(2), 306-331.

Taniguchi, H., & Kaufman, G. (2007). Belated entry: Gender differences and similarities in the pattern of nontraditional college enrollment. Social Science Research, 36, 550-568.

Thorndike, R. L. (1957). Reliability. In E. F. Lindquist (Ed.), Educational measurement (pp. 560-620). Washington, DC: American Council on Education.

Trochim, W. M. K., & Donnelly, J. P. (2007). Research methods knowledge base (3rd ed.). Mason, OH: Thomson.

Westhuis, D., & Thyer, B. A. (1989). Development and validation of the clinical anxiety scale: A rapid assessment instrument for empirical practice. Educational and Psychological Measurement, 49, 153-163.


Appendix A
Student Expectations of Online Learning Survey (SEOLS)

Instructions: You are being invited to take part in an online survey regarding your expectations about online learning. Your input will be very valuable to us as we try to gain a further understanding of how student expectations of the online learning environment affect their participation in online courses and programs. The survey contains 44 questions and should take approximately 5-10 minutes to complete. Participation is voluntary and you have the right to end your participation at any time. Participation is also anonymous; at no time will you be asked your name or other identifying information. Please answer each item as honestly as possible, and please respond to all items on the survey.

  1. Proficiency with technology
    1. I am proficient in using a computer on my own.
    2. I am proficient in using a word processing software program like Microsoft Word on my own.
    3. I am proficient in using email on my own.
    4. I am proficient in attaching files to email messages on my own.
    5. I am proficient in using the internet on my own.
    6. I am proficient in doing internet searches for personal reasons on my own.
    7. I am proficient in doing internet searches for school work on my own.
    8. At my current level of proficiency, I can effectively use the following Oncourse CL options:
      • Messages
      • Dropbox
      • Discussion forums
      • Resources
    9. I feel that as I continue to use the computer and the internet, my ability to perform well in this course will improve.
    10. I am proficient in performing basic computer software troubleshooting.
    11. I am proficient in performing basic technical problems (hardware) troubleshooting.
    12. As a result of this course, I hope my computer skills will improve.
    13. I expect my instructor to give me class time to become familiar with the computer.
    14. I expect my instructor to give me class time to become familiar with navigating Oncourse CL.

    15.  Expectations of the course instructor
    16. I expect the course instructor to be clear in communicating the goals of the course.
    17. I expect the course instructor to be clear in communicating expectations of me. 
    18. I expect that the course requirements will be posted within an agreed upon time.
    19. I expect that the assignment feedback will be delivered to me in a constructive manner.
    20. I expect the course instructor to have a consistent presence in the discussion forums.
    21. I expect the course instructor to promote a supportive online learning environment.
    22. I expect the course instructor to have an appropriate online tone.
    23. I expect the course instructor to be responsive to students’ online tone in all communication formats.
    24. I expect the course instructor to provide contact information to students.

    25. Expectations of the course content
    26. I expect this online course to be as rigorous as face to face courses.
    27. I expect this online course to provide me with opportunities for active learning.
    28. I expect this online course to provide me with opportunities for large group discussion.
    29. I expect this online course to provide me with opportunities for small group discussion.
    30. I expect this online course to provide me with opportunities for self- reflection.
    31. I expect this online course to provide me with opportunities to relate theory to real life.
    32. I expect this online course to require substantial and thoughtful postings and discussions from students.

    33. Expectations for Social Interaction
    34. I expect this online course to provide me with opportunities to meet new people.
    35. I expect my classmates to be respectful. 
    36. I expect that online interactions with my classmates will be as frequent as face to face interactions.
    37. I expect to have as many opportunities to get to know my classmates online as I would face to face.
    38. I expect to feel positive about interacting online.

    39. Expectations toward Course Organization
    40. Oncourse CL was user friendly.
    41. The forum names and topic titles are unambiguous.
    42. The course materials were easy to locate.
    43. The course instructions were clear and unambiguous.

    44. Expectations towards Time Management and Convenience
    45. I feel concerned that I may not manage my time well.
    46. I am an independent learner.
    47. I feel that this online course provides me with flexibility to complete the course requirements.
    48. I am confident that my family members and friends will be supportive.
    49. My home environment is conducive to getting my coursework completed.

    Appendix B
    Student Expectations of Online Learning Survey Revised (SEOLS-R)

    Instructions: You are being invited to take part in an online survey regarding your expectations about online learning. Your input will be very valuable to us as we try to gain a further understanding of how student expectations of the online learning environment affect their participation in online courses and programs. The survey contains 43 questions and should take approximately 5-10 minutes to complete. Participation is voluntary and you have the right to end your participation at any time. Participation is also anonymous; at no time will you be asked your name or other identifying information. Please answer each item as honestly as possible, and please respond to all items on the survey.

    Proficiency with Technology

    1. I am proficient in using a computer on my own.
    2. I am proficient in using a word processing software program like Microsoft Word on my own.
    3. I am proficient in using email on my own.
    4. I am proficient in attaching files to email messages on my own.
    5. I am proficient in using the internet on my own.
    6. I am proficient in doing internet searches for personal reasons on my own.
    7. I am proficient in doing internet searches for school work on my own.
    8. Expectations for the Online Instructor

    9. I expect the course instructor to be clear in communicating the goals of the course.
    10. I expect the course instructor to be clear in communicating expectations of me. 
    11. I expect the course instructor to post course requirements within an agreed upon time.
    12. I expect the course instructor to provide constructive feedback on assignments.
    13. I expect the course instructor to have a consistent presence in the discussion forums.
    14. I expect the course instructor to promote a supportive online learning environment.
    15. I expect the course instructor to have an appropriate online tone.
    16. I expect the course instructor to be responsive to students’ tone in the course room.
    17. I expect the course instructor to provide instructor contact information to students.
    18. Expectations about Course Content

    19. I expect this online course to be as rigorous as face to face courses.
    20. I expect this online course to provide me with opportunities for active learning.
    21. I expect this online course to provide me with opportunities for large group discussion.
    22. I expect this online course to provide me with opportunities for small group discussion.
    23. I expect this online course to provide me with opportunities for self- reflection.
    24. I expect this online course to provide me with opportunities to relate theory to real life.
    25. I expect this online course to require thoughtful discussion postings from students.
    26. Expectations about Social Interaction

    27. I expect this online course to provide me opportunities to meet new people.
    28. I expect peer comments to be made in a respectful manner. 
    29.  I expect that online interactions with my classmates will be as frequent as face to face interactions.
    30. I expect to have as many opportunities to get to know my classmates online as I would face to face.
    31. I expect to feel positive about online interaction with my peers.
    32. Expectations about Course Navigation

    33. I expect the course delivery system to be easy to navigate.
    34. I expect the course forum names to be clearly stated.
    35. I expect the course topic titles to be clearly stated.
    36. I expect the course materials to be easy to locate.
    37. I expect the course instructions to be clearly stated.
    38. Facilitators associated with successful online learning

    39. I feel that effective time management will enable me to succeed in this course.
    40. I feel that being an independent learner will enable me to succeed in this course.
    41. I feel that this online course provides me with flexibility to succeed in this course.
    42. I feel that having the support of my family will enable me to succeed in this course.
    43. I feel that having the support of my friends will enable me to succeed in this course.
    44. I feel that having a positive home environment will enable me to succeed in this course.
    45. Proficiency with the course delivery system

    46. I am proficient in using the “Message” feature in the courseroom.
    47. I am proficient in using the “Dropbox” feature in the courseroom.
    48. I am proficient in using the “Discussion Forum” feature in the courseroom.
    49. I am proficient in using the “Resource” feature in the courseroom.

    Online Journal of Distance Learning Administration, Volume XIV, Number V, Winter 2011
    University of West Georgia, Distance Education Center