Evaluating the Impact of a First-Year Experience on Student Success at a Distance Learning University


Kevin Folk
Global University
kfolk@globaluniversity.edu

Abstract

This study evaluated the impact of a First-Year Experience (FYE) program on student success at a small, private, faith-based distance learning university serving a predominantly nontraditional student population. Student data (N = 399) from two groups were compared to determine if the FYE made a difference in student success as measured by persistence and first-year grades. In addition, FYE students who completed the orientation course were surveyed (N = 50) to evaluate the extent to which the FYE achieved its objectives based on student perceptions. There were no significant differences between the two student groups, suggesting that the FYE did not have a significant impact on student success. However, survey results indicated that students generally agreed that the FYE achieved its objectives. The study contributes to the scant literature on evaluations of first-year interventions in distance learning environments, confirming the complexity involved with mitigating student dropout, and suggests areas of future research on pre-admission factors utilizing Rovai’s (2003) Composite Persistence Model.

Introduction

Online and distance education continues to grow in the U.S. In the fall of 2016, more than 6.2 million students enrolled in at least one distance education course (National Center for Education Statistics, 2018). Based on the total number of enrollments in higher education, more than one out of every four students engaged in some type of distance studies. While the growth rate of online and distance education has slowed somewhat in the past few years, distance enrollments continue to grow as overall enrollment in higher education continues to decrease (Allen, Seaman, Poulin, & Straut, 2016). Also remarkable is where the growth of distance education is currently taking place. While for-profit institutions suffered a double-digit decrease in distance enrollment, non-profit and public institutions experienced significant growth. Private non-profit institutions grew by 26% from 2012 to 2014 (Allen et al., 2016). In addition, nearly half of those students were enrolled exclusively in distance education courses (National Center for Education Statistics, 2018). Thus, approximately one out of every seven students in higher education in fall 2016 was a distance-only student.

Unfortunately, despite the continued growth of enrollments, retention rates for distance education programs are notoriously lower than those of traditional residential programs. In a large and robust study across the Virginia Community College System, Jaggars and Xu (2010) found that online students were more likely to fail or drop out than students in face-to-face courses. In a subsequent study of students in Washington State Community and Technical Colleges, Xu and Jaggars (2011) found similar results, though recent national studies suggested that institutional retention rates may vary widely (James, Swan, & Daston, 2016; Shea & Bidjerano, 2016). Additionally, multiple studies of fully online and distance programs have reported comparable findings (Boston, Ice, & Burgess, 2012; Boston, Ice, & Gibson, 2011; Willging & Johnson, 2009).

The growing body of research regarding student retention in online and distance education continues to inform institutional practice by highlighting the reasons for student dropout. That knowledge has led some distance education programs and institutions to follow the example of traditional, residential schools by offering a first-year program or orientation, though one tailored to the unique needs of distance learners. However, there is limited research regarding the evaluation of these programs. Such evaluation research would help bridge the theory and practice of student persistence in distance education by identifying what works to improve student retention and success. Referring generally to student retention in higher education, Tinto (2006-2007) acknowledged that, despite all the attention given to student retention, great gains have not been made in practice. The evidence from research and practice seems to confirm that the problem of retention is even more glaring in online and distance education.

The purpose of this study was to evaluate the impact of Global University's (GU) First-Year Experience (FYE) program on student success and add to the research in this neglected area. Even a slight increase in the number of students who complete their first course or two and enroll in another course has the potential to positively affect both the institution and its students. The literature has repeatedly demonstrated lower persistence rates for online and distance learning programs (Hart, 2012). While much is known about why students drop out, more research is necessary to evaluate the impact of practical interventions, like the GU FYE, that seek to increase persistence among distance learners. The following two research questions guided this study:

  1. Was there a difference in student success for FYE students as compared to non-FYE students, specifically persistence (i.e., completion of the orientation course and subsequent course reenrollment in the first year) and student cumulative grade average (CGA) for other first-year courses (excluding the orientation course)?
  2. To what extent did the FYE program achieve its objectives (i.e., orientation course learning outcomes and usefulness of the various program components) as perceived by FYE students?


Conceptual Framework

The conceptual/theoretical framework for this study is based on Rovai’s (2003) composite persistence model (CPM). Building upon the frameworks of both Tinto (1975, 1987, 1993) and Bean and Metzner (1985), Rovai proposed the CPM to more adequately explain the persistence of online and distance learners (see Figure 1). The variables in the CPM are organized around characteristics and skills prior to admission, and internal and external factors after admission. While several sets of factors are integrated from previous theoretical models, he proposed three additional sets of factors relevant to online and distance learning.

The first set of pre-admission factors proposed by Rovai (2003) includes relevant student skills that a distance learner needs to be successful: (a) computer literacy, (b) information literacy, (c) time management, (d) reading and writing, and (e) computer-based instruction. GU's first-year program aims to equip students with the learning skills necessary to succeed in a distance learning environment, mainly through a required orientation course. Rovai also proposed a set of post-admission internal factors related to the unique needs of online and distance students: (a) clarity of programs, (b) self-esteem, (c) identification with school, (d) interpersonal relationships, and (e) accessibility to services. Parts of the GU orientation course, the recommended first-year course sequence, and the assignment of a faculty advisor are intended to address these student needs. Finally, he added another set of internal factors related to pedagogy, emphasizing that learning and teaching styles must align in order to influence persistence positively. GU's orientation course specifically deals with the instructional design and methods of its distance learning materials.



Figure 1. Rovai’s (2003) Composite Persistence Model. Adapted from “In Search of Higher Persistence Rates in Distance Education Online Programs,” by A. P. Rovai, 2003, Internet and Higher Education, 6, p. 9. Copyright 2002 by Elsevier Science Inc.

In proposing his composite persistence model, Rovai (2003) argued that the answer to helping adult, distance learners persist is not simple. There are no easy answers or quick fixes to retention. Instead, he insisted that “comprehensive, multicomponent strategies are required” (p. 14). Part of those comprehensive strategies may be addressed through first-year courses and programs; however, it is imperative that these strategies be evaluated for effectiveness.

Literature Review

Student Retention Theory

Tinto’s (1975) theory of student departure has impacted both research and practice in higher education. However, some have found his model to be inadequate to explain dropout for nontraditional students and nontraditional deliveries. Bean and Metzner (1985) proposed a model of nontraditional student attrition, arguing that external factors have more influence on these students who are less connected to campus life. Subsequent research has found that an integration of the models from Tinto (1975, 1987, 1993) and Bean and Metzner (1985) may lead to a more comprehensive understanding of student dropout (Cabrera, Castañeda, Nora, & Hengstler, 1992).

Applying the research of Cabrera et al. (1992) and other relevant distance education research, Rovai (2003) proposed the Composite Persistence Model (CPM) to better explain persistence of distance students. He combined various factors and variables from the models of Tinto (1975, 1987, 1993) and Bean and Metzner (1985). However, noting that those models were inadequate to explain persistence in distance learning, he proposed three additional groupings of variables that would better explain the persistence of distance students.

Rovai’s (2003) model has not been empirically tested and validated to the degree of other influential models of student retention. However, as distance learning has continued to grow substantially, and institutions have increasingly added not only distance courses but also fully distance programs, the model has received more attention in the literature (Lee, Choi, & Kim, 2013; Packham, Jones, Miller, & Thomas, 2004; Park, 2007; Park & Choi, 2009). This is due in large part to the higher attrition rates of these courses and programs. Though many retention studies in online and distance learning have utilized the models of Tinto (1987, 1993), Bean and Metzner (1985), or some other similarly integrated model, researchers and practitioners seem to confirm, at least in part, Rovai’s proposition that there are characteristics and factors unique to distance learners that impact persistence.

Student Persistence in Distance Learning

While there are various measures of student success, persistence is a key measure (Boston et al., 2012; Hart, 2012). One of the difficulties, however, in reviewing the literature on student persistence in distance learning is the lack of consistency in terminology (Cauble, 2015; Hart, 2012). Some studies examine course completion or non-completion, while others investigate course completion with subsequent enrollment or program completion. Some of the difficulty in definitions may be due to the uniqueness of distance learning. Whereas it is common practice to measure persistence (or retention, from the institutional perspective) by the return or departure of the student from year one to year two, distance learning programs often are designed to give the student ultimate flexibility in time and progression. Thus, measuring persistence or retention in that flexible environment has resulted in varied terminology.

Rovai (2003) defined persistence as “the behavior of continuing action despite the presence of obstacles” (p. 1). That definition fits well with his persistence model, as it allows room for the many factors which may represent obstacles for a distance learner. Hart (2012), who conducted a review of key literature related to persistence in distance learning, described persistence as “a phenomenon resulting in student success or completion of an online course” (p. 20). Together, those two compatible definitions may offer a more robust description of persistence in distance learning as used in this study. Persistence is a behavior (student-focused and potentially responsive) and a phenomenon (emphasizing the complexity of factors) that results in student success (course-related, goal-related, program-related) despite various obstacles (individual characteristics, skills, internal/external factors).

While the body of research investigating persistence in online and distance learning continues to grow, studies confirm that the issue of persistence is complex (Boston et al., 2011; Boston et al., 2012; Hart, 2012; Willging & Johnson, 2009). Multiple factors, internal and external, influence the success and persistence of distance learners (Gering, Sheppard, Adams, Renes, & Morotti, 2018). In addition, persistence as a phenomenon is necessarily institution and program specific, making it more difficult to generalize across institutions. However, the research does suggest that institutions build research-based interventions to support the success and persistence of students (Hart, 2012; Willging & Johnson, 2009). Rovai (2003) proposed that his model may be used to design specific interventions to improve student persistence, especially when dealing with new or first-year students.

First-Year Interventions in Distance Learning

One strategy for promoting student success and persistence is the offering of a first-year program. Research has shown that the first year of college is a critical time for students (Pascarella & Terenzini, 2005; Tinto, 2012). Institutions have responded in recent decades by dedicating increased attention to the first year (Barefoot, 2000; Goodman & Pascarella, 2006). Several larger studies of traditional programs have reported the positive impact of first-year seminars on student persistence (Goodman & Pascarella, 2006; Mayhew, Rockenbach, Bowman, Seifert, & Wolniak, 2016; Pascarella & Terenzini, 2005). Permzadian and Credé (2016) conducted a meta-analysis of first-year seminars and found an overall small effect on student retention and first-year student grades. Further analysis indicated that program effectiveness varied significantly by institutional and seminar characteristics. Studies continue to report significant differences in student success and persistence, including long-term impact, for students who participated in first-year seminars in traditional settings (Swanson, Vaughan, & Wilkinson, 2017).

Studies of first-year interventions in distance learning environments, though a much smaller body of research, have also revealed positive impacts on student success and persistence. Most of these interventions are first-year orientations or student success courses tailored to the unique needs of distance and online learners (Ali & Leeds, 2009; Beyrer, 2010; Brewer & Yucedag-Ozcan, 2013; Clay, Rowland, & Packard, 2009; Pattison, 2004). However, one of the issues discussed in the literature relates to students not using support services, including orientations or success courses, especially if not required (Brown et al., 2015; Nash, 2005). Thus, some researchers have suggested requiring orientation or success courses for all students to maximize impact (Glazer & Murphy, 2015; Jones, 2013). In addition, other researchers have suggested that more comprehensive strategies and interventions may lead to improved student success (Maathuis-Smith et al., 2011; Nichols, 2010).

Composite models such as Rovai's (2003) offer better insight into the unique factors relevant to the persistence of distance learners. Studies, however, continue to demonstrate the complexity of factors involved with student decisions to persist. More limited research suggests that first-year interventions may positively impact student success and persistence for distance learners. First-year interventions that require an orientation and are more comprehensive may be most effective. As GU's FYE program employs a recommended first-year sequence, a required orientation course, and the assignment of a faculty advisor to enhance communication, the researcher hypothesized that the FYE program would improve student success (i.e., persistence and CGA) and be perceived as beneficial and useful by first-year students.

Methods

The study involved an impact evaluation of GU's FYE program. The goal of an impact evaluation is to assess the effects of a program or intervention (Henry, 2015). In order to evaluate the impact of GU's FYE program and answer the research questions, the investigation utilized quantitative methods (Creswell, 2013). To answer the first research question, the researcher employed an ex post facto research design (Cohen, Manion, & Morrison, 2013). This design is used when experimental or quasi-experimental studies may not be ethically appropriate or feasible, such as when evaluating the impact of an intervention, like the FYE program, on student success (Permzadian & Credé, 2016). To answer the second research question and evaluate the extent to which the FYE program achieved its objectives as perceived by students, a survey design was utilized (Creswell, 2013).

Setting

GU is a small (fewer than 5,000 degree-level students), private, faith-based institution dedicated exclusively to distance learning. GU serves international students as well as students from the U.S. by providing ministerial-focused degrees and programs. The focus of this study is on a subset of undergraduate students from the U.S. (more than 500 students in 2017) (Office of Research and Evaluation, 2018). These students complete GU courses primarily through independent study, though some are part of informal study groups (e.g., at a local church).

Like other Distance Learning Only Education Environments (DLOEEs) (York, 2014), GU confronts low course completion rates. The average course completion rate for U.S. students in 2016 was 70%; however, typical first courses for U.S. students tended to have lower than average completion rates, including GU’s former orientation course (pre-2017) with a 40% completion rate (Office of Research and Evaluation, 2017). In response, GU launched an FYE program in 2017 to address issues related to student retention and success. The FYE program includes three main components: (a) a mandatory one-credit orientation course, (b) assignment of a faculty advisor, and (c) a recommended first-year course sequence.

Participants

To answer the first research question, two different groups of GU students were compared (N = 399). The first group, or the treatment group, included all newly enrolled and reactivating independent-study students who were part of GU's FYE program in 2017. These students enrolled in the required one-credit orientation course, were assigned a faculty advisor, and were advised to follow the recommended first-year course sequence. The total population of the FYE group was 146 students. The comparison group, or the nontreatment group, included students with similar characteristics (see Table 1) from the two previous years (2015 and 2016) who did not participate in the FYE but did enroll in the previously offered, two-credit orientation course. The total population of the non-FYE group was 253 students.



To answer the second research question and determine to what extent the FYE program achieved its objectives based on the perceptions of FYE students, all students in the treatment group who completed the orientation course were surveyed (N = 50). Similar to the full FYE group, these completers were predominantly male (58%), married (65%), affiliated with the Assemblies of God (73%), with a mean age of 36 years (SD = 13.85).


Data Collection Tools

To answer the first research question, archival data related to persistence and student CGA were collected through reports run in GU's proprietary student information system (SIS). For the second research question, and to investigate to what extent the FYE program achieved its objectives as perceived by FYE students, the researcher designed a web-based survey instrument.

The survey included 39 items relating to the FYE program and its components that students rated on a 7-point Likert scale (1 = strongly agree to 7 = strongly disagree) (Fink, 2017). Several additional questions related to demographic and other information were also included. The 39 Likert scale items were assigned to 10 different subscales, including six subscales relating to the six learning outcomes of the orientation course: (a) synthesis of program and personal goals; (b) familiarity with GU course, procedures, and design; (c) personal learning styles and study habits; (d) understanding GU form and style; (e) aptitude and ministry skills assessment; and (f) GU program fit with personal goals. The four additional subscales related to the following experiences: (a) orientation course overall, (b) faculty advisor, (c) first-year course sequence, and (d) first-year experience overall.
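As an illustration only, the sketch below shows how responses on a 7-point scale of this kind could be aggregated into subscale scores (the mean of each subscale's member items). The item names and item-to-subscale mapping are hypothetical placeholders, not the study's actual instrument.

```python
# A minimal sketch of Likert subscale scoring, assuming hypothetical items.
import pandas as pd

# Hypothetical responses: rows = respondents, columns = survey items,
# values on the study's 1-7 scale (1 = strongly agree, 7 = strongly disagree).
responses = pd.DataFrame({
    "item_01": [1, 2, 2, 3],
    "item_02": [2, 2, 1, 3],
    "item_03": [1, 1, 2, 2],
    "item_04": [2, 3, 2, 4],
})

# Hypothetical mapping of items to two of the ten subscales.
subscales = {
    "program_personal_goals": ["item_01", "item_02"],
    "orientation_course_overall": ["item_03", "item_04"],
}

# Subscale score = mean of the member items for each respondent.
scores = pd.DataFrame(
    {name: responses[items].mean(axis=1) for name, items in subscales.items()}
)

# Summaries comparable to those reported later (mean, SD, median per subscale).
print(scores.agg(["mean", "std", "median"]).round(2))
```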

Due to the limited number of survey participants, the researcher did not pilot test the survey with any students from the target group. However, several steps were taken to establish the content validity of the survey items and subscales assigned (Creswell, 2013; Fink, 2017). A group of four scholarly experts, who were part of the dissertation committee, provided initial feedback on the survey instrument. In addition, a group of four practitioners from GU associated with the FYE program provided detailed feedback for each survey item and its relevance to the assigned subscale. Finally, a staff member who had recently completed the orientation course as a student pilot tested the survey and provided feedback.

Data Analysis

The data collected were analyzed using IBM SPSS (Field, 2013). The researcher used descriptive statistics to summarize persistence, including course completion frequency/percentage for the orientation course and the frequency/percentage of students who had a subsequent enrollment in at least one course in the first year, and CGA for first-year courses (excluding the orientation course) for each group. For the dependent variables related to persistence, completion of the orientation course and subsequent enrollment in at least one first-year course, separate chi-square tests determined whether there was a difference between the two groups (Field, 2013). For the dependent variable, student CGA for other first-year courses (excluding the orientation course), an independent samples t-test determined whether there was a difference between the two groups (Field, 2013). The minimum significance level for these tests was set at p < .05. In addition, data collected from the FYE survey were analyzed with descriptive statistics to summarize scores for all 10 subscales. Cronbach's alpha was also calculated for each separate subscale (Creswell, 2013; Fink, 2017).
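As a hedged illustration of this analysis plan, the sketch below performs a chi-square test of association and an independent samples t-test in Python with SciPy rather than SPSS. The contingency counts and grade arrays are placeholders for demonstration, not the study's data.

```python
# Sketch of the analysis plan: chi-square for persistence, t-test for CGA.
# All numbers below are illustrative placeholders, not the study's data.
import numpy as np
from scipy import stats

# 2x2 contingency table: rows = FYE vs. non-FYE group,
# columns = completed vs. did not complete the orientation course.
completion_table = np.array([[50, 96],
                             [95, 158]])
chi2, p, dof, expected = stats.chi2_contingency(completion_table)
print(f"Orientation completion: chi2({dof}, N = {completion_table.sum()}) "
      f"= {chi2:.2f}, p = {p:.2f}")

# Independent samples t-test on first-year CGA (excluding the orientation course).
fye_cga = np.array([88.0, 85.5, 91.0, 86.0, 89.5, 84.0])
non_fye_cga = np.array([87.0, 90.0, 88.5, 86.5, 89.0, 91.5])
t, t_p = stats.ttest_ind(fye_cga, non_fye_cga)
print(f"CGA: t({fye_cga.size + non_fye_cga.size - 2}) = {t:.2f}, p = {t_p:.2f}")
```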

Results

Student Success

To address the first research question as to whether there was a difference in student success for FYE students as compared to non-FYE students, two chi-square tests of association and one independent samples t-test were performed. Student success was measured by persistence (i.e., completion of the orientation course and subsequent course reenrollment in the first year) and student CGA for other first-year courses (excluding the orientation course).

Persistence. Participation in the FYE program was not significantly associated with completion of the orientation course, χ2 (1, N = 399) = 0.44, p = .51. FYE students completed the orientation course at a similar rate (34%) to that of non-FYE students (38%) (see Table 2 for more details). In addition, participation in the FYE program was not significantly associated with subsequent course reenrollment in the first year, χ2 (1, N = 145) = 0.78, p = .38. FYE students persisted at a similar rate (60%) to that of non-FYE students (67%) (see Table 2 for more details).


Student grades. Levene’s test for equality of variances was not significant (p > .05), indicating that the assumption of homogeneity of variance was met. On average, students who participated in the FYE program had a similar CGA (M = 87.7, SE = 1.68) compared to non-FYE students in the comparison group (M = 88.11, SE = 0.85) (see Table 3 for more details). The difference in CGA, -0.41, BCa 95% CI [-4.482, 3.363], was not significant, t(90) = -0.245, p = .81.
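For readers interested in how such an interval can be produced outside SPSS, the sketch below illustrates Levene's test and a bootstrapped confidence interval for the difference in mean CGA. The study reports a bias-corrected and accelerated (BCa) interval; this simplified illustration uses a percentile bootstrap, and the simulated grade arrays are placeholders rather than the study's data.

```python
# Sketch: homogeneity-of-variance check and a bootstrapped CI for the mean
# difference in CGA. Grade arrays are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
fye_cga = rng.normal(87.7, 11.0, 49)       # placeholder FYE grades
non_fye_cga = rng.normal(88.1, 9.0, 43)    # placeholder non-FYE grades

# Levene's test for equality of variances.
lev_stat, lev_p = stats.levene(fye_cga, non_fye_cga)
print(f"Levene's test: W = {lev_stat:.2f}, p = {lev_p:.2f}")

# Percentile bootstrap of the difference in means (FYE minus non-FYE).
boot_diffs = [
    rng.choice(fye_cga, fye_cga.size, replace=True).mean()
    - rng.choice(non_fye_cga, non_fye_cga.size, replace=True).mean()
    for _ in range(5000)
]
lower, upper = np.percentile(boot_diffs, [2.5, 97.5])
print(f"Mean difference = {fye_cga.mean() - non_fye_cga.mean():.2f}, "
      f"95% CI [{lower:.2f}, {upper:.2f}]")
```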



FYE Survey


To answer the second research question regarding the extent to which the FYE achieved its objectives based on student perceptions, a survey was distributed to students who completed the orientation course and participated in the FYE program in 2017 (N = 50). There were 24 total responses for a response rate of 48%. Twenty-three respondents answered the demographic questions as detailed in Table 4. In addition, these respondents indicated they had completed high school (n = 23), with 17 reporting their high school GPA (M = 3.5, range = 1.8 to 4.2).

The FYE survey contained 10 subscales. Cronbach’s alpha values ranged from .75 to .96, indicating that all subscales had at least acceptable internal consistency reliability (Field, 2013; Gliem & Gliem, 2003). The subscale means ranged from 1.83 to 2.86, which correspond to the scale points 1 = strongly agree, 2 = agree, and 3 = somewhat agree. The standard deviations ranged from 0.70 to 1.39, indicating that some patterns of responses were more disparate than others. Three subscales (first-year experience overall, faculty advisor, and GU program fit with personal goals) had noticeably higher standard deviations (1.34 to 1.39) and maximum scores (6.00 to 7.00), indicating greater variation in responses. All subscales had a minimum score of 1.00, with the maximum scores showing more variation; the first-year experience overall subscale had a maximum of 7.00, the highest point of the scale (7 = strongly disagree). Because the data showed higher levels of skewness, the median is a better measure of central tendency (Field, 2013). The medians ranged from 1.67 to 2.67 (see Table 5 for more details).
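As a brief illustration, the sketch below shows how Cronbach's alpha and the skew-sensitive summaries discussed above (median alongside mean and standard deviation) could be computed for a single subscale. The response matrix is a hypothetical placeholder, not the survey data.

```python
# Sketch: Cronbach's alpha and descriptive summaries for one hypothetical subscale.
import pandas as pd

# Hypothetical item responses (rows = respondents, columns = items, 1-7 scale).
items = pd.DataFrame({
    "item_1": [1, 2, 2, 3, 1, 2],
    "item_2": [2, 2, 1, 3, 1, 2],
    "item_3": [1, 1, 2, 4, 2, 3],
})

def cronbach_alpha(df: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = df.shape[1]
    item_variances = df.var(axis=0, ddof=1)
    total_variance = df.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

subscale = items.mean(axis=1)  # subscale score per respondent
print(f"alpha = {cronbach_alpha(items):.2f}")
print(f"mean = {subscale.mean():.2f}, SD = {subscale.std(ddof=1):.2f}, "
      f"median = {subscale.median():.2f}, skew = {subscale.skew():.2f}")
```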


To address the second research question regarding the extent to which the FYE program achieved its objectives as perceived by FYE students, an analysis of the subscales was performed. The first six subscales on the survey related to the six course learning outcomes for the FYE orientation course, Essentials of Learning at Global University. The lowest scoring subscales (indicating the strongest agreement) were synthesis of program and personal goals (median = 1.67) and familiarity with GU course, procedures, and design (median = 1.83), corresponding to strongly agree to agree. Three subscales, understanding GU form and style, aptitude and ministry skills assessment, and GU program fit with personal goals, each had a median score of 2.00 (corresponding to agree). The highest scoring subscale was personal learning styles and study habits (median = 2.50), indicating agree to somewhat agree. The subscale with the greatest variation in scores was GU program fit with personal goals (SD = 1.34, range of 1.0 to 6.0).

Four additional subscales related to overall experience and the effectiveness of the three main components of the FYE program. The lowest scoring subscales were orientation course overall, first-year course sequence, and first-year experience overall, each with a median score of 2.00 (corresponding to agree). The highest scoring subscale was faculty advisor (median = 2.67), indicating agree to somewhat agree. The two subscales with the greatest variation in scores were faculty advisor (SD = 1.39, range of 1.0 to 6.0) and first-year experience overall (SD = 1.39, range of 1.0 to 7.0).

Discussion

The goal of this study was to evaluate the impact of the GU FYE program on student success and contribute to the scant literature related to the evaluation of first-year interventions in distance education. The findings were mixed. Regarding whether there was a difference in student success for FYE students as compared to non-FYE students, no differences were found. These findings suggest that the FYE program did not have a significant impact upon student success. However, regarding the extent to which the FYE program achieved its objectives per student perceptions, FYE students scored in the "somewhat agree" to "agree" range for all subscales of the FYE survey. Thus, FYE students who completed the orientation course and participated in the survey generally agreed that the objectives were achieved.

Improving Student Success

The findings do not support the literature related to the impact of first-year interventions and orientations on student success in distance learning environments. Studies have reported the positive impact of interventions and orientations on student persistence (Clay et al., 2009; Glazer & Murphy, 2015; Jones, 2013; Nichols, 2010; Pattison, 2004) and student grades (Beyrer, 2010). It is noteworthy that the student population for the current study is more nontraditional (e.g., in mean age) than in the typical study. Likewise, it is worth pointing out that few studies reported statistical significance. While many studies mentioned increases in student success and persistence, which are clearly important to institutions, their results cannot rule out the possibility that those gains were due to chance. This is important because of the amount of time and money invested in the creation of first-year interventions. With limited funding and a general decline in enrollment, especially for private institutions, wise decisions related to the allocation of funds are more critical than ever. Impact evaluations offer institutions the opportunity to study the impact of programs and invest in the most effective strategies. This study makes a valuable contribution to scholarship by reporting the impact of an FYE program in a distance learning environment based on an evaluation utilizing inferential statistics. Yet, more needs to be known about the impact of first-year programs in other DLOEEs.

One reason for the lack of significant differences in student success variables between the FYE and non-FYE groups may be that both groups experienced an orientation course. The non-FYE students enrolled in a pre-existing two-credit orientation course. Results might differ if FYE students were compared with students who took no first-year course. Future studies should consider comparing a group experiencing an FYE program with a group without any type of intervention. While ex post facto designs may be appropriate for practical reasons, future studies could also consider engaging in more experimental designs where feasible. In addition, this study only measured impact on persistence (i.e., subsequent reenrollment) in the first year. Because of the flexibility in GU’s program, it is possible that students who had no additional enrollments in the first year may subsequently reenroll in the second year. Longitudinal studies would be necessary to determine any longer-term impact, such as persistence to the second year and degree completion rates.

A principal concern is that nearly two-thirds of first-year or reactivating students did not complete the one-credit orientation course. These students applied to the university, received degree audits, and enrolled in at least the orientation course, yet did not complete their first course(s). Boston et al. (2012) have suggested that the flexible, accessible nature of distance education may encourage more exploration from students as opposed to more traditional education environments. In other words, students are more willing to try out distance education. Considering GU’s open admissions policy and mission to provide access to ministry-related studies, it may well be that some students explore the option of GU as an open, flexible, economical alternative. High levels of noncompletion may be a consequence of such openness and accessibility. Some researchers even question the assumption that persistence is always positive and attrition always negative in these types of low-risk distance learning environments (Park, Boman, Care, Edwards, & Perry, 2008). More needs to be known about the reasons why students do not complete the orientation course and drop out. In addition, future studies could evaluate the role of gender, age, and marital status, given the uniqueness of the more nontraditional student population.

Achieving FYE Program Objectives

The findings related to student perceptions of the effectiveness of the first-year program do support studies in the literature. Students who participated in first-year interventions and completed orientations have reported positive experiences (Jones, 2013; Pattison, 2004). The FYE students surveyed perceived an impact, as indicated by the subscale scores showing overall agreement, which is important. However, the variation in scores for the first-year experience overall and faculty advisor subscales suggests that the FYE program has room for improvement. One area for improvement could be increased communication with the student. Except for an initial contact by the faculty advisor, no additional proactive communication strategies were required. Research suggests that employing multiple communication strategies may further impact student success and persistence (Maathuis-Smith et al., 2011; Nichols, 2010).

Pre-admission Factors

According to Rovai’s (2003) CPM, more needs to be known about the pre-admission factors of all first-year students, including previous academic performance such as high school GPA, in order to identify academically at-risk students. While not conclusive, the average high school GPA reported by the majority of survey respondents may indicate that many students who successfully completed the FYE orientation course were better academically prepared. Though students generally agreed that the objectives of the FYE were achieved, better prepared students may have succeeded in the first year without a more comprehensive intervention. More open, nonselective institutions like GU tend to attract a higher proportion of underprepared students. These institutions are also more likely to offer online and distance education, which, when compounded with higher noncompletion and dropout rates, may result in increased educational inequality (Xu & Xu, 2019). Thus, it is imperative to use models like Rovai’s (2003) CPM to identify these students before they enroll and to develop appropriate targeted strategies to increase student success.

Conclusion

Higher rates of student dropout in distance education continue to be a problem. Research has demonstrated that persistence in distance learning is a complex phenomenon (Hart, 2012; Rovai, 2003). First-year interventions informed by relevant research like Rovai’s (2003) CPM have the potential to impact student success and increase persistence, but they must be evaluated for impact. This study helps to fill a gap in the literature related to evaluation of first-year interventions in distance learning environments. No significant difference in student success, as measured by persistence (i.e., completion of the orientation course and subsequent re-enrollment in the first year) and first-year student grades, was found when comparing FYE students and non-FYE students. Survey results, however, indicated that FYE students generally agreed that the program’s objectives were achieved. The findings confirm the complexity involved with mitigating student dropout, suggesting that further research and evaluation are necessary.



References

Ali, R., & Leeds, E. M. (2009). The impact of face-to-face orientation on online retention: A pilot study. Online Journal of Distance Learning Administration, 12(4). Retrieved from https://www.westga.edu/~distance/ojdla/

Allen, I. E., Seaman, J., Poulin, R., & Straut, T. T. (2016). Online report card: Tracking online education in the United States. Babson Survey Research Group. Retrieved from http://files.eric.ed.gov/fulltext/ED572777.pdf

Barefoot, B. O. (2000). The first-year experience. About Campus, 4(6), 12-18.

Bean, J. P., & Metzner, B. S. (1985). A conceptual model of nontraditional undergraduate student attrition. Review of Educational Research, 55(4), 485-540.

Beyrer, G. M. D. (2010). Online student success: Making a difference. Journal of Online Learning and Teaching, 6(1). Retrieved from http://jolt.merlot.org/vol6no1/beyrer_0310.htm

Boston, W., Ice, P., & Burgess, M. (2012). Assessing student retention in online learning environments: A longitudinal study. Online Journal of Distance Learning Administration, 15(2). Retrieved from https://www.westga.edu/~distance/ojdla/

Boston, W. E., Ice, P., & Gibson, A. M. (2011). Comprehensive assessment of student retention in online learning environments. Online Journal of Distance Learning Administration, 14(1). Retrieved from https://www.westga.edu/~distance/ojdla/

Brewer, S. A., & Yucedag-Ozcan, A. (2013). Educational persistence: Self-efficacy and topics in a college orientation course. Journal of College Student Retention: Research, Theory & Practice, 14(4), 451-465.

Brown, M., Hughes, H., Keppell, M., Hard, N., & Smith, L. (2015). Stories from students in their first semester of distance learning. International Review of Research in Open and Distributed Learning, 16(4), 1-17.

Cabrera, A. F., Castañeda, M. B., Nora, A., & Hengstler, D. (1992). The convergence between two theories of college persistence. The Journal of Higher Education, 63(2), 143-164. doi:10.2307/1982157

Cauble, D. (2015). Predictors of persistence in online graduate nursing students (Doctoral dissertation). Retrieved from Proquest Dissertations & Theses A&I. (AAT 10023990)

Clay, M. N., Rowland, S., & Packard, A. (2009). Improving undergraduate online retention through gated advisement and redundant communication. Journal of College Student Retention: Research, Theory & Practice, 10(1), 93-102.

Cohen, L., Manion, L., & Morrison, K. (2013). Research methods in education (6th ed.). New York, NY: Routledge.

Creswell, J. W. (2013). Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.). Thousand Oaks, CA: Sage.

Field, A. (2013). Discovering statistics using IBM SPSS Statistics (4th ed.). Thousand Oaks, CA: Sage.

Fink, A. (2017). How to conduct surveys: A step-by-step guide (6th ed.). Thousand Oaks, CA: Sage.

Gering, C. S., Sheppard, D. K., Adams, B. L., Renes, S. L., & Morotti, A. A. (2018). Strengths-based analysis of student success in online courses. Online Learning, 22(3), 55-85.

Glazer, H. R., & Murphy, J. A. (2015). Optimizing success: A model for persistence in online education. American Journal of Distance Education, 29(2), 135–144.

Gliem, J. A., & Gliem, R. R. (2003). Calculating, interpreting, and reporting Cronbach’s alpha reliability coefficient for Likert-type scales. Paper presented at the Midwest Research-to-Practice Conference in Adult, Continuing, and Community Education, The Ohio State University, Columbus, OH. Retrieved from https://scholarworks.iupui.edu/bitstream/handle/1805/344/Gliem%20%26%20Gliem.pdf?sequence=1&isAllowed=y

Goodman, K., & Pascarella, E. T. (2006). First-year seminars increase persistence and retention: A summary of the evidence from how college affects students. Peer Review, 8(3), 26-28.

Hart, C. (2012). Factors associated with student persistence in an online program of study: A review of the literature. Journal of Interactive Online Learning, 11(1), 19-42.

Henry, G. T. (2015). Comparison group designs. In K. E. Newcomer, H. P. Hatry, & J. S. Wholey (Eds.), Handbook of practical program evaluation (4th ed., pp. 137-157). Hoboken, NJ: John Wiley & Sons.

Jaggars, S. S., & Xu, D. (2010). Online learning in the Virginia Community College System. Community College Research Center, Columbia University. Retrieved from http://files.eric.ed.gov/fulltext/ED512396.pdf

James, S., Swan, K., & Daston, C. (2016). Retention, progression and the taking of online courses. Online Learning, 20(2), 75-96.

Jones, K. R. (2013). Developing and implementing a mandatory online student orientation. Journal of Asynchronous Learning Networks, 17(1), 43-45.

Lee, Y., Choi, J., & Kim, T. (2013). Discriminating factors between completers of and dropouts from online learning courses: Dropout factors in online courses. British Journal of Educational Technology, 44(2), 328-337. doi:10.1111/j.1467-8535.2012.01306.x

Maathuis-Smith, S., Wellington, S., Cossham, A., Fields, A., Irvine, J., Welland, S., & Innes, M. (2011). Obtaining high retention and completion rates in a New Zealand ODL environment: A case study of strategies employed by information and library studies faculty at the Open Polytechnic. Journal of Open, Flexible and Distance Learning, 15(1), 31-45.

Mayhew, M. J., Rockenbach, A. N., Bowman, N. A., Seifert, T. A. D., & Wolniak, G. C. (2016). How college affects students: 21st century evidence that higher education works. San Francisco, CA: Jossey-Bass.

Nash, R. D. (2005). Course completion rates among distance learners: Identifying possible methods to improve retention. Online Journal of Distance Learning Administration, 8(4), 1-26.

National Center for Education Statistics. (2018). Table 311.15: Number and percentage of students enrolled in degree-granting postsecondary institutions, by distance education participation, location of student, level of enrollment, and control and level of institution: Fall 2015 and fall 2016. Retrieved from https://nces.ed.gov/programs/digest/d17/tables/dt17_311.15.asp?current=yes

Nichols, M. (2010). Student perceptions of support services and the influence of targeted interventions on retention in distance education. Distance Education, 31(1), 93-113.

Office of Research and Evaluation. (2017). Undergraduate Assessment Plan & Supporting Documentation. Academic Affairs Department, Global University, Springfield, MO.

Office of Research and Evaluation. (2018). Undergraduate Enrollment Report. Academic Affairs Department, Global University, Springfield, MO.

Packham, G., Jones, P., Miller, C., & Thomas, B. (2004). E-learning and retention: Key factors influencing student withdrawal. Education & Training, 46(6/7), 335-342. doi:10.1108/00400910410555240

Park, C. L., Boman, J., Care, W. D., Edwards, M., & Perry, B. (2008). Persistence and attrition: What is being measured? Journal of College Student Retention: Research, Theory & Practice, 10(2), 223-233.

Park, J. H. (2007). Factors related to learner dropout in online learning. Online Submission. Retrieved from https://eric.ed.gov/?id=ED504556

Park, J. H., & Choi, H. J. (2009). Factors influencing adult learners’ decision to drop out or persist in online learning. Journal of Educational Technology & Society, 12(4), 207-217.

Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: A third decade of research. San Francisco, CA: Jossey-Bass.

Pattison, S. A. (2004). The effect of an orientation on distance-program satisfaction. Journal of College Student Retention: Research, Theory & Practice, 5(2), 205-233.

Permzadian, V., & Credé, M. (2016). Do first-year seminars improve college grades and retention? A quantitative review of their overall effectiveness and an examination of moderators of effectiveness. Review of Educational Research, 86(1), 277-316.

Rovai, A. P. (2003). In search of higher persistence rates in distance education online programs. The Internet and Higher Education, 6(1), 1–16. doi:10.1016/S1096-7516(02)00158-6

Shea, P., & Bidjerano, T. (2016). A national study of differences between online and classroom-only community college students in time to first associate degree attainment, transfer, and dropout. Online Learning, 20(3), 14-15.

Swanson, N. M., Vaughan, A. L., & Wilkinson, B. D. (2017). First-year seminars: Supporting male college students’ long-term academic success. Journal of College Student Retention: Research, Theory & Practice, 18(4), 386-400.

Tinto, V. (1975). Dropouts from higher education: A theoretical synthesis of recent research. Review of Educational Research, 45, 89-125.

Tinto, V. (1987). Leaving college: Rethinking the causes and cures of student attrition (1st ed.). Chicago, IL: University of Chicago.

Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition (2nd ed.). Chicago, IL: University of Chicago.

Tinto, V. (2006-2007). Research and practice of student retention: What next? Journal of College Student Retention, 8(1), 1-19.

Tinto, V. (2012). Completing college: Rethinking institutional action. Chicago, IL: University of Chicago Press.

Willging, P. A., & Johnson, S. D. (2009). Factors that influence students' decision to drop out of online courses. Journal of Asynchronous Learning Networks, 13(3), 115-127.

Xu, D., & Jaggars, S. S. (2011). Online and Hybrid Course Enrollment and Performance in Washington State Community and Technical Colleges (CCRC Working Paper No. 31). Community College Research Center, Columbia University. Retrieved from ERIC database. (ED517746)

Xu, D., & Xu, Y. (2019). The promises and limits of online higher education: Understanding how distance education affects access, cost, and quality. American Enterprise Institute. Retrieved from https://www.aei.org/research-products/report/the-promises-and-limits-of-online-higher-education/

York, J. A. (2014). Student attrition in higher education: Development of an instrument to assess attrition factors in distance learning only educational environments. (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses A&I. (AAT 3580438)



Online Journal of Distance Learning Administration, Volume XXII, Number 4, Winter 2019
University of West Georgia, Distance Education Center