Service Quality Assessment of Online Sport Education at a Sport-Specific
Graduate Institute



Steve Chen
Morehead State University

Louis Marciani
The University of Southern Mississippi

Cynthia Ryder
United States Sports Academy

Rosalie Ward
The University of Southern Mississippi

David Allen
University of Central Florida

 

Abstract

This study assessed the service quality of the United States Sports Academy's online course curriculum and examined the factors that affected students' overall satisfaction with their educational experience at the Academy. The results were based on 449 responses (male = 331, female = 118) to the student evaluation survey collected from January 1 to August 31, 2004. The evaluation questionnaire contained 41 fixed-alternative and 5-point Likert-scale items, and each student completed the survey electronically after finishing the last unit assignment. The exploratory factor analysis identified five specific service categories, Course Materials (CM), Instructor Traits (IT), Learning Resources (LR), Student Services (SS), and Overall Rating (OR), which together accounted for 72.3% of the variance. Each factor category yielded a reliability coefficient (Cronbach's alpha) greater than .715. The results indicated that male participants were more satisfied with all of the service categories than female participants. Younger participants and students who had taken fewer courses with the Academy tended to give higher satisfaction ratings. The stepwise regression analysis also confirmed that CM and SS were the two best predictors of the participants' overall satisfaction. Overall, the participants were fairly satisfied with the service quality of the Academy. In conclusion, more attention should be given to the following areas for improving the quality of teaching: (1) fulfilling students' needs based on their characteristics; (2) improving the quality of faculty performance and communication; and (3) facilitating the use of library resources. Further directions for improving service quality and for future research are discussed.

Introduction

Distance learning in conjunction with an online delivery platform has become a rapidly growing trend for graduate and undergraduate students seeking advanced degrees and knowledge (Levine, 2000; Walsch & Reese, 1995). According to Flowers and Cotton (2003), online enrollments at Ball State University grew 300 percent compared to face-to-face enrollments. A national survey in 1999 revealed that 47 percent of the studied institutions offered at least one course online (Council for Higher Education Accreditation, 2001). The State University of New York experienced growth in online courses from eight in 1995-1996 to over 1,500 in 2000-2001 (Shea, Pelz, Fredericksen, & Pickett, 2001).

The benefits of online education, such as flexibility in time and mobility, a wide variety of course selections, access to more current information, and financial advantages, have been identified in research (Kerkman, 2004; Nelson, 2000; Schrecker, 1998; Trotter, 2001). In addition, several studies have examined issues related to the quality of the online learning format, students' satisfaction, and methods for enhancing online learning (Coates, Humphreys, Kane, Vachris, Agarwal, & Day, 2001; Dziuban, Sorg, Cook, & Wang, 2002; Fang, 2003; Hong, Lai, & Holton, 2003; Jaeger, 1998; Jurczyk, Kushner-Benson, & Saver, 2004; Stein, 2004). Despite the rapid growth in popularity and development of web-based distance learning programs, some studies still question the integrity and effectiveness of online learning, citing issues such as the lack of face-to-face interpersonal communication, a cold (isolated) environment, and the appropriateness of the presented course content (Carnevale, 2001; Coates et al., 2001; Trotter, 2001; Wallace, 2000).

In spite of these concerns, online, web-based instruction has been adopted in undergraduate and graduate programs in numerous fields, such as education, business, medicine, and the liberal arts (Davis et al., 2003; Lindsay, 2000; Versel, 2003). Even in sport education, an area that requires a considerable amount of hands-on experience and physical involvement, the popularity of web-based instructional programs has grown rapidly (Bowman, 2003; Chen & Marciani, 2003). At present, examinations of the effectiveness and service quality of online programs are heavily emphasized (Jaeger, 1998; Trotter, 2001, 2002). Because the introduction of web-based instruction and online programs in sport education is still in its early stages, the appropriateness of this delivery method should be investigated.

The purpose of this study was to examine the contributory factors pertaining to students' satisfaction with their overall online educational experience at a sport-specific institute. The research questions focused on two main aspects: (a) How did the students perceive the service quality of this specific institute based on their online learning experience? (b) What concerns must this online institute address to improve its existing educational services?

Method

Subjects

Six hundred full-time distance learning students were surveyed to obtain their level of satisfaction with the service quality of a sport-specific institute. A total of 449 responses were collected during the eight-month data collection period. The surveys were completed by graduate students enrolled in 130 distance learning courses. About 73.7% of the responses (n = 331) were completed by males and 26.3% (n = 118) by females. In terms of respondents' student status, a majority of the responses (87%) were completed by master's degree students; the remaining 13% were completed by doctoral students (9%) and continuing education students (4%). In general, about 79% of the responses came from students in the 21-40 age group who had taken fewer than five online courses.

Instrument

The original survey items were generated by five faculty members who served on the Institutional Effectiveness Committee. The survey was designed to assess the students' satisfaction with the service quality of the institution and the performance of the faculty. Many items were similar to web-based course survey items found in previous research (Hong et al., 2003; Jurczyk et al., 2004; Shea et al., 2001). A series of discussions of the survey items and pre-tests were conducted to arrive at the final version of the questionnaire. The pre-test version of the survey contained a total of 41 questions; the final version contained 34 items after a factor analysis was performed to strengthen the validity of the survey items. The analysis utilized a varimax rotation method with the eigenvalue cutoff set at 1. Six sub-categories were identified: (a) Instructor Traits (IT); (b) Course Materials (CM); (c) Learning Resources (LR); (d) Student Services (SS); (e) Special Services (SPS); and (f) Overall Rating (OR). These categories accounted for a total of 74.3% of the variance. In terms of the consistency of the responses, the reliability coefficients (Cronbach's alpha) of the six categories all exceeded .715. The detailed survey items of the final version and the results of the factor analysis on the six sub-categories are reported in Table 1.
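Although the original analysis was run in SPSS, a minimal Python sketch of the same validation steps, extracting factors with eigenvalues of 1 or above, applying a varimax rotation, and computing Cronbach's alpha for a sub-scale, is shown below. The data file, column names, and the use of the factor_analyzer package are illustrative assumptions, not part of the original study.

```python
# A minimal sketch of the instrument-validation steps described above.
# Assumes the responses sit in a CSV with one column per Likert item;
# the file name and column prefixes are hypothetical.
import pandas as pd
from factor_analyzer import FactorAnalyzer  # pip install factor_analyzer

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

responses = pd.read_csv("survey_responses.csv")  # hypothetical data file

# Step 1: determine how many factors have eigenvalues >= 1.
fa = FactorAnalyzer(rotation=None)
fa.fit(responses)
eigenvalues, _ = fa.get_eigenvalues()
n_factors = int((eigenvalues >= 1).sum())

# Step 2: re-fit with a varimax rotation, as in the study.
fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax")
fa.fit(responses)
print(fa.loadings_)                 # item-by-factor loadings
print(fa.get_factor_variance()[2])  # cumulative proportion of variance explained

# Step 3: reliability of one sub-scale (hypothetical "IT_" column prefix).
it_items = responses[[c for c in responses.columns if c.startswith("IT_")]]
print(f"Cronbach's alpha (Instructor Traits): {cronbach_alpha(it_items):.3f}")
```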

Procedure

The data for this study were based on the 449 online course evaluations collected from January 1, 2004 to August 31, 2004. According to institutional policy, each student was required to complete a mandatory embedded electronic survey upon finishing a course. It is important to note that a student might complete multiple responses if he or she had taken more than one online course; the 449 responses were therefore not completed by 449 different individuals. It was estimated that about 200 individuals completed at least one survey, and no student completed more than four evaluations during the data collection period. In general, the students' responses were collected at the end of each month. The online course evaluations gathered during the eight-month period were compiled and analyzed by the researchers on September 13, 2004. The researchers performed a factor analysis to recheck the validity of the original survey, and several other statistical analyses were performed to draw applicable findings from the available data. The processed information should help the Institutional Effectiveness Committee of this institute improve existing service quality and monitor faculty performance.

Data and Statistical Analysis

The data were analyzed with SPSS 11.0. Statistical methods included descriptive analysis, exploratory factor analysis, simple correlation analysis, t-tests, ANOVA, and stepwise regression analysis. The relevant demographic information was presented in the form of frequencies and percentages. Both t-tests and ANOVA were used to test for differences in factor scores across demographic variables such as gender, age, course experience, and class status. The stepwise regression analysis was used to identify the best predictors of Overall Course Rating among the five identified factors.
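As a rough illustration of these group comparisons, the sketch below runs an independent-samples t-test by gender and a one-way ANOVA across student-status groups on one factor score using SciPy; the data file and column names are hypothetical stand-ins for the SPSS data set.

```python
# A minimal sketch of the t-test and ANOVA comparisons described above;
# the data file and column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("factor_scores.csv")  # one row per response: factor scores + demographics

# Independent-samples t-test: Course Materials (CM) scores by gender.
male = df.loc[df["gender"] == "M", "CM"]
female = df.loc[df["gender"] == "F", "CM"]
t_stat, p_val = stats.ttest_ind(male, female)
print(f"t({len(male) + len(female) - 2}) = {t_stat:.3f}, p = {p_val:.4f}")

# One-way ANOVA: CM scores across student-status groups
# (master's, doctoral, continuing education).
groups = [g["CM"].to_numpy() for _, g in df.groupby("status")]
f_stat, p_val = stats.f_oneway(*groups)
print(f"F = {f_stat:.3f}, p = {p_val:.4f}")
```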

Results

The descriptive analyses showed that the majority of the responses were completed by males (73.7%) and by respondents under 40 years of age (85.7%). The students did not appear to have long-term experience in online learning: 76% of the responses indicated that the respondents had taken fewer than five online courses. Despite this relatively short experience in the online program, 98% of responses (n = 441) indicated the students were competent in handling the technical aspects of online courses. Two hundred eighty-six responses were completed by students who had used the financial aid service; among those, 94% indicated the financial aid staff was accurate and helpful. Among the 180 responses indicating the respondents had sought support from the Help Desk, 90% recorded a satisfactory rating. In general, about 88% of responses showed that the courses had met or exceeded the students' expectations, and nearly 92% indicated the offered courses related well to real-world applications in sport education.

The male respondents' mean scores in five service categories, course materials (1.70), instructor traits (1.81), student services (1.65), learning resources (2.40), and overall rating (1.67), were significantly better (all t-values greater than 4.750; df = 447, p < .01) than those of the female respondents. The findings also showed that younger students and those less experienced in online learning (who had taken fewer online courses) tended to give higher satisfaction ratings. Significant differences (all F-values greater than 2.323; df = 446, p < .05) were found in special services, instructor traits, learning resources, overall rating, and course materials among students at different levels of graduate course work. Doctoral students gave the best ratings for learning resources and instructor traits, whereas continuing education students tended to give lower ratings across these categories. Table 2 presents the categorical mean scores by gender and student status.

The mean score of OR was positively and significantly correlated (p < .01) with CM (r = .763), IT (r = .665), SS (r = .572), SPS (r = .465), and LR (r = .326); however, only CM, IT, and SS showed a fair level of correlational strength with OR. A stepwise regression analysis was performed to identify the best predictors of the participants' overall satisfaction rating. The analysis included CM, IT, SS, SPS, and LR as predictor variables and yielded three acceptable models. All of the models indicated that course materials was the strongest predictor of the overall rating; student services and instructor traits were also accepted as effective predictors. Table 3 presents the regression models depicting the acceptable predictors of the overall course rating.
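SPSS's stepwise procedure has no single canonical open-source equivalent, but a simple forward-selection sketch using statsmodels conveys the idea: at each step, enter the candidate predictor with the lowest p-value until none falls below the entry criterion. The factor abbreviations mirror those above; the data file and the .05 entry threshold are illustrative assumptions.

```python
# A forward-selection sketch approximating the stepwise regression reported
# in Table 3; the data file and entry criterion are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("factor_scores.csv")
candidates = ["CM", "IT", "SS", "SPS", "LR"]  # predictor factors
selected = []

while candidates:
    # Fit one model per remaining candidate and record its p-value.
    pvals = {}
    for var in candidates:
        X = sm.add_constant(df[selected + [var]])
        pvals[var] = sm.OLS(df["OR"], X).fit().pvalues[var]
    best = min(pvals, key=pvals.get)
    if pvals[best] >= 0.05:  # stop when no candidate meets the entry criterion
        break
    selected.append(best)
    candidates.remove(best)
    final = sm.OLS(df["OR"], sm.add_constant(df[selected])).fit()
    print(f"Entered {best} (p = {pvals[best]:.4f}); R-squared = {final.rsquared:.3f}")
```

Note that a true SPSS stepwise run also re-tests already-entered predictors for removal at each step; this sketch performs entry only.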

Discussion, Conclusions and Recommendations

Based on the findings of this study, it is logical to infer that the respondents were fairly satisfied with the service quality of the Academy. A good overall satisfaction rating might explain why a high proportion of the respondents would recommend the program to their friends (> 90%) or planned to take additional distance learning classes with the institution (> 80%). The current findings support the idea that online students are often satisfied with their online courses, with satisfaction rates often reaching 90% or higher (Dziuban et al., 2002; Morgan, 2004; Shea et al., 2001).

In this study, the course materials factor was the most important predictor of the respondents' overall satisfaction. Although Dziuban et al. (2002) identified the faculty's teaching performance as the primary factor influencing students' satisfaction, this study tended to support the viewpoints of Stein (2004) and the New Jersey Institute of Technology (2005). This finding should not be interpreted as minimizing the importance of the faculty's effort and feedback. Rather, readers should understand that well-designed course assignments, discussions, and exam questions, along with a quality textbook, are essential tools for facilitating students' knowledge and learning. For students who do not spend extra effort searching for information and supplementary materials, the textbook and structured course materials are their primary sources of learning. To maintain a high-quality program, the institution should continue to excel in designing meaningful assignments, discussions, and quizzes, and in selecting quality textbooks.

It is understood that this sport-specific institution uses a standardized format for all of its online courses: the assignments, discussions, final examination, and research paper all follow a uniform matrix and rubric. Students who are new to online learning may feel satisfied with the system; however, they may feel bored and unchallenged after encountering the exact same format across multiple courses. This may explain why more experienced students (those who had taken more than five online courses) tended to give lower satisfaction ratings in course materials, instructor traits, and overall rating. The faculty should be allowed more freedom to incorporate different types of assignments and requirements based on the specific needs of each course.

It is also important to recognize that students gave fairly good ratings for special services, student services, and course materials. The convenience and features of online delivery have proven very attractive to students. At the same time, the quality of learning has become an area that some critics attack. A critical concern emerging from the results was the lower rating in the instructor traits category. Ultimately, students enrolled for the purpose of obtaining a quality education, and Dziuban et al. (2002) particularly emphasized the role of faculty teaching performance in students' learning. There is an apparent need for the faculty to improve their teaching performance. Two fundamental areas found to need improvement were maximizing effort in grading student work and communicating with students. Open forum discussions, journal assignments, mandatory phone conversations, and synchronous discussion meetings are all popular means of enhancing student-student and student-faculty interaction. The institution may need to consider applying some of these strategies while keeping the course modules user-friendly.

Since online graduate students are less likely to be present at the campus library, they usually rely heavily on the Internet or other libraries to search for information. Given the lower rating in the learning resources category, it is imperative that the institution apply more effective strategies to encourage students to use the available e-library resources. More visual and audio teaching and supplemental materials from the library might be embedded in the course modules with the help of the Division of Information Technology. Encouraging students to utilize the library resources would help them minimize problems such as not knowing how to locate scholarly references or inadvertently violating intellectual property rights. One suggestion for facilitating use of the library is to embed a tutorial program in the library website to guide students in using the electronic library. Since doctoral students usually had more mandatory research projects to complete, they tended to make better use of the institution's library.

The researchers believe that more attention should be focused on the needs of continuing education students and female students, since both groups gave lower ratings in several service categories. According to Chen and Marciani (2004), online faculty members usually responded to their students relatively quickly if the students initiated the contact; however, the faculty initiated communication with students less frequently. Apparently, the female students did not contact their professors as often as the male students did. It is possible that female students tend to be more sensitive to delayed feedback: when feedback did not return promptly, they may have ceased asking for help. The results indicated that female students did not receive feedback from their professors as often as their male counterparts, a phenomenon that may have lowered their evaluations of faculty-student interaction and of the helpfulness of the professor. Doctoral students were more likely to contact their instructors than any other group of students. The researchers recommend that the Academic Affairs division examine this issue closely. Several studies propose strategies for generating more student-faculty interaction and asynchronous discussion as the key to ensuring the quality of learning (Hong et al., 2003; New Jersey Institute of Technology, 2005; Shea et al., 2001). Faculty members should be encouraged to understand gender differences in responding to students. Perhaps their workload and the distribution of teaching assignments should also be reevaluated to allow them more time to establish effective communication with students.

A similar tendency toward less faculty-student interaction was found among continuing education students. It is difficult to conclude whether faculty members treated continuing education students differently than graduate students. Because most continuing education students are part-time students who are not seeking a degree, faculty members carrying heavy workloads may devote less attention to them. These reasons could logically explain the significant differences among the different groups of students. The important point is that the institution should carefully examine these trends and resolve the problem of insufficient student-faculty interaction.

One important limitation of this study is that an individual might have completed multiple responses during the data collection process. Although each student was supposed to provide feedback on each course separately and objectively, it is possible that a student gave similar responses on multiple surveys. This would tend to inflate the actual satisfaction ratings in certain categories, because the same favorable evaluation from one individual was counted multiple times. It is also difficult to control for a halo effect, since students were exposed to the same survey multiple times. In reality, the current survey administration method could not correctly identify the actual satisfaction rating, because both positive and negative responses could be recorded more than once. Theoretically, students who had an unsatisfactory experience with the institution would be more likely to discontinue their studies; thus, the current results may underrepresent negative ratings and inflate the actual satisfaction rating. The researchers suggest comparing the current results with similar exit-survey data.

The results of this study provide some basic evidence supporting the quality of service that the institution offers. More importantly, the data should not be viewed as a token of success, but as valuable information the institution can use to improve its performance. To assess students' learning and satisfaction more accurately, the researchers offer the following suggestions for future research.

(a) Develop a multi-faceted evaluation method to measure the students' competency and correlate the scores with their satisfaction with the learning experience. Other evaluation components may include examining internship performance, monitoring proficiency exams, and tracking the students' employment history.

(b) Compare the same students' academic performance under two different educational settings (for example, a face-to-face environment vs. an online/web-based environment). This method should help scholars differentiate the students' personal feedback under traditional and non-traditional teaching environments.

(c) Compare the students' responses across different types of programs, such as sport coaching, sport management, and sport medicine. This method would investigate whether students react differently to the nature and requirements of various curricula. Scholars may test whether online learning remains effective for courses that require physical practice and hands-on experience.

(d) Explore more appropriate computer technologies that can be implemented to facilitate students' learning.


References

Altman, E., & Pratt, A. (1998). The JAL guide to the professional literature: Distance education. Journal of Academic Librarianship, 24, 500.

Bowman, D. H. (2003). Internet spawns online physical education. Education Week, 22, 3.

Carnevale, D. (2001). As online education surges, some colleges remain untouched. Chronicle of Higher Education, 47, 41-43.

Chen, S., & Marciani, L. (2004, November). Distance education, online learning and sport education. Paper presented at the 2004 Alabama State AHPERD Fall Convention, Birmingham, AL.

Council for Higher Education Accreditation. (2001). Distance learning: Academic and political challenges for higher education accreditation. Retrieved October 1, 2004, from http://www.chea.org/commentary/distance-learning/chea_dis_learning.pdf

Coates, D., Humphreys, B. R., Kane, J., Vachris, M., Agarwal, R., & Day, E. (2001). "No significant distance" between face to face and online instruction: Evidence from principles of economics. Retrieved October 1, 2004, from http://www.cerge-ei.cz/pdf/events/papers/011101_t.pdf

Davis, J. E., Craft, B., Wojton, G., Major, C., Tureski, C., Welch, A., Eichler, R. C., Doerr, E., Allison, S., & Sharp, T. (2003). Letters. Education Week, 22, 34-36.

Dziuban, C., Sorg, S., Cook, I., & Wang, M. (2002). Implications of web-based learning for student evaluation of university teaching. Unpublished manuscript, University of Central Florida, Orlando.

Fang, Z. (2003). Enhancing the quality of online higher education through measurement. Quality Assurance in Education: An International Perspective, 11, 214-222.

Flowers, J., & Cotton, S. (2003). Master of arts in career and technical education: Now 100% online. Tech Directions, 63, 22-23.

Hong, K. S., Lai, K. W., & Holton, D. (2003). Students' satisfaction and perceived learning with a web-based course. Educational Technology & Society, 6, 116-124.

Jaeger, G. (1998). Online students do better. Retrieved from http://leahi.kcc.hawaii.edu/org/wwwdev/logs/1722.html

Jurczyk, J., Kushner-Benson, S. N., & Saver, J. R. (2004). Measuring student perceptions in web-based courses: A standards-based approach. Online Journal of Distance Learning Administration, 7(4). Retrieved March 1, 2005, from http://www.westga.edu/~distance/ojdla/winter74/jurczyk74.htm

Kerkman, L. (2004). Convenience of online education attracts midcareer students. Chronicle of Philanthropy, 16, 11-13.

Lindsay, E. B. (2000). Webwatch. Library Journal, 125, 32-34.

Levine, A. (2000, March 13). The soul of a new university. New York Times, p. A21.

Morgan, K. (2004). Student satisfaction depends on course structure. Online Classroom, 14, 5.

New Jersey Institute of Technology. (2005). Multimedia course design, instructor preparation increase satisfaction. Online Classroom, 14, 8.

Nelson, T. (2000). The message in the (digital) bottle. Education Week, 20(15), 44.

Oppenheimer, T. (2004, April 25). The digital doctorate [Education life supplement]. The New York Times, pp. 31-33.

Schrecker, E. (1998). Technology and intellectual property: Who's in control? Academe, 84, 13.

Shea, P. J., Pelz, W., Fredericksen, E. E., & Pickett, A. M. (2001). Online teaching as a catalyst for classroom-based instructional transformation. Retrieved March 1, 2005, from http://tlt.suny.edu/Research/Faculty01.doc

Stein, D. (2004). Student satisfaction depends on course structure. Online Classroom, 2, 1, 5.

Sumler, D. E. (2004). Unbundling the campus. University Business, 7, 7-9.

Trotter, A. (2001). Cyber learning at online high. Education Week, 20, 28-34.

Unknown. (1999). Health & physical education online database. New Zealand Physical Educator, 1, 7-8.

Versel, N. (2003). Online education continues growth. Modern Physician, 7, 20.

Walsch, J., & Reese, B. (1995). Distance learning's growing reach. Technological Horizons in Education Journal, 22, 58-62.

Wallace, R. M. (2000). Online learning in higher education: A review of research on interactions among teachers and students. Education, Communication & Information, 3, 241-281.


Table 1.
Survey Items and Analysis of the Major Factors within the Survey Questionnaire

Demographic & Academic Information (6 items): Gender; age; number of courses taken; student status; use of the library materials; use of financial aid; receipt of the course package; and use of the Help Desk.

Sub-categories (5-point Likert-scale items, with 1 = strongly agree, 3 = neutral, and 5 = strongly disagree):

Course Materials (CM): 7 items; 13.5% of variance; Cronbach's alpha = .848. Quality of textbook, assignments, quizzes, and discussions; relevance and value of the research paper; and quality of the final exam or project.

Instructor Traits (IT): 6 items; 22.9% of variance; Cronbach's alpha = .959. Helpfulness of the instructors' advice; promptness of the instructors in responding to students; fairness in grading; student/instructor interaction; and instructors' commitment and professional conduct.

Learning Resources (LR): 2 items; 9.4% of variance; Cronbach's alpha = .798. Helpfulness of the librarians; and quality of the library sources.

Student Services (SS): 5 items; 11.3% of variance; Cronbach's alpha = .897. Helpfulness of the general Student Services staff; staff availability; staff being informative; ease of registration; and accuracy of the information.

Special Services (SPS): 2 items; 7.8% of variance; Cronbach's alpha = .715. Helpfulness of the financial aid officer and the Help Desk.

Overall Rating (OR): 2 items; 9.9% of variance; Cronbach's alpha = .866. Relevance of the course; and meeting or exceeding expectations.

Fixed alternatives (Yes, No, or Maybe; 3 items): Recommending the program to others; learned as much online as via traditional delivery; and willingness to take additional online courses.

Open-ended question (1 item): Please indicate any problems or issues that the Administration Office should be aware of concerning your learning experience.

Total: 34 items.

Table 2.
Comparison of Categorical Means by Gender and Student Status (Mean and SD)

Group                               CM          IT           LR          SS          SPS         OR
Male (n = 331)                      1.70 (.52)  1.81 (.83)   2.40 (.85)  1.65 (.56)  1.53 (.49)  1.67 (.63)
Female (n = 118)                    2.14 (.57)  2.30 (1.00)  2.94 (.87)  2.11 (.70)  1.73 (.72)  2.10 (.78)
Master's students (n = 390)         1.79 (.55)  1.93 (.89)   2.58 (.84)  1.76 (.62)  1.57 (.50)  1.74 (.66)
Doctoral students (n = 40)          1.85 (.57)  1.86 (.98)   1.85 (.86)  1.66 (.60)  1.40 (.62)  1.95 (.90)
Continuing Ed. students (n = 19)    2.38 (.61)  2.37 (.95)   3.34 (.97)  2.33 (.76)  2.00 (N/A)  2.23 (.71)
Overall                             1.82 (.60)  1.94 (.90)   2.58 (.89)  1.76 (.63)  1.55 (.51)  1.78 (.69)

Table 3.
Summary of Stepwise Regression Analysis for Factors Predicting Overall Course Rating

Variable              B       SE      Beta     t
Model 1
  (Constant)          .387    .136             2.812**
  Course Materials    .744    .086    .650     8.677**
Model 2
  (Constant)          .183    .142             1.290
  Course Materials    .510    .106    .445     4.797**
  Student Services    .393    .114    .319     3.434**
Model 3
  (Constant)          .175    .140             1.253
  Course Materials    .443    .109    .387     4.056**
  Student Services    .307    .120    .249     2.566**
  Instructor Traits   .140    .067    .188     2.105**

** p < .01
Note: B = unstandardized coefficient; SE = standard error; Beta = standardized coefficient.

Online Journal of Distance Learning Administration, Volume IX, Number II, Summer 2006
University of West Georgia, Distance Education Center