Upgrading or Replacing Your Learning Management System:
Implications for Student Support



Donna Petherbridge
Director of Instructional Services
Distance Education and Learning Technology Applications
North Carolina State University

Diane Chapman
Teaching Assistant Professor
Adult and Higher Education
North Carolina State University



Abstract

Changing your Learning Management System (LMS) presents challenges not only to instructional faculty, but also to the students who depend on these systems to deliver or support their courses. In assessing how students experienced an upgrade of the campus LMS, a number of findings emerged with real implications for developing a strategy to support students undergoing the transition. This paper discusses selected student assessment components of an LMS implementation and the implications of the findings for student support.

Introduction

Learning Management Systems (LMSs) are becoming hubs of instructional technology for many types of institutions of higher learning. According to a 2004 ECAR study, 90% of a given institution's students will have some experience with an LMS (Kvavik, Caruso & Morgan, 2004), and Coates (2005) indicates that “recent estimates suggest that in many countries, about three quarters of institutions have an LMS” (p. 66). As LMSs evolve, adding new features and forming new partnerships in a dynamic market, institutions will inevitably face the need to change, upgrade, or newly install an LMS. The introduction of a new LMS on campus, or an upgrade from an older to a newer (and often quite different) LMS version, presents novel challenges to the entire higher education organization undergoing the adoption (Morgan, 2003). At North Carolina State University (NCSU), an upgrade from the current LMS (WebCT Campus Edition) to a newer version (WebCT Vista) revealed that not only faculty but also students need support during this type of transition. To identify the problems, needs, and issues associated with an LMS upgrade, an assessment was undertaken to understand the implications of the transition for student support. The findings of this study can be used to inform other institutions as they face similar situations.

Learning Management Systems

An LMS (sometimes known in higher education as a Course Management System, or CMS) is a Web-based software system that can assist in planning, implementing, and assessing the learning process, allowing students access to learning independent of place and often independent of time (Chapman, 2005; Nichani, 2001). Chapman (2005) notes that a consistent literature definition for an LMS is elusive, possibly because the LMS in use may have been developed for the nuances of a specific organization, purchased from a vendor, or be some combination of both. Additionally, the issue is confusing because of similar systems appearing on the market that manage learning objects, offer publishing functions, and provide space for virtual learning communities (Chapman, 2005).

While cost and pricing structures vary, LMSs are among the most expensive purchases an institution may make to support e-learning, including the direct cost of the products themselves and the indirect costs related to the maintenance, training, and support of these systems (Harrington, Gordon & Schibik, 2004; Paulsen, 2002). Inevitably, organizations will migrate to the LMSs that give them the best perceived competitive advantage in their e-learning processes within their organizational budget constraints (Barron, 2000; Coates, 2005).

Changing Nature of the LMS Market

As with many emerging technologies and products, the LMS market continues to transform and adapt. From 2000 through 2002, half of the vendors in the LMS software business disappeared (Dobbs, 2002). With smaller companies driving the early market, the current trend seems to be toward consolidation and integration (Hall, 2003). Companies continue to unite, forming relationships that capture a large part of the LMS market (Cheal, Cummings, Fernandes, & Penney, 2006). A sure sign of this was the 2006 merger of two LMS giants, Blackboard and WebCT. The newer LMS products are more comprehensive and many can administer a virtual campus, manage the learning process, and integrate with legacy administrative systems.

It is rare to find courseware created in one LMS that can be easily moved to and used in another LMS. Consolidation efforts such as the Blackboard/WebCT merger may address some concerns about software compatibility, but may also stifle competition. Much of the software is considered proprietary, and vendors resist the notion of open source code (Chapman, 2005). Proponents of open-source coding are striving for standardization that can help protect organizations when vendors go out of business (Smith, 1998).

As technologies change and online teaching matures, campuses and schools must go through the process of evaluating available LMS vendors and their products every few years. A survey conducted by Bersin & Associates noted that LMS users may find that their current LMS does not meet their needs (Howard, 2004). The market is also seeing a rise in LMS adoption: in the past, only the largest organizations saw the need and had the funding for such systems, but lately smaller organizations have been adopting them as well. Chapman (2002) found that 44% of companies with fewer than 1,000 employees have an LMS. The market has tremendous growth potential when the speed of technological change and rising adoption rates are taken into account. Barbian (2002) reported that 24% of companies with two to three years of LMS usage said they would be switching vendors within a year's time, and almost half of the respondents with 10,000 employees or more said they would purchase an LMS within 12 months.

Student Use of LMSs

An initial glimpse of LMS usage may give the impression that LMSs primarily support distance education via e-learning, as many LMSs were initially developed in the context of supporting online courses and programs (Harrington, Gordon & Schibik, 2004). Though formal distance education (DE) courses in the United States have been offered in a variety of formats since the late 1800s, the Internet is now the dominant DE delivery mechanism, and one that is often preferred by students (Howell, Williams & Lindsay, 2003). As a specific example of the growth of Internet use for DE courses, the Internet has rapidly become the predominant medium for DE instruction at NC State, growing from around four thousand student credit hours in 2000 to over eighteen thousand student credit hours in 2005. While other modes of DE delivery exist, such as cable, DVDs, and video, the use of these modes tends to remain stable from year to year, while the use of the Internet has seen double-digit percentage increases over the past five years at NCSU. As the Internet is now the DE medium of choice, LMSs are becoming the delivery mechanisms of choice.

The use of LMSs to support DE exclusively appears to be changing quickly as online learning technology is being used to enhance on-campus education (Coates, 2005). For example, in a 2004 national survey of academic department chairpersons, only 25% of respondents indicated that the primary use of their LMS was to support web-based distance education courses, while 44% indicated that their institution's LMS was being used primarily to support traditional face-to-face courses, and around one-third indicated that the LMS was used for hybrid courses (Harrington, Gordon & Schibik, 2004). Additionally, Bates & Poole (2003) indicate that WebCT estimates that 80% of its LMS users are enhancing their classroom teaching as opposed to teaching DE courses. According to Maeroff (2002), hybrid courses, which combine the features of classroom and online courses by reducing face-to-face class meetings in favor of interacting part of the time online, will become the rule rather than the exception in higher education.

As institutions realize that LMSs can potentially add value to how students engage with their studies, higher education institutions will continue to see more of a convergence between online and face-to-face approaches through a hybrid model (Bates & Poole, 2003). As Coates (2005) notes, while institutions may have initially purchased LMSs to increase their competitive advantage in the DE arena, drawing in new student populations and increasing learner access through their DE courses, the challenge institutions now face is not financial or technological, but educational. No matter the mode that an LMS supports, whether a face-to-face or a DE course, or some combination in between, institutions must now focus on how LMSs promote student learning and engagement (Coates, 2005; Oliver, 2001; Weigel, 2005).

Moving to WebCT Vista at NC State

In 1999, NCSU held its first formal assessment of LMSs for campus use. This resulted in the selection of WebCT Campus Edition (WebCT CE) for use in conjunction with a proprietary campus system known as WolfWare. Through the investment in and support of these systems, an increasing number of NCSU courses and materials were put online as Internet-delivered courses experienced double-digit percentage growth and became the predominant DE delivery mechanism at NCSU (Figure 1.1).

Figure 1.1 Distance Education SCH Delivery Methods (Swanson, 2005).

Correspondingly, there was an increase in the number of students collaborating in web-based environments. By 2003, several key factors merited another campus-wide LMS discussion, including an increasing interest in the use of these types of systems, continued growth in the number of classes depending on LMSs as a delivery platform, and significant changes in the dynamics of the LMS vendor market since the previous review. Over a period of several months, faculty, staff, students, and others in the campus community participated in vendor presentations and discussions as part of the selection process, resulting in significant campus input. As a result, WebCT Vista was selected to meet NCSU's instructional technology needs and complement the proprietary (WolfWare) system.

Implementing the New LMS

Pilot Phases I & II

A pilot program to phase in the new LMS was scheduled to begin in fall semester 2004. Phase I included six sections and approximately 70 graduate students from the College of Education (CED), including 20 graduate students who were enrolled as members of an online cohort. Phase II, during spring 2005, included the original participants from Phase I, sections of face-to-face and online undergraduate Chemistry courses, and one section of a face-to-face undergraduate Educational Psychology class. Approximately 500 students participated in Phase II of the pilot. During both pilot phases, an optional, face-to-face LMS orientation was made available for students participating in the pilot. Additionally, throughout the semester, the students were supported via email communications from a help desk and a Web site devoted to the new LMS, WebCT Vista.

Methodology

The overarching assessment goal for the initial pilot phases (fall 2004 and spring 2005) was to gather information to better inform training and support services for WebCT Vista. The assessment would help support personnel better understand the demographics of the student users and their experience with both the product and the adequacy of support. To obtain this information, several online surveys were given to the students: one in December 2004 for the fall graduate CED pilot group (n=70), and a second set in May 2005 for the spring pilot group (n=502), consisting of one survey for returning graduate CED pilot participants and a similar survey for new undergraduate participants.

During both the fall and spring pilots, a number of key questions were investigated:

•  Were there differences in the overall satisfaction with using WebCT Vista as the medium for the course between those who had used WebCT CE previously and those who had never used WebCT CE for both fall 2004 and spring 2005 pilot participants?

•  Were there differences in the overall satisfaction with using WebCT Vista as the medium for the course between those who attended face-to-face training and those who did not attend for fall 2004 pilot participants?

•  Were there differences in overall satisfaction with using WebCT Vista as the medium for the course between those enrolled in the Training and Development (CED) cohort (i.e., the completely online students) and those who were not members of the CED cohort for fall 2004 pilot participants?

•  Were there differences in overall satisfaction with using WebCT Vista between the spring 2005 undergraduate Chemistry and Educational Psychology students and the graduate CED students?

Instrumentation

An assessment committee, composed of instructional technologists, assessment experts, and selected faculty members, was formed in support of the LMS assessment. Working closely with the various constituents involved, the assessment committee developed surveys designed to understand student demographics and to measure student satisfaction with the technical support provided during the pilot implementation, as well as students' attitudes toward using WebCT Vista. Additional information about survey development, including copies of the instruments, can be found at: http://delta2.ncsu.edu/slic/slic_subcommittees/assessment/.

Results

Demographics

During Phase I (fall 2004), the participants were CED graduate students. Phase II (spring 2005) included both CED graduate students and Chemistry and Educational Psychology undergraduate students.

Fall 2004, Phase I
Out of 70 pilot graduate students in fall 2004, 42 responded to the survey, for a 60% response rate. Demographically, the majority of respondents were female (69%), mostly between 31 and 40 years of age (41%), and taking two classes (50%). The most common range of time students dedicated to engaging with course materials was between 5 and 12 hours per week (60% of respondents). Additionally, respondents most commonly reported being logged into WebCT Vista working on the online components of the course between 1 and 4 hours per week (47%), from home (83%), relying on Internet Explorer as their browser (86%). Graduate respondents indicated they had good or satisfactory computer skills in general, and all indicated they had the necessary skills required to successfully complete a DE course.

Spring 2005, Phase II

Of the 81 pilot CED graduate students in spring 2005, 34 responded to the survey, for a response rate of 41%, whereas only 52 of the 421 undergraduate Chemistry and Educational Psychology students responded to the survey, for a 12.4% response rate. While one-half of the graduate students were over 30 years old, the median age for the undergraduate students was 20. Similar to the fall graduate participants, the majority of spring graduate participants (around 64%) spent between 5 and 12 hours per week engaging with course materials, primarily spending between 5 and 8 hours per week logged into WebCT Vista (around 39%), with a number of these students (around 27%) spending 9 or more hours logged in. The Chemistry and Educational Psychology undergraduate respondents tended to spend less time actually logged into WebCT Vista, with 75% logged in four hours or less per week; the exception was the DE Chemistry undergraduate students, 80% of whom were logged into WebCT Vista more than five hours per week.

The spring undergraduate students primarily logged into their online course from home/dorms (59%), or other places on campus (40%). Around 92% of the undergraduate students were PC users, and while the majority used Internet Explorer (73%), these students were three times more likely to use Mozilla than were the graduate students.

The undergraduate respondents indicated they had mostly excellent (40%) or good (50%) overall computing skills, compared to the fall 2004 CED graduate students, who perceived their skills to be mostly good (88%) or satisfactory (12%). About half of the undergraduate respondents (48%) rated their overall skill for meeting the technology requirements of a DE course as excellent, with the remainder indicating their skills were either good (46%) or satisfactory (around 6%). Interestingly, a larger percentage of the graduate students rated their overall skill for meeting the technology requirements of a DE course as excellent (around 62%). This could be for several reasons, including the fact that they were already taking DE courses and because there is some evidence that non-traditional, adult learners often do better in DE courses than traditional students, as they are more self-directed and self-motivated.

Technical Support Experiences

Using data from both the fall 2004 and spring 2005 surveys, staff involved in the study of the WebCT Vista pilot sought to better understand the experiences of the students related to technical support, which included a help desk email address and an FAQ website. To understand their experiences, students were asked about the quality and responsiveness of the technical support available.

Fall 2004

Less than half of the graduate students responding to the fall survey (42%) used the technical support available. While 50% of respondents found the technical support average or above, there appeared to be a number of users, especially prior users of WebCT CE, who were having difficulty navigating the transition from WebCT CE to WebCT Vista. The lack of immediacy in accessing WebCT Vista technical support (e.g., the unavailability of immediate help after hours and the absence of a published help phone number) resulted in about 20% of the fall pilot CED graduate students being dissatisfied with the support available, finding the responsiveness and quality of support below average. Some of the comments made about WebCT Vista technical support noted perceived inadequacies such as slow response times and a need for a published phone number, as opposed to only access to an email help desk and an FAQ-style website.

Spring 2005

Fifteen percent of undergraduate students and 58% of CED graduate students used the various support services available. The majority of all students (70%) rated the email support as average or higher. The email support service was rated much higher by CED graduate students in the spring than they had rated it in the fall. CED graduate students also rated the website and other support services generally higher than they did in the fall, noting an overall improvement in their perceptions of the technical services provided to this group during spring 2005. Interestingly, the undergraduate students who rated the support services had very little consistency in their ratings (e.g., ratings ranged from excellent to poor with little agreement on any support dimension measured), perhaps indicating that, overall, the undergraduates who received technical support were less satisfied with it than were the graduate students. One common complaint that did arise regarding technical support was the scheduled WebCT Vista maintenance outages on Mondays, as students were concerned about the downtimes impacting their work.

Differences in the overall satisfaction with WebCT Vista and WebCT CE

Using data from both the fall 2004 and spring 2005 surveys, staff involved in the study of the WebCT Vista pilot sought to answer a number of questions about user satisfaction with WebCT Vista. Students were asked to compare WebCT Vista to other LMSs they had used in order to better understand their satisfaction with this LMS.

Fall 2004

About half of the fall 2004 CED graduate survey respondents had used earlier versions of WebCT. About half of the respondents found WebCT Vista easier to use than WebCT CE, while 32% found it harder and 18% indicated there was no difference. Compared to other LMSs used, 44% found WebCT Vista easier to use, 32% found it harder, and 24% indicated there was no difference.

Spring 2005

Prior to their experiences with WebCT Vista, a majority of both the CED graduate students and the Chemistry and Educational Psychology undergraduate students had experience using other types of LMSs, including WebCT CE, Blackboard, and WolfWare (NCSU's homegrown LMS). 56% of undergraduates and 50% of graduate respondents indicated WebCT Vista was somewhat or much easier to use than other versions of WebCT. There was a 13% decrease in the number of graduate users who found WebCT Vista harder to use, compared to the fall 2004 data, perhaps indicating that graduate users were becoming more familiar with the tool.

Using data from the fall 2004 survey to investigate the differences in overall satisfaction with using WebCT Vista as the medium for the course between those who had used WebCT CE previously and those who had not, the CED graduate students were placed into two groups: those with prior experience in WebCT CE or earlier versions of WebCT, and those who had not used WebCT CE (this group included those who had used other LMSs and some who had no LMS experience). An analysis of variance between these two groups indicated statistically significant differences in overall satisfaction (α = .05, p = .05) between those with prior WebCT CE experience and those with no prior WebCT CE experience (Table 1.1). In examining the means of these groups, the prior WebCT CE experience group had a higher mean (1.86) than those with no prior experience (1.35). The higher the mean score in this case, the more negative the overall satisfaction with WebCT Vista, implying that those with prior WebCT CE experience were less satisfied with WebCT Vista than those without prior WebCT CE experience.

Table 1.1. ANOVA. Fall 2004 overall satisfaction with using WebCT Vista as the medium for this course

                    Sum of Squares    df    Mean Square        F     Sig.
Between Groups               2.764     1          2.764    4.073     .050
Within Groups               27.141    40           .679
Total                       29.905    41
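
For readers who wish to see how this kind of two-group comparison can be carried out, the following is a minimal sketch in Python of a one-way ANOVA on overall-satisfaction ratings. The ratings shown are hypothetical placeholder values, not the study's survey data, and the availability of the scipy library is assumed; as the study notes, a higher mean score indicates more negative overall satisfaction.

    from scipy import stats

    # Hypothetical satisfaction ratings (higher = more negative satisfaction),
    # standing in for the survey's overall-satisfaction item; not the actual data.
    prior_ce = [2, 3, 2, 1, 2, 2, 3, 2, 1, 2]      # students with prior WebCT CE experience
    no_prior_ce = [1, 1, 2, 1, 1, 2, 1, 1, 1, 2]   # students with no prior WebCT CE experience

    # Group means: a higher mean indicates more negative overall satisfaction.
    print("Mean (prior CE):", sum(prior_ce) / len(prior_ce))
    print("Mean (no prior CE):", sum(no_prior_ce) / len(no_prior_ce))

    # One-way ANOVA; with only two groups this is equivalent to an
    # independent-samples t-test.
    f_stat, p_value = stats.f_oneway(prior_ce, no_prior_ce)
    print(f"F = {f_stat:.3f}, p = {p_value:.3f}")

    # A p-value at or below the .05 alpha level would be interpreted, as in
    # Table 1.1, as a statistically significant difference in satisfaction
    # between the two groups.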

Using data from the spring 2005 survey, when investigating the differences in overall satisfaction with using WebCT Vista as the medium for the course between those who had used WebCT CE previously and those who had not, the spring pilot Chemistry and Educational Psychology undergraduate students were also placed into two groups: those with any prior experience in WebCT CE or earlier versions of WebCT, and those who had not used WebCT CE. An analysis of variance between these two groups indicated statistically significant differences (α = .05, p = .023) between those with prior WebCT CE experience and those with no prior WebCT CE experience (Table 1.2). In examining the means of these groups, the prior WebCT CE experience group had a higher mean (1.96) than those with no prior experience (1.44). The higher the mean score in this case, the more negative the overall satisfaction with WebCT Vista, implying that those with prior WebCT CE experience were less satisfied with WebCT Vista than those without prior WebCT CE experience.

Table 1.2. ANOVA. Overall satisfaction with using WebCT Vista as the medium for this course

                    Sum of Squares    df    Mean Square        F     Sig.
Between Groups               3.068     1          3.068    5.554     .023
Within Groups               25.411    46           .552
Total                       28.479    47

Overall Satisfaction Summary

Spring pilot Chemistry and Educational Psychology undergraduate students with prior WebCT CE experience were less satisfied with WebCT Vista than those who had not used WebCT CE (α = .05, p = .023). These findings directly supported the findings from the fall 2004 pilot, which indicated that graduate students who had previously used WebCT CE were less satisfied when first encountering WebCT Vista, and that prior use of and comfort with WebCT CE made transitioning into WebCT Vista's different interface and environment somewhat challenging. Even though fall 2004 pilot participants were graduate students and spring 2005 participants were undergraduates, it is clear that regardless of student population, the adoption process appeared to be easier for students who had not used WebCT before.

Differences in satisfaction with WebCT Vista between those who attended face-to-face training and those who did not attend

During both phases of the WebCT Vista pilot program, an optional, face-to-face training session was made available only to the graduate students. Using data from the fall 2004 survey to investigate the differences in overall satisfaction with using WebCT Vista as the medium for the course between those who attended the optional face-to-face training and those who did not, fall 2004 pilot participants were divided into two groups: those who attended and those who did not attend. An analysis of variance between these two groups indicated no statistically significant differences (α = .05, p = .11) in overall satisfaction with using WebCT Vista as the medium for the course between those who attended the available training session and those who did not (see Table 1.3).

Table 1.3. ANOVA. Overall satisfaction with using WebCT Vista as the medium for this course

                    Sum of Squares    df    Mean Square        F     Sig.
Between Groups               1.905     1          1.905    2.721     .107
Within Groups               28.000    40           .700
Total                       29.905    41

Training Participation Summary

Though no statistically significant differences were noted between those who attended training and those who did not, a number of participants throughout the pilot program, including graduate and undergraduate students involved in both the fall and spring pilot phases, indicated that training should be optional, and available in both online and face-to-face formats, for any student who wished to participate.

Differences in satisfaction with using WebCT Vista between those enrolled in the graduate CED cohort (i.e., the completely online students) and those who were not members of the CED cohort

Using data from the fall 2004 survey, a question analyzed during the fall phase of the pilot sought to understand the differences in overall satisfaction with using WebCT Vista as the medium for the course between those enrolled in the graduate CED cohort (i.e., the completely online students) and those who were not members of the CED cohort. An analysis of variance between these two groups indicated statistically significant differences (α = .05, p = .007) in overall satisfaction with using WebCT Vista as the medium for the course between those who were members of the graduate CED cohort and those who were not (Table 1.4). The higher mean for CED cohort members implied that the students who depended on the LMS to deliver their entire degree program online were less satisfied with WebCT Vista than those who were not members of this cohort.

Table 1.4. ANOVA. Fall 2004 overall satisfaction with using WebCT Vista as the medium for the course

                    Sum of Squares    df    Mean Square        F     Sig.
Between Groups               5.035     1          5.035    8.099     .007
Within Groups               24.870    40           .622
Total                       29.905    41

Cohort Participation Summary

CED cohort members were clearly more negative toward WebCT Vista than those who were not members of the CED cohort. For students who depended primarily on WebCT CE as their delivery mechanism, and who in this case had used WebCT CE for at least a year of their program, the transition was more stressful because they had already invested heavily in learning WebCT CE to complete their degree program. This implies that during an LMS transition, students in online degree programs may be the most frustrated of all.

Differences in satisfaction with using WebCT Vista between undergraduate students and the graduate students

The staff involved in the pilot study of WebCT Vista had experienced some negativity about WebCT Vista from the fall 2004 CED graduate students, as evidenced by one-third of these participants indicating that WebCT Vista was more difficult to use than WebCT CE, and by over 150 help calls recorded at the WebCT Vista help desk during the fall pilot alone. As a result, the support staff members were interested in knowing whether there were differences in satisfaction between the graduate and undergraduate users. Using data from the spring 2005 survey, the overall attitude toward using WebCT Vista as the platform for the course indicated a statistically significant difference (α = .05, p = .004) between the Chemistry and Educational Psychology undergraduate students and the CED graduate students (see Table 1.5). The graduate students (who had one semester's experience in using the LMS) were more positive than the undergraduates (who were new to the LMS) about using WebCT Vista as the course platform.

Table 1.5. T-test. Satisfaction with using WebCT Vista as the course platform

CED Vista_as_course_platform - UG Vista_as_course_platform
Paired Differences: Mean = .500, Std. Deviation = .839, Std. Error Mean = .159
95% Confidence Interval of the Difference: Lower = .175, Upper = .825
t = 3.154, df = 27, Sig. (2-tailed) = .004
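
As an illustration of the paired comparison reported in Table 1.5, the sketch below runs a paired-samples t-test in Python. The paired ratings are hypothetical placeholder values rather than the study's data, and the scipy library is again assumed to be available.

    from scipy import stats

    # Hypothetical paired satisfaction ratings; placeholders standing in for the
    # matched CED graduate and undergraduate responses analyzed in Table 1.5.
    ced_ratings = [1, 2, 2, 1, 2, 1, 2, 2, 1, 2]   # graduate (CED) ratings
    ug_ratings = [2, 2, 3, 2, 2, 2, 3, 2, 2, 3]    # undergraduate ratings

    # Paired-samples t-test on the differences between the matched ratings.
    t_stat, p_value = stats.ttest_rel(ced_ratings, ug_ratings)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")

    # A p-value below the .05 alpha level would indicate a statistically
    # significant difference between the paired ratings, as reported in Table 1.5.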

Satisfaction of Graduates and Undergraduates Summary

This finding was initially surprising to the staff supporting the WebCT Vista implementation, as the graduate students from the fall 2004 pilot had been quite vocal in voicing their frustrations with changing LMSs. The support staff were incorrect in their initial hypothesis that the graduate students would be more negative than the undergraduate students during the spring phase of the pilot. Because the graduate students were more satisfied than the undergraduates during Phase II of the pilot, it was concluded that the CED graduate students, having used WebCT Vista during fall 2004 and having grown more comfortable with it, were more positive about using this LMS. The undergraduates, on the other hand, were being exposed to the new LMS for the first time, and were thus more negative. Overall, it appeared that as students gained more experience using WebCT Vista, they became more satisfied with and more positive toward the LMS.

Discussion: Implications for Student Support

In planning for a campus-wide transition to a new LMS, adequate resources need to be available in order to support students transitioning to the new LMS. Institutions will need to consider a variety of student support issues, including accessible (and well-trained) technical support staff, available training for students, expanded support models, and proactive communication with students.

Adequate Training for Technical Staff

While the help desk might be open for students, it is only as effective as the training and knowledge base of the people who staff it. Staff who will be supporting students and faculty members in transitioning to a new LMS must be adequately trained and have access to a knowledge or solution bank so that they can respond to questions quickly, accurately, and with a supportive attitude. Several technical staff members who know the new system quite well may need to serve as technical leads, modeling responses to calls and monitoring, and occasionally auditing, the calls of part-time or less knowledgeable staff members to ensure that responses are accurate and timely.

A number of the same types of questions are asked repeatedly during an LMS implementation (e.g., What kind of browser do I need to use so that the system will work? Where can I download the latest version of Java?). As these questions are asked, technical staff should build an FAQ list and make it available to all users of the system. A list of (and responses to) common problems may also be made available to help support users (e.g., see http://vista.ncsu.edu/help/index.php). Troubleshooting tips can be created explaining how to access the system and how to use each tool. Technical staff should know how to quickly access and apply these responses to user problems as they come into the help desk.

In addition to understanding how to use the new LMS and troubleshoot technical issues, technical staff must be trained to be empathetic toward users of the new system, as one user indicated in the following comment:

•  The kind of support is adequate, but the technical staff must take the requests seriously. None of us are computer idiots; we do know something about how to read and use a discussion board after all this time (nearly two years) of using one almost daily.

As was learned, technical staff should not assume that a new LMS will be easier to use, at least initially, for a particular population of students (a mistake made here in assuming that the undergraduate students would adapt to the system more quickly than the non-traditional graduate students). Changing your LMS is a change management issue, and the reality is that most users will need some time to get used to a new system during the initial adoption process. During the implementation, as students gained more experience using Vista, they liked it better: the same graduate students who were dissatisfied in the fall were more satisfied than the new undergraduate users in the spring (α = .05, p = .004), and a subsequent fall 2005 survey indicated that as students continued to use WebCT Vista, they liked it as well as or better than WebCT CE, with more than half of students believing that WebCT Vista positively supported their learning in their courses. As students continue to use the new LMS to support their learning, the help desk staff must be ready and willing to provide timely support. To ensure that students feel supported by the help desk, technical staff must answer every question with care, concern, and, as the earlier student comment indicates, with seriousness.

Adequate, Available Training for Students

Adequate training for students should be considered as part of the implementation process. While there was no statistically significant difference in satisfaction with using WebCT Vista for students who had attended training (α = .05, p = .11), participation in the training did appear to positively impact student comfort levels, based on the qualitative comments reviewed from those who participated. Student feedback was overwhelmingly clear that training, both online and face-to-face, should be available, though not required, for all students (graduate and undergraduate) who are beginning to use a new LMS. Following are some comments from the pilot participants regarding training:

•  When I attended the face-to-face training in the fall of 2004, it was extremely helpful. I started the program in the summer and did not have any prior training.

•  I think I benefited from the face-to-face instruction, however most DE students would probably take the training online.

•  Training given in August of 2004 was appropriate for the use of WebCT. It answered all questions. I think f2f would be best but I could also imagine an online training system that could be appropriate.

•  Online training should be sufficient, but face-to-face would be a good option for those who do not feel comfortable with new technology

Additionally, the training sessions should be well-advertised, lest students miss out on training opportunities, as noted by one of the pilot participants:

•  If you do have a training course for WebCT, make sure it is clearly advertised. I was never aware of the previous training course offered.

For institutions transitioning to new LMSs, training, both online and face-to-face, should be offered to both faculty and students who will be using the new LMS. While training should not necessarily be required for either students or faculty, it should be strongly encouraged in order to support the effective use of and ease of transition to the new LMS.

Expanded Support Models

According to Eastmond (2000), student success in web-based courses depends on both the course design and the support services provided by the institution, and this is particularly true of part-time, working adult students, who tend to “succeed on a course-by-course basis” (p. 345). Online students, mainly working adults, tend to be “most active at times when traditional college offices are closed” (Young, 2000, p. A49), and this can be an issue when the technical help desk is only available between the traditional hours of 8am and 5pm. The lack of effective technical support when students need it can be a critical service gap, even causing some students to drop out of continuing education entirely (Lorenzetti, 2003).

A criticism from surveyed students who were taking only web-based distance education courses was the lack of email and telephone support outside of traditional business hours. This contributed notably to the dissatisfaction of the graduate CED online cohort members, who were less satisfied with using WebCT Vista than graduate CED students who were not members of the cohort (α = .05, p = .007). Additionally, there was some confusion about which campus entity to approach for support: the faculty member teaching the class, the main Information Technology help desk at NCSU, or the DELTA help desk staff members who were initially supporting the pilot implementation. Selected comments from students surveyed regarding support included:

•  We need a help desk via phone where help can be accessed immediately at time of problem

•  An absolute contact number, not just email, is needed in the event of a problem

•  Toll free phone numbers

•  Spell out who/where to go for tech support

•  I'd like for technical support to have evening and weekend hours, but I know the budget shortfalls impact the progress made with a more flexible schedule.

•  Technical support should have a discussion area so that students can post questions for all to see. This area should be monitored constantly and questions should be responded to in a reasonable amount of time.

In order to minimize student frustration, especially for non-traditional, distance education students transitioning to a new LMS, there is a need to support student learning outside of traditional college business hours. In order to provide the just-in-time support that students may need, institutions should consider expanding their technical support hours, in a clearly advertised way, beyond the traditional 8am – 5pm business hours model (e.g., expanded weekday hours from 8am – 8pm, and selected weekend hours, perhaps Saturday and Sunday 1 – 5pm). While this can present both fiscal and staffing challenges, creative staffing models (e.g., allowing existing staff to rotate schedules to fit the expanded hours), partnerships with other universities (e.g., sharing help duties for the same LMS), or, minimally, ensuring adequate staffing of the help desk during times when call volumes notably increase (e.g., the start of the semester) may help ease the frustration of the non-traditional, working student. As Young (2000) notes, students are pressuring universities, especially those with distance education programs, to be as accessible as an all-night convenience store, and a '24-7' help desk model may indeed be the standard that distance education programs will eventually need to adopt in order to retain their students.

Proactive Communications

Good communication is critical to the success of any LMS implementation, and very important in supporting students. A website with FAQs is an important communication tool, but it will not help students if they are not prompted to visit it. A number of the students in the pilot were initially unclear about whom to ask for help, when help would be available, and when the LMS would need to be taken offline for regular maintenance. Strategically placing the primary student LMS login on a page with high-priority announcements can be useful, especially when those announcements relate to technical maintenance or unplanned outages that may result in the LMS not being available. Additionally, using the announcements tool within the LMS to communicate with students is helpful. Finally, any information that can be given to students in advance, such as the semester's list of all planned technical maintenance windows, is important in easing student frustrations related to the LMS.

Conclusion

Planning for a campus-wide transition from one LMS version to another requires sufficient resources for supporting students transitioning to the new LMS. LMS implementation staff cannot assume that a new LMS will initially be easier to use or more satisfactory for students; in fact, some initial negativity and the need for extra assistance in the early stages of such a transition are to be expected, even if the students already have experience using LMSs. As previously discussed, at least four areas may need attention in order to support students in making a successful LMS transition:

•  adequate training for technical support staff

•  availability of training and support for students

•  expanded student support models, and

•  good communication strategies.

As researchers and practitioners continue to study, critique, and use LMSs, advocating for new functionality and better usability in light of the challenge of positively impacting student learning, higher education will continue to see LMSs “expand, evolve and morph into new educational environments” (Schrum, 2002, p. 20). Inevitably, this means that the majority of campuses will likely experience a transition from one LMS to another, or, minimally, an upgrade of an existing LMS. Regardless of an institution's student population, the transition from one LMS to another will be a change management challenge, and institutions will need to be adequately prepared to help students master these inevitable transitions.


References

Barbian, J. (2002). Great expectations. Training, 39, 102.

Barron, T. (2000, April). The LMS guess. Learning Circuits. Retrieved June 21, 2005, from http://www.learningcircuits.org/2000/apr2000/Barron.htm

Bates, A.W. & Poole, G. (2003). Effective teaching with technology in higher education. San Francisco: Jossey-Bass.

Chapman, B. (2002). Product shootout: Learning management systems. Training, 39, 54.

Chapman, D. (2005). Introduction to learning management systems. In C. Howard, J. Boettcher, L. Justice, K. Schenk, P.L. Rogers, & G.A. Berg (Eds.), Encyclopedia of distance learning: Volumes 1-4. Hershey, PA: Idea Group.

Cheal, C., Cummings, R., Fernandes, K., & Penney, M. (2006). Choices and changes: How four public universities are coping with the LMS market consolidation. Educause 2006 Annual Conference, Dallas, Texas. Retrieved March 7, 2007, from http://www.educause.edu/LibraryDetailPage/666?ID=EDU06315

Coates, H. (2005). Leveraging LMSs to enhance campus-based student engagement. Educause Quarterly, 1, 66-68.

Eastmond, D. (2000). Enabling student accomplishment online: An overview of factors for success in web-based distance education. Journal of Educational Computing Research, 23(4), 343-358.

Grant, M. M. (2004). Learning to teach with the web: Factors influencing teacher education faculty. Internet and Higher Education, 7, 329-341.

Green, K. C. (2002, December). Campus computing looks ahead: Tracking the digital puck. Syllabus, 16(5). Retrieved June 25, 2005, from http://www.campus-technology.com/article.asp?id=6986

Hall, J. (2003, January). Assessing learning management systems. Chief Learning Officer Magazine. Retrieved June 25, 2005, from http://www.clomagazine.com/content/templates/clo_feature_tech.asp?articleid=91&zoneid=72

Harrington, C.F., Gordon, S.A., & Schibik, T.J. (2004). Course management system utilization and implications for practice: A national survey of department chairpersons. Online Journal of Distance Learning Administration, 7(4). Retrieved June 23, 2005, from http://www.westga.edu/%7Edistance/ojdla/winter74/harrington74.htm

Howard, C. (2004). How to avoid the pitfalls of long unsuccessful LMS implementations. Bersin & Associates. Retrieved from http://www.bersin.com/tips_techniques/apr_04_lms_fourgotchas.asp

Howell, S.L., Williams, P.B. & Lindsay, N.K. (2003). Thirty-two trends affecting distance education: An informed foundation for strategic planning. Online Journal of Distance Learning Administration, 6(3). Retrieved June 26, 2005, from http://www.westga.edu/~distance/ojdla/fall63/howell63.html

Kvavik, R. B., Caruso, J. B., & Morgan, G. (2004). ECAR study of students and information technology: Convenience, connection, and control. Boulder, Colorado: Educause Center for Applied Research.

Lorenzetti, J.P. (2003, October 1). Close the gaps before your students fall through. Distance Education Report, 7(19), 2-3, 7.

Lynch, D. (2002, January 18). Professors should embrace technology in courses. Chronicle of Higher Education, B15.

Maslowski, R., Visscher, A., Collis, B., & Bloeman, P. (2000). The formative evaluation of a web-based course management system within a university setting. Educational Technology, 40(3), 5-19.

Morgan, G. (2003, May). Key findings: Faculty use of course management systems. Boulder, Colorado: EDUCAUSE Center for Applied Research. Retrieved July 8, 2004, from http://www.educause.edu/ir/library/pdf/ecar_so/ers/ERS0302/ekf0302.pdf

Nichani, M. (2001, May). LCMS = LMS + CMS [RLOs]. elearningpost. Retrieved June 21, 2005, from http://www.elearningpost.com/features/archives/001022.asp

Oliver, K. (2001). Recommendations for student tools in online course management systems. Journal of Computing in Higher Education, 13(1), 47-70.

Paulsen, M.F. (2002). An analysis of online education and learning management systems in the Nordic countries. Online Journal of Distance Learning Administration, 5(3). Retrieved June 23, 2005, from http://www.westga.edu/~distance/ojdla/fall53/paulsen53.html

Schrum, L. (2002). Oh, what wonders you will see: Distance education past, present and future. Learning and Leading with Technology, 30(3), 6-21.

Smith, B. (1998, November). Higher education and enterprise learning management systems. Converge Magazine. Retrieved July 22, 2005, from http://www.centerdigitaled.com/converge/?pg=magstory&id=4822

Swanson, R.C. (2005, August 4). North Carolina State University Distance Education (DELTA) Annual Report 2004-05. Distance Education and Learning Technology Applications, NC State University, Raleigh, NC.

Weigel, V. (2005, May/June). From course management to curricular capabilities: A capabilities approach for the next-generation CMS. Educause Review, 54-67.

Young, J. (2000, May 26). Distance education transforms help desks into '24-7' operations. Chronicle of Higher Education, 46(38), A49-A50.


Online Journal of Distance Learning Administration, Volume X, Number I, Spring 2007
University of West Georgia, Distance Education Center