Assessing Facilitator Performance as an Influence on Student Satisfaction


 

Scotty Dunlap
Eastern Kentucky University
Scotty.Dunlap@eku.edu

David May
Eastern Kentucky University
David.May@eku.edu


Abstract


Growth in class size within the online environment has resulted in a facilitator model in which an instructor teaches the class with the assistance of facilitators who interact with students in smaller groups. This research sought to determine the effectiveness of a structured performance evaluation for facilitators and its correlation with student satisfaction.

Background

While planning to expand an on-campus graduate program to include a fully online degree option, the faculty of one southeastern regional university believed that “if we build it, they will come” (Robinson & Puopolo, 1989). Subsequent to translating the curriculum for an online environment, marketing the program, and making the necessary course corrections, they did come. Enrollment in the program escalated from 12 students to approximately 300.

With the rapid growth of the online program came a new challenge: the need to ensure the ongoing quality of the online educational experience. The program has drawn students from a wide range of professions and experience levels. Students include:

  1. Firefighters
  2. Police officers
  3. Workplace safety professionals
  4. Workplace security professionals
  5. Healthcare professionals
  6. Military personnel

The caliber of these students placed a great expectation of quality on the educational experience. Though they had come, there was a need to ensure that the experience was positive and encouraged them to keep coming, both by retaining current students and by recruiting new ones.

Class enrollments have grown to as many as 150 students in a class. If enrollment exceeds 20 students, one facilitator is added for each additional increment of 20 students, and students are randomly assigned to groups. Each facilitator is responsible for directly engaging with students in their respective group and grading the work that is submitted. The professor is responsible for similar work within his or her group as well as for being accessible to all students in the class and ensuring that the facilitator groups are functioning effectively.

The use of facilitators creates a significant issue when seeking to ensure the quality of online courses. Students primarily interact with the facilitator as the class unfolds, so how well the facilitator executes these responsibilities can have a great impact on student satisfaction and success.

Literature Review

Though the use of facilitators in an online environment may be necessary based on an academic program’s strategy, it creates an area of critical concern in ensuring the integrity of the educational process. Facilitators play a critical role in executing the objectives of an online course. Wang (2008) identified four roles of facilitators in managing online discussions:

  1. Intellectual – lead online discussions through such activities as initiating questions to develop the intellectual ability of the student to analyze pertinent issues
  2. Social – create a comfortable environment in which open and constructive communication can occur
  3. Managerial – lead practical issues of online discussions, such as following up on those who have not participated and providing feedback to students
  4. Technical – assist students in using the online forum that is used to conduct the course

Curran et al. (2005) researched the use of facilitators by evaluating the volume of discussion board posts made by facilitators and students and the degree to which posts were accessed by students. They found:

“The overall mean postings made by participants in the MDcme programs was 8.8, and mean postings per participant on an individual program basis ranged between 1 and 33. The mean number of postings accessed by participants was 71.3, and the mean postings accessed by participants on an individual program basis ranged between 5 and 177.” (p. 243)

A Pearson r indicated a correlation between the volume of posts made and the number of discussion topics that were addressed. This research indicates the value of the facilitator in directing online discussion.

Braidman, Regan, and Cowie (2008) researched the use of student facilitators and online portfolios among first- and second-year medical students. Facilitators were selected from among third-year medical students. Upon reviewing the online interaction, they found that the level of discussion became increasingly complex and advanced. An analysis of student responses showed that the experience was viewed very positively.

Purpose

The research presented here sought to expand the discussion of the effectiveness of facilitators in online graduate courses. This research was driven by three themes that tended to surface in student evaluations of previous courses:

  1. Facilitators were not present in weekly online discussions
  2. Sufficient feedback was not provided for student assignment submissions
  3. Grades were not posted in a timely fashion

Methodology

Drawing on the philosophy of behaviorism (Skinner, 1953), the researchers recognized that there were no consequences for facilitators who performed poorly, apart from the most egregious cases, in which facilitators were not permitted to engage in future courses. A structured performance evaluation process was established that measured the performance of two facilitators across four classes in three areas:

  1. Assignments in online courses were due each Sunday evening. Facilitators were required to post grades with appropriate feedback by the following Wednesday evening to allow students time to make corrections prior to submitting the next assignment. Facilitator performance was measured as the percentage of grades posted by the assigned deadline.
  2. Facilitators were required to demonstrate presence in the weekly discussion board by making at least five posts per discussion thread. Facilitator performance was measured as the percentage of required posts that were actually made.
  3. Facilitators were required to make quality discussion board posts. “Quality” was defined as sharing a personal professional experience, articulating reasoning for agreeing or disagreeing with a student’s post, or posing probing questions accompanied by additional information.

Facilitator performance was measured based on the degree to which they accomplished performance expectations as well as student responses on course evaluations. This measurement could then be used to establish a priority order in which facilitators would be selected for future courses. This created a consequence for facilitators in that poor performance would result in being placed lower on the selection list for future courses.
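The scoring itself reduces to simple proportions and averages. The following Python sketch illustrates one way the three measures could be computed from a facilitator's activity records; the data structures, field names, and example values are illustrative assumptions rather than the instrument actually used in the study.

```python
# Illustrative computation of the three facilitator performance measures.
# Record structures and field names are hypothetical; the study's actual
# tracking instrument is not described at this level of detail.

def grade_posting_rate(assignments):
    """Share of assignment grades posted by the Wednesday deadline."""
    if not assignments:
        return 0.0
    on_time = sum(1 for a in assignments if a["posted_by_deadline"])
    return on_time / len(assignments)

def discussion_presence_rate(threads, required_posts_per_thread=5):
    """Posts made as a share of the required posts (five per discussion thread)."""
    required = required_posts_per_thread * len(threads)
    if required == 0:
        return 0.0
    made = sum(t["facilitator_posts"] for t in threads)
    return min(made / required, 1.0)

def mean_quality_posts(threads):
    """Average number of posts per thread that met the quality definition."""
    if not threads:
        return 0.0
    return sum(t["quality_posts"] for t in threads) / len(threads)

# Example with made-up values for a single class:
assignments = [{"posted_by_deadline": True}, {"posted_by_deadline": False}]
threads = [{"facilitator_posts": 5, "quality_posts": 4},
           {"facilitator_posts": 3, "quality_posts": 2}]
print(grade_posting_rate(assignments))      # 0.5
print(discussion_presence_rate(threads))    # 0.8
print(mean_quality_posts(threads))          # 3.0
```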

Findings

The first analysis examined how facilitators were rated under the performance evaluation process. Both individuals facilitated four classes during the course of this research. Table 1 presents a comparison of the mean class scores for each facilitator's grade posting performance, number of posts per discussion board thread, and quality of discussion board responses across the four classes. The results indicate that the effectiveness of the monitoring strategy varied drastically by facilitator in each of the areas under consideration. For Facilitator 1, implementing the monitoring strategy produced significantly higher means in Classes 3 and 4 than in Class 1 for each of the three performance areas and significantly increased the number of posts per thread from Class 1 to Class 2. For Facilitator 2, however, the only significant increase in performance between classes was in the quality of discussion board responses in Class 4; in fact, Facilitator 2 posted grades significantly more slowly in Class 2 than in Class 1, the opposite of the intended effect.

Table 1: Comparison of Mean Performance

Facilitator and Class     Mean Grade Posting   Mean Posts Per Thread   Mean Quality Responses
Facilitator 1, Class 1    .6145                .375                    .375
Facilitator 1, Class 2    .5829                1.875*                  1.438
Facilitator 1, Class 3    1.000***             5.000***                5.000***
Facilitator 1, Class 4    .9332***             5.000***                5.000***
Facilitator 2, Class 1    .8515                3.312                   1.688
Facilitator 2, Class 2    .5071*               3.000                   1.063
Facilitator 2, Class 3    .9000                3.091                   2.091
Facilitator 2, Class 4    .8347                4.429                   3.786**

*     Significantly different from Class 1 at .05
**    Significantly different from Class 1 at .01
***   Significantly different from Class 1 at .001
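The article does not state which statistical test produced the significance markers above. The sketch below shows one plausible approach, an independent-samples t-test comparing per-thread observations from a later class against Class 1; both the choice of test and the example values are assumptions made for illustration only.

```python
# Hypothetical illustration of the Class 1 vs. later-class comparisons in Table 1.
# The study does not specify its test; an independent-samples t-test on
# per-thread post counts is assumed, and the values below are placeholders.
from scipy import stats

class1_posts = [1, 0, 2, 1, 0, 1, 2, 1]   # placeholder per-thread post counts, Class 1
class3_posts = [5, 4, 5, 5, 5, 4, 5, 5]   # placeholder per-thread post counts, Class 3

t_stat, p_value = stats.ttest_ind(class1_posts, class3_posts)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # compare p against .05, .01, and .001
```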

The second analysis examined whether there was a correlation between student evaluation scores and the scores associated with the performance review. Tables 2 and 3 depict the scores for each facilitator.

Table 2: Facilitator 1 Student Evaluation and Performance Review Scores

Table 3: Facilitator 2 Student Evaluation and Performance Review Scores

These results reveal a positive correlation of .87 between student evaluation and performance review scores for Facilitator 1. For Facilitator 2, a negative correlation of .32 exists when evaluating all four classes; however, a positive correlation of .90 occurs when evaluating only Classes 1 through 3.
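The correlations reported here are standard Pearson coefficients computed over the paired per-class scores. A minimal sketch of that calculation is shown below; the paired values are placeholders rather than the actual scores from Tables 2 and 3, and scipy's pearsonr is used simply as the conventional way to obtain the coefficient.

```python
# Sketch of the Pearson correlation between per-class student evaluation scores
# and performance review scores for one facilitator. The four paired values are
# placeholders, not the scores reported in Tables 2 and 3.
from scipy import stats

student_eval_scores = [4.1, 4.3, 4.7, 4.6]      # placeholder: one score per class
performance_scores  = [0.45, 0.62, 0.95, 0.91]  # placeholder: one score per class

r, p = stats.pearsonr(student_eval_scores, performance_scores)
print(f"Pearson r = {r:.2f}")  # strongly positive for these placeholder values
```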

Discussion and Conclusions

This research presents the value of utilizing a structured performance evaluation process when facilitators are used to conduct online courses. The first analysis revealed that the process resulted in substantial improvement for Facilitator 1, while the results for Facilitator 2 raised additional questions. Upon exploring the differences between the two facilitators, it was found that Facilitator 2 experienced circumstances that limited participation in the course, such as having to work additional hours for the facilitator’s full-time employer over an extended period, leaving less time to devote to conducting the course.

The second analysis revealed a positive correlation of .87 for Facilitator 1 when comparing student evaluation scores to success on the performance evaluation. A positive correlation also existed for Facilitator 2 through the third class, but Class 4 presented contradictory results in that Facilitator 2 scored high on the performance evaluation and low on the student evaluation. A variable that may have affected the conflicting results in Class 4 for Facilitator 2 was feedback to students. Though Facilitator 2 did provide feedback, students had difficulty accessing it, either because of problems with the course management system or because they could not determine where in the system the feedback was located.

This research presented the potential benefit of implementing a structured performance evaluation process when facilitators are utilized in an online environment. Future research can be conducted to more fully develop the performance evaluation process to include other factors that impact student satisfaction, such as analyzing the quality of feedback provided to students on assignments.

References


Braidman, I., Regan, M., & Cowie, B. (2008). Online reflective learning supported by student facilitators. Medical Education, 42, 528-529.

Curran, V. et al. (2005). The nature of the interaction between participants and facilitators in online asynchronous continuing medical education learning environments. Teaching and Learning in Medicine, 17(3), 240–246.

Robinson, P. & Puopolo, L. (Directors). (1989). Field of dreams [Motion picture]. United States: Universal Pictures.

Skinner, B.F. (1953). Science and human behavior. New York: Macmillan.

Wang, Q. (2008). Student-facilitators’ roles in moderating online discussions. British Journal of Educational Technology, 39(5), 859-874.


Online Journal of Distance Learning Administration, Volume XIV, Number II, Summer 2011
University of West Georgia, Distance Education Center