Analyzing The Influences Of Course Design And Gender On Online Participation


Kenneth V. Anthony
Mississippi State University
kva3@msstate.edu

Abstract

Interaction is a critical component of successful online learning and, by extension, an important component of overall online program quality. The researcher studied the impact of course design on participation in an online university course. The participants were university students (n= 62, male= 33, female= 29). Their responses in online discussions were analyzed using a repeated measures factorial ANOVA, which found a statistically significant decrease in student participation in weeks when major assignments were due. The impact of assignments was similar for female and male participants. Measures of effect size indicated that course design accounted for more variation in online participation than gender. The key finding of the study was that course design can have a significant impact on the level of participation and therefore on student success in the online course. Ways to prevent or mitigate the impact of the reduction in student participation are presented.

Background

Parker (1999) identified interaction as a critical component of successful online learning. She emphasized that interaction must be planned to meet the needs of learners. Swan (2001), as cited in Eom, Wen, and Ashill (2006), stated, “Most students who reported higher levels of interaction with instructor and peers reported higher levels of satisfaction and higher levels of learning.” Eom et al. (2006) did not find a significant relationship between interaction and perceived outcomes, but they found that instructor feedback, a component of course interaction, did positively impact student satisfaction. Preventing a significant reduction in interaction during the conduct of an online course is important to maintaining the viability of online programs because interaction is closely related to attrition, and attrition to perceived program quality and viability (Angelino, Williams, & Natvig, 2007). Previous research indicated that attrition rates remain higher in online courses than in face to face courses (Angelino, Williams, & Natvig, 2007; Brown, 2012; Moody, 2004); therefore, all parties involved in online education should be concerned with identifying ways to increase interaction with the ultimate goal of decreasing attrition in online courses.

Online discussion is a significant part of many online classes and is often intended to replicate in-class face to face discussions. Parker (1999) identified “the presence of interactivity” as critical to the perception of quality instruction. Instructors strive to maximize participation of all students in online discussions. Previous research has been conducted to identify factors that increase student participation and satisfaction (Bullen, 1998; Eom et al., 2006).

Purpose Of The Study

The objective of the study was to determine whether students decreased their amount of participation during weeks when major assignments were due in the class, thus potentially reducing their enjoyment of and learning in the course. The researcher had noticed in his classes that there appeared to be less participation in weeks when major assignments were due, but he wanted to test whether there was a statistically significant difference in student participation in this situation.

The Importance Of Participation In The Online Class
           
Why is it important to know what impacts student participation, and is participation important to successful learning in an online course? Eom, Wen, and Ashill (2006) identified six factors that influence student satisfaction: a) course structure, b) self-motivation, c) learning styles, d) instructor knowledge and facilitation, e) interaction, and f) instructor feedback.

Of interest to this study is the finding that increased interaction leads to increased satisfaction. In many online courses the primary sources of interaction are the discussion forums. Reduced activity in the discussion forums therefore means reduced interaction and reduced satisfaction, which can result in increased dropouts and decreased performance and learning. As stated earlier, attrition rates for online courses remain higher than those for face to face courses and are also used to gauge the quality of online programming (Angelino, Williams, & Natvig, 2007; Brown, 2012; Moody, 2004).
           
In a review of online learning, Parker (1999) identified the presence of interactivity as important to the perceived quality of distance education. She wrote, “Integration of interactivity into the course content profoundly increases the potential for both enjoyment of learning and the enhancement of cognitive skills” (p. 16). Enjoyment and course satisfaction can affect student participation and continuation in online learning. Patterson and McFadden (2009) found that online programs had a higher attrition rate (23.5%) than campus based programs (4%). They reported that factors other than student characteristics impacted attrition, and they recommended further research to study characteristics that impact attrition and the tools to increase interaction. Patterson and McFadden’s findings, interpreted in light of Eom, Wen, and Ashill (2006) and Parker (1999), indicated that student satisfaction may be a reason that students drop out of online courses more than face to face courses. Because of this, course designers and instructors need to understand factors that they can directly influence that impact participation, including course design factors.
           
Finally, Song and McNary (2011) did not find a strong correlation between the number of posts (level of participation) and a student’s final course grade. They did find a relationship between one aspect of course design and the types of student interaction: when the instructor provided guidelines on how to interact, the students followed the guidelines, improving the quality of their participation. Clearly, participation alone does not guarantee a good grade, but as other research has indicated, it is a factor in student satisfaction, performance, and continued participation in a course.
           
If student satisfaction and performance can be linked to interaction, and by extension participation, then factors that negatively influence participation are important for course designers, administrators, and instructors to understand and consider when developing and teaching online courses. Traditional face to face three hour courses require between 45 and 50 classroom contact hours and a place based classroom that enables the instructor to actively monitor student attendance and participation on a scheduled basis. The compressed nature of some online courses and the asynchronous nature of participation make it difficult for many students to manage their time and easy for them to reduce participation or drop out of the course, undermining the educational goals of the course (Michinov et al., 2011; Patterson & McFadden, 2009). Because of this, it is important that all stakeholders in online education understand what factors influence student participation, particularly those that the instructor can directly address, including course assignments.

Factors That Influence Online Participation

           
Previous research has identified a variety of factors that influence participation in online courses. Michinov et al. (2011) found a negative relationship between procrastination and participation in online discussion forums. This finding further highlighted the relationship between participation and performance: students who participated less, because they were procrastinators, performed worse in the class than those who had higher levels of participation. The study thus described one factor that influences participation and also examined the effect of participation on performance.
           
Bullen (1998) conducted a qualitative study in which students reported a variety of factors that influenced their level of participation. One important factor was conceptual: the students did not understand that participation in an online course required more than just sitting in class as they did in a face to face class. Another factor was the abstract nature of the classroom. The idea that the course was open twenty-four hours a day and did not require students to come to a specific classroom at a specific time overwhelmed some students, making it difficult for them to manage their time and the course. Other students reported that they were not comfortable with the asynchronous nature of the online discussions. Bullen also reported that introverts were likely to be less active in online forums than extroverts.

Additionally, Bullen (1998) addressed two course design issues: a) assigning a grade for participation and b) the impact of assignments. Students reported that they often participated just to receive their participation grade, indicating that quality was not a concern. Students also stated that they reduced their level of participation when an assignment was due. Bullen’s research was qualitative in nature, and the current study was conducted to determine whether there was a statistically significant drop in participation related to major assignments.

Blau and Barak (2012) studied the impact of personality on discussion participation in synchronous online discussion. Validating Bullen (1998), they found that extroverts were more willing to participate and actually did participate more in online discussions than introverts. An additional finding indicated that students participated more in discussions that used “sensitive, intriguing, and challenging topics” (p. 20). Though it is difficult to address the specific personalities of students in an online environment, instructors can plan more interesting topics for discussion to encourage greater student participation.

Other researchers have focused on gender differences in online participation and techniques that instructors can use to encourage greater participation. Topcu (2006) reported no significant difference in the level of participation in online discussion forums based on gender when grade point average and internet experience were controlled. Barrett and Lally (1999) reported that male students made more and longer contributions to discussions than female students, but that female students’ contributions were more interactive. Blum (1999) found that males dominated distance education in much the same way they do in face to face environments.

Previous research has identified a variety of factors that can influence student participation in online discussions, including personality, understanding of how online instruction differs from face to face instruction, design issues, and gender. Of particular interest to the current study was Bullen’s (1998) qualitative finding that students self-reported reducing participation in a course due to an assignment. The goal of the current study was to determine whether students decreased their amount of participation at a statistically significant level during weeks when major assignments were due in the class.

Methods
           
Data were collected from four online U.S. history classes at a university that offers both online and traditional classes, primarily for military service members and their families. All four classes were taught by the same instructor using the same syllabus, controlling for variation based on instructor or type of assignments. Each class included the same number of content related discussions per week (two) and the same number of assignments. Each course was ten weeks long. Data from the first and last weeks were not included because those weeks were not related to course content. Of the remaining eight weeks, four had no major assignments due and four had major assignments due. The number of student responses in the online discussions for each week was collected. For each participant (n= 62, male= 33, female= 29), the number of discussion responses was averaged separately for the four weeks with assignments and the four weeks without assignments to create the dependent variable (level of participation). A factorial design was used to control for the effect of gender on participation because previous research had indicated a difference in the level of participation due to gender (Barrett & Lally, 1999; Blum, 1999). A repeated measures factorial [2 (assignment, no assignment) x 2 (male, female)] ANOVA was used to test the following null hypotheses:

Within-subjects:
            H0: Mean postings for the weeks with assignments (yes) = Mean postings for the weeks with no assignment (no)

Between-subjects:
            H0: Mean male postings per week = mean female postings per week
            Interaction: There is no interaction between gender and assignment.

The dependent variable was the level of participation in online class discussion as measured by mean postings per week. The within-subjects factor was whether or not an assignment was due during the week. The between-subjects factor was the gender of the student.
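To make the design concrete, the analysis described above maps onto a standard mixed-design (repeated measures factorial) ANOVA. The following sketch is illustrative only and is not the author's original analysis; it assumes the data have been arranged in long format with one row per participant per assignment condition, uses hypothetical column names (subject, gender, assignment, posts), and relies on the open-source pingouin library.

```python
# Illustrative sketch of the 2 (assignment) x 2 (gender) mixed-design ANOVA.
# Assumes a long-format file with hypothetical columns:
#   subject    - participant identifier
#   gender     - 'male' or 'female' (between-subjects factor)
#   assignment - 'yes' or 'no' (within-subjects factor)
#   posts      - mean weekly postings for that participant and condition
import pandas as pd
import pingouin as pg

df = pd.read_csv("participation.csv")

# 'within' is the repeated factor, 'between' is the grouping factor.
aov = pg.mixed_anova(
    data=df,
    dv="posts",
    within="assignment",
    subject="subject",
    between="gender",
)
print(aov.round(3))
```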

Results
           
The data did not meet the assumptions of equality of variance or normality. The researcher conducted a natural log transformation, resulting in re-expressed data that met the required assumptions for repeated measures ANOVA. Levene’s test of equality of variance indicated that the assumption of homogeneity of variance was met for both levels of the IV, differences in assignment (average number of responses per week): assignment p= .879 and no assignment p= .066. Visual inspection of the data and the Kolmogorov-Smirnov test of normality indicated that the data were normal for both levels of the IV across the between-subjects factor (gender): assignment female p= .200, assignment male p= .132, no assignment female p= .061, and no assignment male p= .200. One area of concern was that the Shapiro-Wilk test of normality indicated that the data did not meet the assumption of normality in one cell (assignment, female, p= .027) but were normal in the other cells: assignment male p= .139, no assignment female p= .572, and no assignment male p= .180. The assumption of sphericity was met because there were only two levels of the within-subjects factor.
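A rough sketch of how these assumption checks could be reproduced is shown below. It reuses the hypothetical long-format data frame from the earlier example and the scipy library; note that the Kolmogorov-Smirnov results reported above come from SPSS, which applies the Lilliefors correction, so a plain scipy K-S test would not match exactly and the Shapiro-Wilk test is used here instead.

```python
# Sketch of the assumption checks on the natural-log re-expressed data.
# Column names follow the hypothetical ones used in the earlier example.
import numpy as np
import pandas as pd
from scipy import stats

df = pd.read_csv("participation.csv")   # subject, gender, assignment, posts
df["ln_posts"] = np.log(df["posts"])    # natural log transformation

# Levene's test of homogeneity of variance across gender,
# run separately for each level of the within-subjects factor.
for level in ["yes", "no"]:
    male = df.query("assignment == @level and gender == 'male'")["ln_posts"]
    female = df.query("assignment == @level and gender == 'female'")["ln_posts"]
    print("Levene (assignment =", level, "):", stats.levene(male, female))

# Shapiro-Wilk normality test within each assignment-by-gender cell.
for (level, gender), cell in df.groupby(["assignment", "gender"]):
    print("Shapiro-Wilk", level, gender, stats.shapiro(cell["ln_posts"]))
```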
           
Repeated measures factorial ANOVA at the .05 alpha level resulted in a statistically significant difference in the average number of student responses depending on whether an assignment was due the week of the discussion, F(1, 60)= 69.789, MSE= 13.485, p< .001. Cohen’s f= 1.078 indicated a large effect size for assignment versus no assignment. In the weeks that students had major assignments, the average number of responses (M= 2.539, SD= .099, n= 62) was lower than in the weeks that students did not have major assignments (M= 3.200, SD= .050, n= 62). There was also a statistically significant difference at the .05 alpha level between males (M= 2.716, SD= .092, n= 33) and females (M= 3.023, SD= .098, n= 29) in the average number of responses overall, F(1, 60)= 5.194, MSE= 2.915, p= .026. Cohen’s f= .7639 indicated a large effect size for gender. There was no statistically significant interaction between differences in assignment and gender, F(1, 60)= 1.630, MSE= .315, p= .207. Measures of effect size for the main effects of assignment (Cohen’s f= 1.078, eta squared= .538) and gender (Cohen’s f= .7639, eta squared= .369) indicated that assignment accounted for a larger amount of explained variance than gender in the average number of responses by students. The data reported are expressed in natural log units. For ease of interpretation, geometric means transformed from the natural log means and confidence intervals, along with the corresponding natural log values, are included in Table 1.

Table 1
Logs and antilogs for selected data

Variable               Mean       Confidence Interval
                                  Lower        Upper
Assignment*            12.67      10.4         15.44
No Assignment*         24.53      22.12        27.11
Assignment LN           2.539      2.342        2.737
No Assignment LN        3.200      3.100        3.300
Female*                20.55      16.89        25.03
Male*                  15.12      12.58        18.19
Female LN               3.023      2.827        3.22
Male LN                 2.716      2.532        2.901

*Geometric mean
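For readers who wish to verify the arithmetic, the sketch below shows how Cohen's f follows from the reported eta squared values and how the geometric means in Table 1 can be recovered by exponentiating the natural log values. It uses only the statistics reported above; the function name is illustrative, and small rounding differences from the reported figures are expected.

```python
# Sketch of the effect-size and back-transformation arithmetic reported above.
import math

def cohens_f(eta_squared: float) -> float:
    """Cohen's f computed from eta squared: f = sqrt(eta^2 / (1 - eta^2))."""
    return math.sqrt(eta_squared / (1 - eta_squared))

print(round(cohens_f(0.538), 3))  # assignment main effect: ~1.079
print(round(cohens_f(0.369), 3))  # gender main effect: ~0.765

# Back-transforming natural log values to geometric means (first row of Table 1).
ln_mean, ln_lower, ln_upper = 2.539, 2.342, 2.737
print(round(math.exp(ln_mean), 2))                                  # ~12.67 posts
print(round(math.exp(ln_lower), 2), round(math.exp(ln_upper), 2))   # ~10.4 and ~15.44
```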

Discussion
           
The results indicated that the presence of a major assignment significantly reduced the level of participation in the online courses. This is important because it confirms the researcher’s initial hypothesis and reinforces previous findings (Bullen, 1998) that course design can have a significant impact on level of participation. Unlike in a face to face course, where students might simply attend less frequently when a major assignment is due, these students’ reduced participation occurred in one half of the course time (four out of eight weeks measured in a ten week course), indicating a significant reduction in participation and therefore a potential reduction in learning. It is encouraging that the impact of assignments was similar for males and females in light of the previous emphasis on the role of gender in online participation. Reducing the amount of participation is one area where students, regardless of gender, behaved in a similar way in an online environment. This finding is in contrast to Barrett and Lally (1999) and Blum (1999), which indicated that males dominated online participation.
           
Controlling for gender allowed the researcher to compare the relative impact of course design and gender. Females consistently participated at a higher rate than their male counterparts, but the effect size for the impact of assignment accounted for more variation than the differences in gender, indicating that course design has a larger impact on student participation in online courses than gender. This finding lends credence to the idea that “gender based access and computer literacy levels… are disappearing problems” (Gunn, 2003, p. 27).
           
The idea that student participation drops off significantly during the weeks when major assignments are due is troubling when one considers the importance other scholars (Eom, Wen, & Ashill, 2006; Parker, 1999) have placed on participation in online learning. If participation is the key to success or at least enjoyment and perceived success, then anything that reduces the amount of participation should be a concern. The issue here is that the culprits are assignments. Assignments are necessary to evaluate student progress and understanding, but the act of completing these necessary tasks results in less participation.   

Limitations

           
The study has several limitations that may restrict both the generalizability of the findings and the level of analysis. The study was completed in a student population that may differ from other online student populations (these students are mostly members of the military or military dependents); therefore, the ability to generalize may be limited in some readers’ minds. Additionally, the number of posts was used as the measure for level of participation rather than the total number of words or some other metric. The researcher deliberately focused on the number of posts rather than the quality of posts, so this study did not address changes in the quality of posts. Further study should evaluate the quality of the posts to give a more nuanced understanding of the impact of assignments on level and quality of participation.

Implications

Previous research has focused on factors instructors have little control over and has required instructors to make accommodations for expected differences in students’ participation based on gender or other student characteristics. The findings of the current study should encourage course instructors, administrators, and designers, because there is evidence that some of these external factors may have less influence on participation than actual course design. Patterson and McFadden (2009) recommended that further research be conducted to study characteristics that impact attrition and tools to increase interaction. This article addressed one specific characteristic of online courses and its impact on interaction. The results of this study indicated that there may be positive steps that instructors can take in course design to influence the amount of student participation in online courses.

Since the presence of major assignments leads to lower student participation, what options exist for administrators, course developers, and instructors to prevent this reduction or mitigate its effects? One way to prevent the reduction is to make a direct link between the assignment and at least one discussion during the week the assignment is due. Instructors can further tie discussions to the major assignment by including participation in a discussion related to the content covered in the assignment as one of the criteria for assessment. This would increase the value of the discussion to the students by linking it directly to the content covered in the assignment and linking part of the overall grade to student participation in the discussion related to the assignment. In addition to linking a discussion and the assignment, the instructor can plan more sensitive and intriguing discussions during weeks with assignments (Blau & Barak, 2012) to encourage student participation.

Other ways to prevent a decrease in participation include actions by the instructor to support student participation and time management. Instructors can encourage students to spread the assignment work out over the intervening weeks between major assignments by requiring students to turn in intermediate steps of the assignment, mitigating the impact of procrastination on participation (Michinov et al., 2011). This support would make it less likely that students would be forced to make an opportunity cost decision between time spent in discussions and time spent completing an assignment at the last minute.

Instructors can also help students monitor their own level of participation. Active and early intervention that informs students of their level of participation in relation to others and over time has the potential to increase overall participation as well as participation during weeks when students have assignments due (Michinov et al., 2011). Further, instructors could heed Song and McNary’s (2011) research by providing guidance at the beginning of the course on how to participate effectively and by providing timely reminders during the weeks when major assignments are due to encourage students to maintain their level of participation.

Increasing student collaboration on assignments and in online discussions can encourage more participation (Michinov et al., 2011) in weeks when students have major assignments. One technique that the researcher used in his courses after the study was completed was to conduct some discussions within existing study groups (originally set up for group assignments). Because students had developed close relationships with their study group members, they felt more responsibility to the group to participate in the discussions. Finally, instructors can accept that many students will have a lower level of participation in the online discussions during the weeks that assignments are due and adjust the entire course plan to accommodate these reductions.

The findings of the study are significant to instructors, program designers, and program administrators. Of the many factors that influence student participation and ultimately student attrition, course design is one factor that all parties can directly influence. Instructors can adopt instructional practices that maintain adequate student participation, program designers can design courses that take into account the impact of major assignments on participation, and administrators can develop and enforce policies related to course design and specifically assignments that support maintaining student participation.

Further research is needed into other elements of course design that influence participation to help course designers and instructors maximize student participation regardless of students’ backgrounds. This article has proposed several ways to potentially reduce the variance in participation between weeks when major assignments are due and weeks when they are not, but additional research is necessary to validate these and other interventions.

Conclusion

The study described in this article adds to the understanding of the factors that affect student participation in online learning and, by extension, the overall quality of online programs. The research grew out of an instructor’s practice and his concern about the impact of assignments on discussions. The findings suggest that an element of course design, the presence of a major assignment, can reduce the amount of participation in an online course. This finding validates the researcher’s original hypothesis and reinforces earlier research in which students reported reducing the amount of their participation due to assignments (Bullen, 1998). These findings held true for both males and females. The findings challenge online instructors, administrators, and course designers to find ways to balance course assignments with the desire to maintain the levels of course interaction that are key to student success in online courses. The researcher presented several ways to prevent or mitigate the impact of the reduction in participation during the weeks when assignments are due. As online learning continues to grow as an option for college students, an increased understanding of the factors that contribute to students’ level of participation, and consequently their success in these types of courses, is important to administrators, course designers, and instructors.


References

Angelino, L., Williams, F., & Natvig, D. (2007). Strategies to engage online students and reduce attrition rates. The Journal of Educators Online, 4 (2).

Barrett, E., & Lally, V. (1999). Gender differences in an on-line learning environment. Journal of Computer Assisted Learning, 15, 48-60.

Blau, I., & Barak, A. (2012). How do personality, synchronous media, and discussion topic affect participation? Educational Technology & Society, 15 (2), 12-24.

Blum, K. (1999). Gender differences in asynchronous learning in higher education: Learning styles, participation barriers and communication patterns. The Journal of Asynchronous Learning Networks, 3 (1), 1-21.

Brown, J. (2012). Online learning: A comparison of web-based and land-based courses. The Quarterly Review of Distance Education, 13 (1), 39-42.

Bullen, M. (1998). Participation and critical thinking in online university distance education. Journal of Distance Education, 13 (2).

Eom, S.B., Wen, H.J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education, 4 (2), 215-235.

Gunn, C. (2003). Dominant or different? Gender issues in computer supported learning. The Journal of Asynchronous Learning Networks, 7 (1), 14-30.

Michinov, N., Brunot, S., LeBohec, O., Juhel, J., & Deleval, M. (2011). Procrastination, participation, and performance in online learning environments. Computers and Education, 56, 243-252.

Moody, J. (2004). Distance education: Why are attrition rates so high? The Quarterly Review of Distance Education, 5 (3), 205-210.

Parker, A. (1999, Autumn/Winter). Interaction in distance education: The critical conversation. Educational Technology Review, 13-17.

Patterson, P., & McFadden, C. (2009). Attrition in online and campus degree programs. Online Journal of Distance Learning Administration, 12 (2).

Song, L., & McNary, S.W. (2011). Understanding students’ online interaction: Analysis of discussion board postings. Journal of Interactive Online Learning, 10 (1), 1-14.

Topcu (2006). Gender difference in an online asynchronous discussion performance. The Turkish Online Journal of Educational Technology- TOJET, 5 (4), 44-51.

