Student Access to Online Interaction Technologies: The Impact on Grade Delta Variance and Student Satisfaction


Mark Revels
Western Kentucky University
mrevels2@earthlink.net

Mark Ciampa
Western Kentucky University
mark.ciampa@gmail.com


Abstract


Online learning has significantly changed the educational landscape in recent years, offering advantages to both schools and students.  Although some faculty members are not supportive of online learning, researchers have demonstrated that online learning can be as effective as classroom learning.  Researchers have also noted the need to use metrics to assess the value achieved through the use of online learning.  This study measured the impact that student access to interactive technologies (discussion boards, e-mail, chats, videoconferencing, etc.) played in an online course.  Would restricting these technologies have an impact on grade delta variance and students’ perceived satisfaction? The results of this study indicate that in an online course, student access to a variety of student-to-student collaborative technologies had no impact on five of the seven student survey questions or on grade delta variance.  Lack of access to the interactive technologies had an impact on only two survey questions, namely “I have learned a lot in this course” and “My instructor treats me fairly.”  Students in the restricted class responded more positively on these two questions. 

Literature Review

Definition Of Online Learning


Advances in technology and the Internet have changed the way in which individuals access and use information. This technology can enhance education delivery and knowledge acquisition in the form of online learning, in which learners and faculty members are at a distance from one another but are connected through the Internet by a learning management system (LMS) such as Blackboard or WebCT (Sava, 2005).  Educational institutions across the world now offer classroom instruction through an LMS (Dorado, Hernandez, Sani, Griffin, & Barnette, 2009).  Online learning is increasing at a rapid pace: a 2010 higher education study revealed that over 4.6 million students were taking at least one online course during the Fall 2008 semester, a 17 percent increase from the previous year.  This growth far exceeded the 1.2 percent increase of the overall higher education student population (Allen & Seaman, 2010).  Chawdhry, Paullet, and Benjamin state that an increasing number of students are enrolling in online learning to complete their degrees, compete in today’s job market, and advance in their careers (2011).

Advantages Of Online Learning

Online learning offers advantages to both schools and students.  For schools, online courses offer innovative ways to target adult learners who want to continue their education but are constrained (Coppola, Hiltz, & Rotter, 2002).  Lessen and Sorensen identify these constraints as work schedule, family, and time (2006). Additional constraints, such as distance, cost, time, job requirements, and family demands, can preclude students from attending traditional classes. Online courses enable students to balance the demands of their daily lives by setting their own schedule for learning (Chawdhry, Paullet, & Benjamin, 2011).  Because of this flexibility, most schools are able to reach a larger student population, thus increasing their enrollments in times of decreasing financial support from external entities.  One population segment that benefits from online learning is adult learners, who can engage in just-in-time skill acquisition without time and location constraints (Zhang, 2004).

Online learning also offers significant advantages to students.  First, as previously noted, online courses provide learning opportunities for individuals who would otherwise not have them (Deal, 2002).  One study indicated that students prefer online learning because this delivery mechanism allows them to balance their life demands while attending school: almost 88 percent of students chose online learning because other commitments prevented them from attending courses on campus in classrooms (Hannay & Newvine, 2006). 

Second, students perceive online learning and its associated technology as a strategic advantage.  An entire body of literature reflects the importance students place on the Internet in their academic careers (Budden, Anthony, Budden, & Jones, 2007).  Almost 72 percent of students reported a preference for interacting online instead of face-to-face (F2F) with admission counselors (Hayes, Ruschman, & Walker, 2009).  Technology, particularly social networking, is especially important to graduate students, who recognize its benefits to their career development (Benson, Filipaios, & Morgan, 2010). 

Third, students view online courses as convenient and beneficial.  An exploratory study of factors that influence a student’s decision to take online courses was conducted in 2009 by examining four key elements: convenience, level of difficulty, effectiveness, and social interaction. Students perceived convenience and effectiveness as positive influences on their decision to take online courses, and level of difficulty and social interaction as negative influences.  The study revealed that convenience was the major factor that influenced a student to take online courses (Dorado et al., 2009).  Another study indicated that 59 percent of students surveyed reported that their grades were higher in online courses than in traditional courses, and overall 70 percent of students indicated that they preferred online courses. One interesting finding of this study was that 90 percent of students said that they read the textbooks associated with their online courses, while only 60 percent of students in traditional classes read the textbooks (Hannay & Newvine, 2006).

Fourth, online courses are seen as enhancements to communication and interaction.  A 2008 study of student perceptions of various components of the Blackboard LMS found an increased level of communication and interaction in online classrooms. Over 63 percent of students indicated increased learner-to-instructor interactions, almost 62 percent agreed that there was a significant increase in the overall volume of communications in the online classroom, and 52 percent said that the LMS fostered a sense of community in the course.  The respondents also found online learning to be effective and accessible: 68 percent stated that the online discussions helped them to understand and assimilate the course content, while almost 80 percent preferred submitting assignments online.  Students also liked the functionality of online learning: almost 81 percent agreed that the LMS makes classroom handouts readily available and accessible (Buzzetto-More, 2008). 

Is Online Learning Viable?

Although many faculty members are supportive of online learning, some believe technology cannot improve teaching and learning (Cheng & Miller, 2009).  In addition, they believe that online courses are inferior to classroom courses in terms of quality and learning outcomes (Anstine & Skidmore, 2005).   Some faculty members note that the benefits of online learning may be outweighed by disadvantages such as the lack of peer interaction and less dynamic modes of instruction (Welsh, Wanberg, Brown, & Simmering, 2003). 

Yet researchers have demonstrated that online learning is as effective as F2F learning (Neuhauser, 2002). An analysis by the Department of Defense's Advanced Distributed Learning Initiative and the University of Tulsa found the learning effectiveness of online courses comparable to that of classroom instruction (Sitzmann, Kraiger, Stewart, & Wisher, 2006).

There may be several reasons why some faculty members are resistant to online learning.  The implementation of online courses should be done only after careful analysis of the online learning environment and of the online student’s characteristics (Singleton, Hill, & Koh, 2004).  Without this analysis, online courses can change the traditional student-teacher relationship from personalized attention to “just another number,” with the result that online instruction is less effective than the traditional classroom (Bressler, Bressler, & Bressler, 2011).  Online instructors should understand that adults prefer to be actively engaged and involved in the learning process and come ready to learn what they need to know in order to cope effectively with their real-life situations (Knowles, 1980).

Constructivism And Collaborative Learning

One approach to online learning that is currently being promoted is called “constructivism.”  Carlson states that constructivism has gained a foothold in education, including traditional higher education (2001).  Constructivism holds that for learning to occur, learners must construct their own understandings of the world in which they live (Brooks & Brooks, 1993). Individuals “construct” new learning based on their past experiences, motives, and intentions. In short, learning is inherently personal, built sequentially upon a scaffold of experiences, and deepening in complexity as learners develop and gain new information and understandings. Education becomes a conceptual change, not just the acquisition of information (Biggs, 1999).  Constructivism encourages faculty members to cultivate a learning environment by infusing students with a desire to engage in learning experiences that are self-directed, self-reflective, interactive, and collaborative.  Self-paced and autonomous learning, which are key principles of constructivism, are enhanced in an online learning environment where students can engage in learning anytime, anywhere, and at their own pace (Bellefeuille, 2006).  Faculty members can incorporate technology to elevate a student’s cognitive level with modeling, support, and fading.  Modeling provides students with adequate learning structures, leading students to the desired learning behavior. Supporting provides students with feedback so that students can independently perform tasks or assignments.  Fading reduces the amount of support over time so that students become confident and self-reliant (Bellefeuille, 2006). Using constructivism in online learning, faculty members reposition themselves as facilitators whose collaborative presence invites peer interaction and participation among learners in a virtual learning environment (Conrad, 2002). In addition, online faculty members foster a supportive, collegial, collaborative, and interactive learning environment to enhance the sense of community by providing students with material and technology resources (De Simone, 2006).

Another approach to online learning is collaborative learning, a relationship among learners that requires positive interdependence (a sense of sink or swim together), individual accountability (everyone has to contribute and learn), interpersonal skills (communication, trust, leadership, decision making, and conflict resolution), face-to-face promotive interaction, and processing (reflecting on how well the team is functioning and how to function even better) (Srinivas, 2010).  Collaborative learning differs from cooperative learning, which holds that people who help each other and who join forces to achieve a common goal will generally grow to feel more positively about each other and will be willing and able to interact constructively when performing a collective task (Sharan, 1985). Collaborative learning develops higher-level thinking skills, promotes learner-leader interaction and familiarity, builds self-esteem in learners, and promotes a positive attitude toward the subject matter (Srinivas, 2010).  Collaborative learning helps to maximize student achievement through personalized learning and assessment while adhering to compliance and government regulations (Ogunlade, 2011). 

Metrics To Assess Online Learning

Koch notes that it is important to use metrics such as student learning, reduced cost, user satisfaction, and other similar measurements to assess the value achieved through the use of online learning (2006). There is extensive research that attempts to understand and measure what influences student satisfaction, attention, and retention in an academic environment (Li, Finley, Pitts, & Guo, 2011). Studies have indicated that student engagement in college activities outside the classroom and interactions with other students and faculty tend to have a substantial impact on student retention, academic performance, and overall satisfaction (Astin, 1999). Kuh found that participation in college activities, living on campus, and conversing frequently with other students and faculty positively influenced students’ learning and personal development (1995).

The most common forms of communication used by faculty to facilitate interaction with students include asynchronous communication (e.g., email and online discussion boards) and synchronous communication (e.g., chat or instant messaging) (Li et al., 2011). The majority of research related to the use of asynchronous communication in higher education has focused on online learning that uses Web-based communication technologies to deliver course content virtually and involves extensive student-instructor communication (Dezhi, Bieber, & Hiltz, 2009). 

Study

Following Koch’s statement regarding the need to use metrics to assess the value achieved through the use of online learning, this study measures the impact that student access to interactive technologies plays in an online course.  Specifically, the study looks at two metrics: grade delta variance and the student’s perceived satisfaction with the course and instructor.  In many online courses students use a variety of online interactive technologies to collaborate with other students.  Would restricting these technologies have an impact on grade delta variance and the student’s perceived satisfaction?

The primary research hypothesis is as follows:

H0 – No significant difference in grade delta variance or satisfaction exists with regard to the use of online interaction technologies. 

H1 – Students who use interaction technologies demonstrate a higher grade delta variance and satisfaction.

Student participants in the study were undergraduate students at a mid-South public university enrolled in two separate online courses, “Database Systems II” and “Telecommunications II,” over two consecutive semesters.  Thus the study looked at students enrolled in a total of four classes.  In the first semester, students enrolled in these two courses had full access to a variety of student-to-student collaborative technologies through which they could interact with all other students on both a scheduled and an ad-hoc basis.  For example, multiple discussion boards were available for students to post personal information about themselves as well as to ask questions of and receive answers from other students.  In addition, students had full access to e-mail, chat, and videoconferencing tools.  These were called the “Open” sections.  In the second semester, students who enrolled in these two courses had no access to student-to-student collaborative technologies.  All other activities were the same.  These were called the “Closed” sections.

In each of the four courses students took a pre-assessment test at the beginning of the course and a final exam at the end of the course.  These were used to measure grade delta variance.  In addition, students completed a seven-question satisfaction survey of the course and instructor at the end of the course.  Using a 5-point Likert scale, the survey examined student perceptions regarding the following seven statements:

  1. My instructor displays a clear understanding of course topics
  2. My instructor is well prepared for class
  3. Performance measures (exams, assignments, etc.) are well constructed
  4. My instructor provides helpful feedback
  5. Overall, my instructor is effective
  6. I have learned a lot in this course
  7. My instructor treats me fairly

The purpose of this study was to measure the grade delta variance for final grades and student satisfaction responses comparing “Open” (full access to student-to-student collaborative technologies) and “Closed” (no access) courses.

Results

Initially, a series of descriptive statistics was computed on these data. The same variables included in the independent-samples t-tests were included in these initial analyses: delta grade (change in grade from the pre-assessment test to the final grade, using the final grade range) and survey questions 1 through 7. Table 1 presents the results of these initial descriptive statistics, which consist of the valid and missing sample sizes for each measure, as well as the mean, median, standard deviation, and minimum and maximum scores.
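
The paper does not name the statistics package used, but the following is a minimal sketch of how these descriptive statistics could be reproduced in Python with pandas. The file name course_data.csv and the column names (delta_grade, q1 through q7) are hypothetical placeholders, not taken from the study’s actual dataset.

    import pandas as pd

    # Hypothetical file and column names; one row per student.
    df = pd.read_csv("course_data.csv")
    measures = ["delta_grade"] + ["q%d" % i for i in range(1, 8)]

    summary = pd.DataFrame({
        "N (Valid)":   df[measures].count(),       # non-missing observations
        "N (Missing)": df[measures].isna().sum(),  # missing observations
        "Mean":        df[measures].mean(),
        "Median":      df[measures].median(),
        "Std. Dev.":   df[measures].std(),
        "Minimum":     df[measures].min(),
        "Maximum":     df[measures].max(),
    })
    print(summary.round(3))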

Next, a series of tests was conducted to determine the extent of normality associated with these measures. While the t-test does not require a perfectly normal distribution at larger sample sizes, and markedly non-normal data can in fact be used without producing invalid results, it is good practice to first assess the normality of any measures analyzed. Table 2 presents the results of a series of one-sample Kolmogorov-Smirnov tests of normality conducted on these data. A statistically significant result on this test indicates significant non-normality, while a non-significant result indicates a normal distribution. As indicated in Table 2, change in grade did not achieve statistical significance, indicating normality. However, the remaining measures (i.e., all seven survey questions) were found to be significantly non-normal on the basis of this analysis.
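
As an illustration only, a one-sample Kolmogorov-Smirnov test against a normal distribution can be run as sketched below, continuing the hypothetical DataFrame above. The test-statistic values in Table 2 are consistent with an SPSS-style K-S Z statistic (the square root of n times the D statistic), so that quantity is derived here as well; note that estimating the normal parameters from the sample is an assumption that strictly calls for a Lilliefors-type correction to the p-value.

    import numpy as np
    from scipy import stats

    for col in measures:
        x = df[col].dropna()
        # Compare the sample against a normal distribution whose mean and
        # standard deviation are estimated from the sample itself.
        d, p = stats.kstest(x, "norm", args=(x.mean(), x.std()))
        z = np.sqrt(len(x)) * d  # SPSS-style K-S Z statistic
        print(f"{col}: Z = {z:.3f}, p = {p:.3f}")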

Additionally, Table 3 presents measures of skewness and kurtosis calculated to further explore the normality of these measures. A measure of skewness or kurtosis divided by its standard error is generally accepted as indicating substantial skewness or kurtosis when its absolute value exceeds 3. As indicated in Table 3, these measures were found to have substantial negative skewness as well as substantial positive kurtosis.
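
A sketch of this screening, under the same hypothetical data assumptions as above, follows; the standard-error formulas are the usual normal-theory approximations, which reproduce the .350 and .688 values reported in Table 3 for n = 46.

    import numpy as np
    from scipy import stats

    for col in measures:
        x = df[col].dropna()
        n = len(x)
        skew = stats.skew(x, bias=False)                   # adjusted skewness (G1)
        kurt = stats.kurtosis(x, fisher=True, bias=False)  # excess kurtosis (G2)
        se_skew = np.sqrt(6.0 * n * (n - 1) / ((n - 2) * (n + 1) * (n + 3)))
        se_kurt = 2.0 * se_skew * np.sqrt((n ** 2 - 1) / ((n - 3.0) * (n + 5)))
        # Ratios with absolute value above 3 flag substantial non-normality.
        print(f"{col}: skew/SE = {skew / se_skew:.3f}, kurt/SE = {kurt / se_kurt:.3f}")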

Furthermore, a series of histograms was constructed to visually illustrate the distribution of these measures; these histograms are included in the appendix. Table 4 presents the descriptive statistics associated with the independent-samples t-tests conducted. As presented in the table, the sample size, mean, standard deviation, and standard error of the mean are reported for each of these measures separately by class. Change in grade was found to be lower in the closed class, while scores on all seven survey questions were found to be higher in the closed class than in the open class.

Finally, Table 5 presents the results of the independent-samples t-tests conducted. The results presented in this table consist of Levene's test for the equality of variance, the independent-samples t-test, the mean difference and standard error of the difference between groups, and the 95% lower and upper confidence limits. A statistically significant Levene's test indicates that the assumption of the equality of variance has been violated, in which case the results of the t-test in which equal variances are not assumed should be used. With regard to these analyses, differences between classes were found to be significant at the .10 alpha level for survey questions 6 and 7. In both cases, the closed class had a significantly higher score on these measures than the open class. None of the remaining independent-samples t-tests achieved statistical significance.
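
This final step could be reproduced along the lines of the sketch below, still using the hypothetical DataFrame from the earlier sketches. The class_type column coding each row as "open" or "closed" is an assumed placeholder, and keying the equal-variance decision to a .05 Levene threshold is a common convention rather than something the paper states.

    from scipy import stats

    # Hypothetical grouping column coding each student's section.
    open_grp = df[df["class_type"] == "open"]
    closed_grp = df[df["class_type"] == "closed"]

    for col in measures:
        a = open_grp[col].dropna()
        b = closed_grp[col].dropna()
        # Levene's test for the equality of variances between the two groups.
        lev_stat, lev_p = stats.levene(a, b)
        # If variances differ significantly, fall back to Welch's t-test.
        t, p = stats.ttest_ind(a, b, equal_var=(lev_p >= 0.05))
        print(f"{col}: Levene p = {lev_p:.3f}, t = {t:.3f}, p = {p:.3f}")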

In conclusion, the results of these analyses indicated that the closed class had a significantly higher score on survey questions 6 and 7 as compared with the open class. However, no other significant results were found. This indicates that with regard to the majority of the survey questions, as well as with regard to the change in grade from pretest and posttest scores, no significant differences were indicated between open and closed classes. Levene’s test was found to be significant in many cases, indicating significant differences between classes with regard to the variation in scores on a number of the survey questions.

Conclusion And Future Study

The results from this study indicate that in an online course, student access to a variety of student-to-student collaborative technologies (multiple discussion boards for posting personal information and exchanging questions and answers with other students, along with e-mail, chat, and videoconferencing tools) had no impact on five of the seven survey questions or on grade delta variance.  Lack of access to the interactive technologies had an impact on only two survey questions, namely “I have learned a lot in this course” and “My instructor treats me fairly.”  Students in the closed class had a significantly higher score on these two questions.  It is difficult to surmise precisely why students in the closed sections scored higher on these questions than students in the open sections.  A casual observer might suppose that students in the open class, who could interact with other students, would have reported a higher score on learning; however, that was not the case. 

Further study in this area is needed to examine why students in the closed sections scored higher on these two survey questions than students in the open sections. 


References

Allen, E., & Seaman, J. (2010). Learning on demand: Online education in the United States, 2009. Retrieved June 13, 2012, from the Sloan Consortium: http://sloanconsortium.org/publications/survey/pdf/learningondemand.pdf

Anstine, J., & Skidmore, M. (2005, Spring). A small sample study of traditional and online courses with sample selection adjustment. Journal of Economic Education, 36(2), 107-127.

Astin, A. (1999). Student involvement: A developmental theory for higher education. Journal of College Student Development, 40(5), 518-529.

Bellefeuille, G. (2006). Rethinking reflective practice education in social work education: A blended constructivist and objectivist instructional design strategy for a web-based child welfare practice course. Journal of Social Work Education, 42(1), 85-103.

Benson, V., Filipaios, F., & Morgan, S. (2010). Online social networks: Changing the face of business education and career planning. International Journal of e-Business Management, 4(1), 20-33.

Biggs, J. (1999). Teaching for Quality Learning at University. Buckingham, England: The Society for Research into Higher Education and Open University Press.

Bressler, L., Bressler, M., & Bressler, M. (2011, September). Demographic and psychographic variables and the effect on online student success. Journal of Technology Research, 2, 1-16.

Brooks, G., & Brooks, J. (1993). In Search of Understanding: The Case for Constructivist Classrooms. Alexandria, VA: Association for Supervision and Curriculum Development.

Budden, C., Anthony, J., Budden, M., & Jones, M. (2007). Managing the evolution of a revolution: Marketing implications of Internet media usage among college students. College Teaching Methods and Styles Journal, 3(1), 5-10.

Buzzetto-More, N. (2008). Student perceptions of various e-learning components. Interdisciplinary Journal of Knowledge and Learning Objects, 4, 113-135.

Carlson, A. (2001, June). “Did I learn anything?” The use of self-assessment to evaluate authentic learning competencies of WWU freshman interest group seminar students. Thesis. Washington, USA.

Chawdhry, A., Paullet, K., & Benjamin, D. (2011, September). Assessing Blackboard: Improving online instructional delivery. Information Systems Education Journal, 9(4), 20-26.

Cheng, J., & Miller, L. (2009). A correlation study: The effect of online quality on student learning outcomes and attrition. Association of Business Information Systems, (pp. 21-31).

Conrad, D. (2002). Engagement, excitement, anxiety, and fear: Learners' experiences of starting an online course. The American Journal of Distance Education, 16(4), 205-226.

Coppola, N., Hiltz, S., & Rotter, N. (2002, Spring). Becoming a virtual professor: Pedagogical roles and asynchronous learning networks. Journal of Management Information Systems, 18(4), 169-189.

De Simone, C. (2006). Preparing our teachers for distance education. College Teaching, 54(1), 183-184.

Deal, W. (2002). Distance learning: Teaching technology online resources in technology. Technology Teacher, 61(8), 21-26.

Dezhi, W., Bieber, M., & Hiltz, S. (2009). Engaging students with constructivist participatory examinations in asynchronous learning networks. Journal of Information Systems Education, 19(3), 321-330.

Dorado, C., Hernandez, J., Sani, B., Griffin, C., & Barnette, W. (2009). An exploratory analysis of factors influencing student decisions to take online courses. Issues in Information Systems, 10(1).

Hannay, M., & Newvine, T. (2006). Distance learning: A comparison of online and traditional learning. MERLOT Journal of Online Learning and Teaching, 2(1), 1-11.

Hayes, T., Ruschman, D., & Walker, M. (2009). Social networking as an admission tool: A case study in success. Journal of Marketing for Higher Education, 19, 109-124.

Knowles, M. (1980). The modern practice of adult education: From pedagogy to andragogy. New York: Cambridge Books.

Koch, J. (2006). Public investment in university distance learning programs: Some performance-based evidence. Atlantic Economic Journal, 34, 23-32.

Kuh, G. (1995). Student Learning Outside the Classroom: Transcending Artificial Boundaries. San Francisco: Jossey-Bass, Inc.

Lessen, E., & Sorensen, C. (2006). Integrating technology in schools, colleges, and departments of education: A primer for deans. Change, 38(2), 45-49.

Li, L., Finley, J., Pitts, J., & Guo, R. (2011, September). Which is a better choice for student-faculty interaction: Synchronous or asynchronous communication? Journal of Technology Research, 2, 1-12.

Neuhauser, C. (2002). Learning style and effectiveness of online and face-to-face instruction. The American Journal of Distance Education, 16(2), 99-113.

Ogunlade, J. (2011). Collaborative learning. Association of Business Information Systems, (pp. 17-19).

Sava, F. (2005). Critical issues in distance education: A report from the United States. Distance Education, 26(2), 255-272.

Sharan, S. (1985). Cooperative learning and the multiethnic classroom. In R. Slavin, Learning to Cooperate, Cooperating to Learn (p. 255). New York: Plenum Press.

Singleton, E., Hill, J., & Koh, M. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education, 7(1), 59-70.

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology, 59(3), 623-664.

Srinivas, H. (2010, September 17). Learning. Retrieved from The Global Development Research Center: http://www.gdrc.org/kmgmt/c-learn/index.html

Welsh, E., Wanberg, C., Brown, K., & Simmering, M. (2003). E-learning: Emerging uses, empirical results and future directions. International Journal of Training and Development, 7(4), 245-258.

Zhang, D. (2004). Virtual Mentor and the Lab system: Toward building an interactive, personalized, and intelligent e-learning environment. Journal of Computer Information Systems, 44(3), 35-43.

Table 1: Descriptive Statistics

Measure            Delta Grade     Q1      Q2      Q3      Q4      Q5      Q6      Q7
N (Valid)               59          46      46      46      46      46      46      46
N (Missing)              0          13      13      13      13      13      13      13
Mean                   .310        4.70    4.70    4.26    4.52    4.50    4.30    4.72
Median                 .335        5.00    5.00    4.00    5.00    5.00    5.00    5.00
Std. Deviation         .192        .591    .662    .929    .752    .782    .940    .584
Minimum                -.48         3       2       1       2       1       1       3
Maximum                 .69         5       5       5       5       5       5       5



Table 2: Tests of Normality: One-Sample Kolmogorov-Smirnov Tests

Measure          Test Statistic        p
Delta Grade           .972           .301
Survey: Q1           3.103          <.001
Survey: Q2           3.118          <.001
Survey: Q3           1.798           .003
Survey: Q4           2.495          <.001
Survey: Q5           2.357          <.001
Survey: Q6           2.129          <.001
Survey: Q7           3.178          <.001



Table 3: Tests of Normality: Skewness and Kurtosis

Measure                  Delta Grade     Q1       Q2       Q3       Q4       Q5       Q6       Q7
N (Valid)                     59         46       46       46       46       46       46       46
N (Missing)                    0         13       13       13       13       13       13       13
Skewness                  -1.939     -1.834   -2.453   -1.599   -1.877   -2.334   -1.500   -1.983
Std. Error of Skewness      .311       .350     .350     .350     .350     .350     .350     .350
Skewness/SE               -6.235     -5.240   -7.009   -4.569   -5.363   -6.669   -4.286   -5.666
Kurtosis                   6.501      2.389    6.197    2.900    3.823    7.864    2.349    2.959
Std. Error of Kurtosis      .613       .688     .688     .688     .688     .688     .688     .688
Kurtosis/SE               10.605      3.472    9.007    4.215    5.557   11.430    3.414    4.301



Table 4: Independent-Samples t-Tests: Descriptive Statistics

Measure          Class      N      Mean    Std. Dev.   S.E. Mean
Delta Grade      Open      28       .34       .244        .046
                 Closed    31       .29       .130        .023
Survey: Q1       Open      18      4.56       .705        .166
                 Closed    28      4.79       .499        .094
Survey: Q2       Open      18      4.50       .924        .218
                 Closed    28      4.82       .390        .074
Survey: Q3       Open      18      4.17      1.150        .271
                 Closed    28      4.32       .772        .146
Survey: Q4       Open      18      4.39      1.037        .244
                 Closed    28      4.61       .497        .094
Survey: Q5       Open      18      4.39      1.037        .244
                 Closed    28      4.57       .573        .108
Survey: Q6       Open      18      3.94      1.211        .286
                 Closed    28      4.54       .637        .120
Survey: Q7       Open      18      4.50       .786        .185
                 Closed    28      4.86       .356        .067



Appendix

Histograms

[Histograms illustrating the distribution of each measure.]


Online Journal of Distance Learning Administration, Volume XV, Number V, Winter 2012
University of West Georgia, Distance Education Center