The Role of Technical Support and Pedagogical Guidance provided to Faculty in Online Programs: Considerations for Higher Education Administrators
Randy Wiesenmayer, Ph.D
West Virginia University
Lori Kupczynski, Ed.D
University of Texas-Pan American
Phil Ice, Ed.D
American Public University System
With growth in online course enrollments outpacing growth in traditional course enrollments by 500%, institutions of higher education are experiencing significant changes in long-term strategic planning. Along with providing the infrastructure required for online course offerings, administrators must confront the issue of faculty preparedness and training. While instructors may be highly skilled in their content areas and practiced in traditional teaching, they are often unprepared for teaching in the online environment. Though several case studies detail support mechanisms for faculty involved in online course initiatives, the associated program outcomes have been assessed primarily in terms of faculty satisfaction and buy-in. This study examines the relationship between technical support and pedagogical guidance, two factors deemed critical to successful programs, and student satisfaction and perceived learning. The outcomes are important because the analysis demonstrates that measures producing high degrees of faculty satisfaction do not impact students in a similar manner. From an applied perspective, this raises questions about how current training and support mechanisms can be enhanced to serve both populations.
Approximately 3.2 million students are currently taking online courses in the United States, with double-digit annual growth rates expected to continue for the foreseeable future (Allen & Seaman, 2006). As a result, many traditional brick-and-mortar institutions have developed significant online offerings, while other colleges and universities have been founded to offer exclusively online programs (Glahn & Gen, 2002). Internet-based learning has grown tremendously in only the last 15 years, to the point that it is no longer expensive; courses can provide excellent tools, such as message boards and libraries, at a cost similar to a user's regular monthly Internet bill (Morabito, 1999). Among the nation's largest research institutions, 99% offer some online courses, with over 55% offering complete programs online. With respect to strategic planning, 58% of all institutions consider online learning a key to growth (Allen & Seaman, 2006).
Against this backdrop, administrators are struggling with programmatic changes and support mechanisms to meet the demands of online learners in order to keep their enrollment numbers from stagnating or falling prey to market forces (Pittinsky, 2002). However, institutional goals mean very little to individual instructors who are being asked to apply familiar course content and pedagogical strategies in an unfamiliar environment. For faculty on the frontlines of the paradigmatic shift toward anyplace, anytime learning, administrative effectiveness is assessed in terms of the adequacy of support mechanisms rather than the perceived value of long-term goal setting (Brown, 2003).
Several anecdotal accounts (Epper & Bates, 2001; Brown, 2003; Shepherd, Alpert & Koehler, 2007) indicate that administrative failure to provide faculty with satisfactory support mechanisms is likely to produce widespread dissatisfaction, resistance to new online initiatives and a generalized sense of apathy toward all forms of technology-mediated learning. Understandably, faculty are reluctant to publish comprehensive evaluations of such failures at their own institutions; however, one prominent study by Ferrazzi (2003) reinforces these observations through an account of how limited resources and support caused a deterioration of faculty morale and online program quality at Brandon University. These responses are clearly undesirable and may eventually lead to problems with retention and recruiting. Shepherd, Alpert and Koehler (2007) clarify the importance of faculty mentoring programs for retention as well as for the achievement of academic and institutional goals. Beyond issues of faculty retention and recruitment, inadequate training and support programs prevent institutions from fully capitalizing on projected increases in online enrollments, thereby limiting growth to the student demographic traditionally served.
As described in the literature review, administrative support for programs that address faculty needs in a comprehensive and ongoing fashion is a prerequisite for successful online program development and growth. To achieve this, administrators need to become informed about the possibilities of collaboration, teaching and professional development via the Internet, understanding that this venue differs significantly from the face-to-face environment with which the vast majority of faculty are familiar. They must also be concerned about the impact of online learning on the retention of students and staff, as well as the tools necessary for delivering a successful and meaningful learning experience (Dietz, 2008). The literature provides exemplary initiatives in which institutionally sponsored technical support and pedagogical guidance structures have produced faculty buy-in, high levels of satisfaction, and dramatic program growth (Fredericksen, Pickett, Shea, Pelz & Swan, 2000; Hartman, Dziuban & Moskal, 2000; Kaminski & Milheim, 2002; Moore, 2001; Schwartz & Phillips, 2003).
Despite the apparent success of model support programs, measured in terms of faculty satisfaction and institutional growth, little research has examined how such programs impact student satisfaction. In response, this study was designed to explore the relationship between elements of a faculty support program and student satisfaction outcomes.
A review of the literature explores programs that provide for effective role transition for faculty and offers a theoretical explanation of how these factors relate to student satisfaction in online courses. The program on which this study was based is then contextualized to provide a clearer understanding of the data collection and analysis. Findings are presented in a format designed to inform further research and to initiate discussion of best practice in program development from an administrative perspective.
Review of the Literature
Supporting Faculty Role Adjustment
Within complex social organizations, there are multiple roles that define the place and importance of the individual. These may consist of acts that fall within the cognitive, affective or social domains, with the boundaries often overlapping. In some instances, the roles are well defined through rule setting or traditional practice. In other cases, the roles can be amorphous, continuously evolving or defined largely by peer and administrative expectations. When any of these latter conditions manifest, institutional transition slows until roles are formalized (Etzioni, 1964).
Traditionally, higher education faculty roles have been defined by achievement in teaching, research and service (McKeachie, 1986). However, when faculty are asked to develop and deliver online courses, conflicts with the traditional rubrics by which performance is assessed often arise (Coppola, 2005). With respect to the act of teaching, Berliner (1988) notes that when interacting with technology many teachers revert to novice status. For the instructor who, through years of practice, has developed a teaching style that allows him or her to teach in a seamless, fluid manner, this reversion can often lead to the belief that online learning, in general, is inferior to the traditional mode in which they are well versed (Bennett & Lockyer, 2004).
Development and delivery of online courses differ from face-to-face teaching in that they require access to administrative units, technical support staff, instructional consultants and specialized software/hardware, all of which require well-managed collaboration and scheduling. This team-based approach differs dramatically from typical higher education teaching scenarios and individual teaching roles but is considered essential to successful online course development (Coppola, Hiltz & Rotter, 2002). For faculty who have based their careers on traditional course development and teaching practices, there are often serious misgivings about and resistance to this degree of collaboration, creating a significant obstacle to institutional change (Botsch & Botsch, 2000; Kaminski & Milheim, 2002).
Administrators who have developed successful, large-scale online learning programs have done so by systematically addressing the needs of faculty as they transition from traditional to online teaching (Zotti, 2005). Though the mechanisms differ from institution to institution, practices that adequately support instructors in course preparation and delivery, while promoting participation in institutional decisions related to online programs, have been found to result in high levels of faculty satisfaction despite changes to their role expectations within the organization (Brown, 2003).
To illustrate what an exemplary faculty support program might look like, a review of the University of Central Florida (UCF) experience is informative. In their evaluation of the online course development initiative at UCF, Hartman, Dziuban and Moskal (2000) reported that faculty satisfaction and overall program success were rooted in the institution's broad-based support program. Beginning with the Interactive Distributed Learning for Technology-Mediated Course Delivery 6543 (IDL6543) course, faculty are systematically exposed to increasingly complex strategies and applications for online course development and delivery. At the lowest level, faculty experience online learning from a student perspective and share their perceptions of pedagogical techniques with their peers. This acts as the basis for individualized course design strategies, which are developed collaboratively with design team members who provide specialized instructional design and technical support.
As UCF faculty become more comfortable with the design and delivery process, individualized training and support programs are available to assist with the development of technical and/or pedagogical tools and strategies (Hartman, Dziuban & Moskal, 2000). Regardless of the level of expertise a faculty member may wish to achieve, support mechanisms are available to help counter feelings of isolation or frustration. In addition, peer support and research among faculty members provide for the growth of both individuals and strategic initiatives.
When surveyed, 83.4% of UCF faculty described their experience with online teaching as satisfying. Further, 93.5% believed that the quality of interaction with students was higher than in traditional classes, and 93.6% indicated they would like to teach another online course (Hartman, Dziuban & Moskal, 2000).
Though specific support mechanisms differ, evaluations and narrative reports from the State University of New York Learning Network (SLN), Virginia Tech and Colorado State University describe thematically similar programs that emphasize high levels of technical support and pedagogical guidance, which at each of these institutions have had a positive impact on faculty satisfaction (Fredericksen et al., 2000; Kaminski & Milheim, 2002; Moore, 2001; Schwartz & Phillips, 2003). In stark contrast, Ferrazzi (2003) details a failed program at Brandon University in which administrators provided minimal ongoing support, relying instead on a strategy that sought to fully empower faculty with respect to course design roles. Faculty quickly became dissatisfied and the initiative stagnated, with only a few early adopters continuing to teach online courses, reinforcing the importance of appropriate institutional support structures for program continuation.
Assessing Student Satisfaction
Assessing student satisfaction with the online learning experience has been approached through several theoretical models (Alavi & Leidner, 2001; Benbunan-Fich, Hiltz & Harasim, 2005); however, the one most frequently cited is Garrison and colleagues' (2000) Community of Inquiry (CoI) framework (Arbaugh, 2007). Consisting of three overlapping presences (teaching, social and cognitive) that coalesce in asynchronous learning communities, the CoI provides a validated model for how various catalysts interact to facilitate the co-construction of knowledge (Arbaugh et al., 2007; Swan et al., 2008). For purposes of this study, teaching presence is considered the most important; however, brief synopses of social and cognitive presence are provided for contextualization.
Social presence, in the context of online learning, is described as the ability to project one's self through media and establish personal and meaningful relationships. The three main factors that allow for the effective projection and establishment of social presence are affective expression, open communication and group cohesion (Richardson & Swan, 2003; Swan & Shih, 2005).
Grounded in the work of Dewey (1933), cognitive presence is defined as the exploration, construction, resolution and confirmation of understanding through collaboration and reflection (Garrison, 2007). Garrison and Archer (2003) describe this process as consisting of four phases, beginning with creating a sense of puzzlement or posing a problem that piques learners' curiosity. As a community, course participants exchange information and integrate understandings to answer the initial problem, culminating in the resolution phase, where learners are able to apply the knowledge to both course and non-course related issues.
Teaching presence, the third component of the CoI, was initially described by Anderson and colleagues (2001) as a three-part structure consisting of: instructional design and organization, facilitation of discourse, and direct instruction. With respect to instructional design and organization, the authors include the following indicators:
- setting curriculum
- designing methods
- establishing time parameters
- utilizing the medium effectively
- establishing netiquette
The second component, facilitation of discourse, is necessary to maintain focus and engagement in course discussions. It also allows the instructor to set the appropriate climate for academic exchanges (Anderson et al., 2001). The authors include the following as indicators of facilitation of discourse:
- identifying areas of agreement and disagreement
- seeking to reach consensus and understanding
- encouraging, acknowledging, and reinforcing student contributions
- setting the climate for learning
- drawing in participants and prompting discussion
- assessing the efficacy of the process
Anderson et al. (2001) also include indicators of direct instruction in their framework for the analysis of teaching presence. These indicators include:
- presenting content and questions
- focusing the discussion on specific issues
- summarizing discussion
- confirming understanding
- diagnosing misperceptions
- injecting knowledge from diverse sources
- responding to technical concerns
Over the last five years, several attempts have been made to design and refine a universal CoI instrument that accurately assesses student satisfaction with indicators of each of the three presences (Arbaugh et al., 2007; Garrison, 2007). Though the need for continued refinement persists, the overall structure of the three presences has been validated in several studies (Arbaugh & Hwang, 2006; Richardson & Swan, 2003; Shea, Li, Swan & Pickett, 2005; Swan & Shih, 2005), and the CoI as a whole has been confirmed through factor analysis of a common survey instrument (Appendix A) by Garrison, Cleveland-Innes and colleagues (2004); Arbaugh (2007); Arbaugh et al. (2007); and Swan et al. (2008). This instrument was used as an end-of-course survey in this study, with the instructional design and organization, facilitation of discourse and direct instruction subscales of teaching presence used as criterion variables.
Connecting Institutional Support Structures to Student Satisfaction
As previously described, several institutions have been successful in developing programs that promote high degrees of faculty satisfaction and the growth of online initiatives. At the core of these programs are mechanisms that focus on training and supporting faculty through providing ongoing technical support and pedagogical guidance. In theory, technical support should readily manifest in student satisfaction with instructors’ ability to efficiently design and organize online courses. Likewise, pedagogical training should manifest in student satisfaction with instructors’ ability to effectively facilitate discourse and provide direct instruction.
A focus on training and technical support cannot be sustained without additional backing from administrative leadership. Moore and Kearsley (1996) note that distance education requires special techniques of course design and instruction, unique methods of communication through technology, and particular organizational and administrative arrangements. Understanding all of these essential elements leads to the need for this study, which assesses the relationship between elements of training programs and student satisfaction, with an emphasis on the role of administration.
Context and Setting
As part of West Virginia University’s (WVU) mission to expand learning access to an increasingly diverse student body, the College of Human Resources and Education’s (HR&E) administration has encouraged faculty to offer a number of courses either fully online or in hybrid online/face-to-face formats. In this process, HR&E administration has struggled with finding the best means by which to support faculty. In a process similar to what is occurring at many other institutions, the support initiative is constantly evolving in an effort to provide faculty with the needed resources while optimizing student satisfaction and learning.
Though not formally recognized as such, elements of the HR&E support initiative bear a close resemblance to those utilized by other institutions in that they attempt to promote learning effectiveness, cost effectiveness, access, student satisfaction and faculty satisfaction; i.e., the five pillars of quality effectiveness. Since 2002, faculty have been offered access to an intensive 40-hour program, known as the Faculty Academy, that emphasizes analysis of the pedagogical basis for exemplar courses. Faculty Academy workshops provide hands-on experiences intended to develop skills related to webpage design and use of the course management system, currently Blackboard Vista. Follow-up support after the Faculty Academy is less formalized than the core program and varies from department to department within the college.
In some cases, more experienced faculty mentor peers who are developing online courses on an as-needed basis. Graduate Assistants (GAs) in HR&E's computer lab are another resource for faculty who need help with the technical aspects of webpage design and importing resources into the content management system.
Surveys conducted at the end of the 2006 Faculty Academy indicated that participants were generally satisfied or very satisfied (mean = 4.3 on a 5 point Likert type scale) with the program. Internal pilot studies confirmed this finding and revealed that faculty place a very high value on technical support and pedagogical guidance. Despite the high levels of faculty satisfaction generated by this initiative, it remained unclear as to how effectively the training and guidance translated into online teaching efficacy or student satisfaction.
The online instructors consisted of 14 HR&E full-time faculty, adjunct faculty and graduate teaching assistants. Online teaching experience varied from early adopters who had taught multiple online courses to first-time online instructors.
A total of 519 students were registered in the 30 classes taught by these instructors. All were pursuing either master's or doctoral degrees, and the majority of courses (n = 27) contained students at both levels.
This study used multiple regression analysis to answer the following research questions:
- What is the relationship between technical support and pedagogical guidance provided to faculty (the predictor variables) and student satisfaction with instructional design and organization (the criterion variable) in online courses?
- What is the relationship between technical support and pedagogical guidance provided to faculty (the predictor variables) and student satisfaction with facilitation of discourse (the criterion variable) in online courses?
- What is the relationship between technical support and pedagogical guidance provided to faculty (the predictor variables) and student satisfaction with direct instruction (the criterion variable) in online courses?
The Community of Inquiry Framework survey instrument designed by Arbaugh and colleagues (2007) (Appendix A) was used to collect data from students (n = 519) in 30 online courses offered by the College of HR&E, taught by 14 instructors over three semesters. Scores on the Instructional Design and Organization, Facilitation of Discourse and Direct Instruction subscales served as the criterion variables in this study.
Faculty were asked whether they had attended the Faculty Academy and if they had received any additional technical support or pedagogical guidance for the development and delivery of online courses. If any additional support or guidance was noted, they were asked about the source and number of hours. If faculty were unsure of the number of hours, the provider was contacted for clarification.
For those attending the Faculty Academy, 25 hours of technical support and 15 hours of pedagogical guidance were recorded. Any additional hours of technical support and pedagogical guidance received after the Faculty Academy were added to this group's totals. For those who did not participate in the Faculty Academy, stand-alone hours of technical support and pedagogical guidance were recorded. These figures served as the predictor variables in this study.
A correlation research design was used to assess the impact of the two predictor variables on the three criterion variables, using a relationship approach (Mertens, 2005). As a precautionary measure, a test of covariance between number of courses previously taught by instructors and the measures of student satisfaction was also conducted. Table 1, below, quantifies the predictor variables by instructor as well as the number of courses, by instructor, used in this study.
Table 1
Predictor Variables by Instructor: Number of Online Courses Previously Taught, Cumulative Hours of Technical Support, Cumulative Hours of Pedagogical Guidance, and Number of Courses Used in Study
As previously noted in the review of literature, the CoI survey instrument has been validated through factor analysis (Arbaugh et al., 2007; Swan et al., 2008), with a very high degree of reliability associated with each of its subscales. However, as a precautionary measure, reliability analysis was applied to examine the internal consistency of the instructional design and organization, facilitation of discourse, and direct instruction subscales. The reliability coefficient of the instructional design and organization subscale was .92; it was .94 for both the facilitation of discourse and direct instruction subscales. These coefficients were not significantly different from those reported in the instrument validation studies cited above.
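Internal-consistency coefficients of this kind are conventionally computed as Cronbach's alpha. The study does not report what software was used, so the following is only a minimal sketch, with hypothetical Likert responses standing in for the actual survey data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: four students on a three-item subscale
responses = np.array([[4, 5, 4],
                      [3, 3, 4],
                      [5, 5, 5],
                      [2, 3, 2]])
print(round(cronbach_alpha(responses), 2))  # prints 0.94
```

Values near .9 or above, such as the .92 and .94 reported here, are generally read as high internal consistency for a subscale.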
Multiple Regression Analysis – Instructional Design and Organization
A multiple regression analysis was applied to examine the relationship between the criterion variable (student satisfaction with instructional design and organization) and the predictor variables (hours of technical support, hours of pedagogical guidance and number of online courses previously taught). No violations of the assumptions of normality, linearity, or homoscedasticity of residuals were found. Twenty outliers, identified as cases beyond 3 standard deviations, were removed, leaving 499 cases for the analysis.
Presented in Table 2 are the unstandardized betas (B), standard errors (SE B), standardized betas (Beta) and significance levels of the predictor variables. The regression model was significant, F(3, 498) = 4.990, p < .05. However, the multiple correlation coefficient was .171, indicating that only 2.9% of the total variance in student satisfaction with instructional design and organization could be accounted for by the predictor variables. Further, the number of courses previously taught was not a significant predictor within the regression equation.
Table 2
Multiple Regression on Instructional Design and Organization Using Two Predictors and One Co-Predictor (Courses Previously Taught)
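In principle, an analysis of this form can be reproduced with ordinary least squares. The sketch below is a hypothetical illustration rather than the study's actual analysis code; it shows one reading of the 3-standard-deviation outlier rule and why a multiple R of .171 corresponds to only 2.9% of variance explained (R² = .171² ≈ .029):

```python
import numpy as np

def trim_outliers(X: np.ndarray, y: np.ndarray):
    """Drop cases whose criterion score lies beyond 3 standard deviations
    of the mean (one reading of the study's outlier rule; the paper does
    not specify whether raw scores or residuals were screened)."""
    z = (y - y.mean()) / y.std(ddof=1)
    keep = np.abs(z) <= 3
    return X[keep], y[keep]

def fit_ols(X: np.ndarray, y: np.ndarray):
    """OLS with an intercept; returns coefficients and multiple R-squared."""
    Xi = np.column_stack([np.ones(len(X)), X])      # prepend intercept column
    beta, *_ = np.linalg.lstsq(Xi, y, rcond=None)
    resid = y - Xi @ beta
    r2 = 1 - resid.var() / y.var()                  # proportion of variance explained
    return beta, r2

# A multiple R of .171 implies R^2 = .171**2, i.e. about 2.9% of variance
print(round(0.171 ** 2, 3))  # prints 0.029
```

The squaring step is why a statistically significant model can still be practically negligible: significance reflects the F test, while R² measures explained variance.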
Multiple Regression Analysis – Facilitation of Discourse
A multiple regression analysis was applied to examine the relationship between the criterion variable (student satisfaction with facilitation of discourse) and the predictor variables (hours of technical support, hours of pedagogical guidance and number of online courses previously taught). No violations of the assumptions of normality, linearity, or homoscedasticity of residuals were found. Twenty outliers, identified as cases beyond 3 standard deviations, were removed, leaving 499 cases for the analysis.
Presented in Table 3 are the unstandardized betas (B), standard errors (SE B), standardized betas (Beta) and significance levels of the predictor variables. The regression model was significant, F(3, 498) = 3.075, p < .05. However, the multiple correlation coefficient was .135, indicating that only 1.8% of the total variance in student satisfaction with facilitation of discourse could be accounted for by the predictor variables. Further, the number of courses previously taught was not a significant predictor within the regression equation.
Table 3
Multiple Regression on Facilitation of Discourse Using Two Predictors and One Co-Predictor (Courses Previously Taught)
Multiple Regression Analysis – Direct Instruction
A multiple regression analysis was applied to examine the relationship between the criterion variable (student satisfaction with direct instruction) and the predictor variables (hours of technical support, hours of pedagogical guidance and number of online courses previously taught). No violations of the assumptions of normality, linearity, or homoscedasticity of residuals were found. Twenty outliers, identified as cases beyond 3 standard deviations, were removed, leaving 499 cases for the analysis.
Presented in Table 4 are the unstandardized betas (B), standard errors (SE B), standardized betas (Beta) and significance levels of the predictor variables. The regression model was significant, F(3, 498) = 3.285, p < .05. However, the multiple correlation coefficient was .135, indicating that only 1.8% of the total variance in student satisfaction with direct instruction could be accounted for by the predictor variables. Further, the number of courses previously taught was not a significant predictor within the regression equation.
Table 4
Multiple Regression on Direct Instruction Using Two Predictors and One Co-Predictor (Courses Previously Taught)
While previous research has revealed that technical support and pedagogical guidance are support structures highly valued by faculty teaching online courses, this study indicates that they are not necessarily factors that correlate with student satisfaction. Even though the regression analyses conducted in this study were significant at the α = .05 level, the predictor variables accounted for an extremely low percentage (less than 3% in all cases) of the variability in student satisfaction. Thus, it appears that while technical support and pedagogical guidance are historically responsible for faculty satisfaction, other factors are likely more important in promoting student satisfaction with indicators of facilitation of discourse and direct instruction in online environments. Equally important, the co-predictor, number of online courses previously taught, was not found to be a significant predictor of student satisfaction with these indicators.
Currently, many faculty development programs are based largely on hours of faculty support. This study demonstrates that this approach should not be expected to produce high levels of student satisfaction. This conclusion aligns with suggestions by Dziuban, Shea and Arbaugh (2005) that as yet largely unexplored personal factors may be responsible for success in teaching online. This proposition is potentially reinforced by the finding that some instructors with relatively few hours of technical support and pedagogical guidance generated student satisfaction scores comparable to or better than those of instructors who received substantially greater levels of support in each of these areas.
An example of how personal factors may influence student satisfaction can be found in work by Phillips and colleagues (2007) which revealed a tendency among faculty with highly objectivist teaching orientations to confuse technological tools with applications of pedagogy in the online environment. In practice, this tendency led to extremely rigid activities that did not allow for adequate constructivist interaction. Other possible personal traits include the ability of an instructor to project caring in the online environment to reinforce the sense of community (Swan-Dagen & Ice, 2008). Another possible factor that should be considered is the ability of instructors to view themselves as learners and the belief that they can benefit from academic propositions put forth by students (Ley, 2006).
From an administrative perspective, the implications are far reaching. Those who choose to lead may find success through a clear understanding of all elements involved in the learning process, from course preparation to faculty involvement to instructor presence in the online environment. With an understanding of the unique techniques required for design, instruction and electronic communication, leaders will be able to clarify the special organizational and administrative arrangements required (Moore & Kearsley, 1996).
As discussed above, contemporary practices in training faculty to teach online are often based on an hours-of-contact approach. While such practices have been shown to produce high levels of faculty satisfaction at various institutions, including the site of this study, there appears to be no meaningful relationship between the hours of technical support and pedagogical guidance provided to faculty and student satisfaction in online courses. Therefore, those responsible for faculty training may need to rethink the effectiveness of an hours-based approach when designing training and support programs.
However, determining the structure of training and support programs that produce high levels of both instructor and student satisfaction, in the most efficient manner, may be problematic. In a meta-study of best practices in online training, Wolf (2004) concluded that there is no consensus among leading practitioners as to what constitutes best practice in online faculty development. Further, her study concluded that experts in the field consider the online teaching and learning environment so different from traditional teaching that few, if any, face-to-face teacher training strategies would be of significant use in online faculty development programs.
While this study does not resolve the dilemma noted by Wolf or suggest proactive measures for online faculty training, it does highlight the general ineffectiveness of assuming that more contact hours of technical support and pedagogical guidance will necessarily result in increased student satisfaction and perceived learning. For administrators, this finding translates into the need for fine-grained program evaluation that assesses training and support mechanisms in terms of both faculty satisfaction and student-oriented outcomes.
This study indicates that, to promote student satisfaction in online courses, it may be imperative for administrators to shift their focus away from programs based on contact hours and explore how desirable personal and pedagogical characteristics can be cultivated in faculty. This should be a top priority for institutions willing to recognize the general ineffectiveness of contact-hours-based training and support programs.
Recommendations for Further Research
As noted in the implications section, the literature suggests that as yet unexplored personal factors may play an integral role in developing effective online praxis. An effective way to pursue this line of research would be to initiate highly qualitative case study analyses of the factors that promote student satisfaction across a broad spectrum of courses. In this process, students would be engaged in open-ended dialogue about their experiences in online courses to unearth the elements most responsible for producing high degrees of satisfaction. From these dialogues, thematic similarities could be identified and tied to specific acts or practices. Assigning dummy variables to such themes and using hierarchical linear modeling to correlate these data with CoI indicators should produce useful results. Other mixed methods studies exploring relationships between qualitative themes and CoI data could utilize structural equation modeling and chi-square automatic interaction detection.
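The dummy-coding and hierarchical modeling step described above might be sketched as follows. This is a minimal illustration, not the study's own analysis: the theme labels, variable names, and data are hypothetical, and statsmodels' MixedLM (a random-intercept mixed model) stands in for a full HLM package.

```python
# Illustrative sketch: coding qualitative themes as dummy variables and
# fitting a hierarchical (mixed) linear model against a CoI indicator.
# All data below are fabricated for demonstration purposes only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 120
df = pd.DataFrame({
    "course_id": rng.integers(0, 12, n),  # grouping level: students nested in courses
    # Hypothetical themes unearthed from open-ended student dialogue
    "theme": rng.choice(["timely_feedback", "instructor_caring", "none"], n),
    # Hypothetical CoI teaching-presence score on the 1-5 Likert scale
    "coi_teaching_presence": rng.normal(4.0, 0.5, n).clip(1, 5),
})

# One dummy column per qualitative theme (drop_first avoids collinearity)
dummies = pd.get_dummies(df["theme"], prefix="theme", drop_first=True, dtype=float)
data = pd.concat([df, dummies], axis=1)

# Random intercept per course captures the hierarchical structure
model = smf.mixedlm(
    "coi_teaching_presence ~ " + " + ".join(dummies.columns),
    data=data,
    groups=data["course_id"],
)
result = model.fit()
print(result.params)  # fixed-effect coefficients plus the group variance
```

In a real study the dummy columns would come from coded interview themes rather than random draws, and model comparison (e.g., likelihood ratio tests across nested models) would determine which themes meaningfully predict the CoI indicators.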
This study utilized data from a single college of education. Confirmatory analysis involving instructors and students from other disciplines should be undertaken and would benefit all areas of education with regard to the online environment.
References
Alavi, M., & Leidner, D. (2001). Research commentary: Technology mediated learning--A call for greater depth and breadth of research. Information Systems Research, 12(1), 1 - 10.
Allen, I. E., & Seaman, J. (2006). Making the grade: Online education in the United States. Needham, MA: Sloan Consortium.
Anderson, T., Rourke, L., Garrison, D., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning Networks 5 (2), 1–17.
Arbaugh, J. B. (2007). An empirical verification of the community of inquiry framework. Journal of Asynchronous Learning Networks, 11(1), 73-85.
Arbaugh, J. B., Cleveland-Innes, M., Diaz, S., Garrison, D. R., Ice, P., Richardson, J. C., Shea, P., & Swan, K. (2007, November). Community of Inquiry framework: Validation and instrument development. Paper presented at the 13th Annual Sloan-C International Conference on Online Learning, Orlando, FL.
Arbaugh, J. B., & Hwang, A. (2006). Does "teaching presence" exist in online MBA courses? The Internet and Higher Education, 9, 9-21.
Benbunan-Fich, R., Hiltz, R., & Harasim, L. (2005). The online interaction learning model: An integrated theoretical framework for learning networks. In S. Hiltz & R. Goldman (Eds.), Learning together online: Research on asynchronous learning networks (pp. 19 – 37). Mahwah, NJ: Lawrence Erlbaum.
Bennett, S., & Lockyer, L. (2004). Becoming an online teacher: Adapting to a changed environment for teaching and learning in higher education. Educational Media International, 41(3), 231-244.
Berliner, D. (1988, February). The development of expertise in pedagogy. Charles W. Hunt Memorial Lecture presented at the Annual Meeting of the American Association of Colleges for Teacher Education, New Orleans, LA.
Botsch, C., & Botsch, R. (2000, July/August). Gaining Faculty Acceptance for Online Courses at a Traditional College. The Technology Source – Michigan Virtual University, article 1. Retrieved March 30, 2005, from http://ts.mivu.org/default.asp?show=article&id=788
Brown, D. (Ed.). (2003). Developing faculty to use technology: Programs and strategies to enhance teaching. Bolton, MA: Anker.
Coppola, N. (2005). Changing roles for online teachers of technical communication. In K. Cook & K. Grant-Davie (Eds.), Online education: Global Questions, local answers (pp. 89-100). Amityville, NY: Baywood.
Coppola, N. W., Hiltz, S. R., & Rotter, N. G. (2002). Becoming a virtual professor: Pedagogical roles and asynchronous learning networks. Journal of Management Information Systems, 18(4), 169-189.
Dewey, J. (1933). How we think (Rev. Ed.). Boston: D.C. Heath.
Dietz (2008). The future is now: How online learning is growing as an accepted tool for teaching. University Business, 11(5), 68.
Dziuban, C., Shea, P., & Arbaugh, J. (2005). Faculty roles and satisfaction in asynchronous learning networks. In S. Hiltz & R. Goldman (Eds.), Learning together online: Research on asynchronous learning networks (pp. 19 – 37). Mahwah, NJ: Lawrence Erlbaum.
Etzioni, A. (1964). Modern organizations. Englewood Cliffs, NJ: Prentice-Hall.
Epper, R., & Bates, A. (Eds.). (2001). Teaching faculty how to use technology: Best practices from leading institutions. Westport, CT: Oryx Press.
Ferrazzi, G. (2003, July/August). Ambitious vision, limited resources: A flexible approach to distributed learning. The Technology Source – Michigan Virtual University. Retrieved March 30, 2005, from http://ts.mivu.org/default.asp?show=article&id=946
Fredericksen, E., Pickett, A., Shea, P., Pelz, W., & Swan, K. (2000). Factors influencing faculty satisfaction with asynchronous teaching and learning in the SUNY learning network. Journal of Asynchronous Learning Networks, 4(3), 245-278.
Garrison, D. R. (2007). Online community of inquiry review: Social, cognitive, and teaching presence issues. Journal of Asynchronous Learning Networks 11(1), 61- 72.
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education 2(2-3), 87-105.
Garrison, D. R., & Archer, W. (2003). A community of inquiry framework for online learning. In M. G. Moore & W. G. Anderson (Eds.), Handbook of distance education. Mahwah, NJ: Erlbaum.
Garrison, D. R., Cleveland-Innes, M., & Fung, T. (2004). Student role adjustment in online communities of inquiry: Model and instrument validation. Journal of Asynchronous Learning Networks, 8(2), 61-74.
Glahn, R. & Gen, R. (2002). Progenies in education: The evolution of internet teaching. Community College Journal of Research & Practice. 26(10), 777-785.
Hartman, J., Dziuban, C., & Moskal, P. (2000). Faculty satisfaction in ALNs: A dependent or independent variable? Journal of Asynchronous Learning Networks, 4(3), 155-179.
Kaminski, K., & Milheim, W. (2002, November/December). Institutional challenges in the creation and delivery of an online degree program. The Technology Source – Michigan Virtual University. Retrieved March 30, 2005, from http://ts.mivu.org/default.asp?show=article&id=929
Ley, K. (2006, August). Virtually being there: Establishing social presence. Presentation at the 22nd Annual Conference on Distance Teaching and Learning, Madison, WI.
McKeachie, W. (Ed.) (1986). Teaching and learning in the college classroom: A review of the research literature. Ann Arbor, MI: University of Michigan.
Mertens, D. (2005). Research and evaluation in education and psychology: Integrating diversity with quantitative, qualitative and mixed methods, 2nd edition. Thousand Oaks, CA: Sage.
Moore, A. (2001). Designing advanced learning communities: Virginia Tech’s story. In R. Epper & A. Bates (Eds.), Teaching faculty how to use technology: Best practices from leading institutions (pp. 79-92). Westport, CT: Oryx Press.
Moore, M. G., & Kearsley, G. (1996). Distance education: A systems view. Belmont, CA: Wadsworth.
Morabito, M. G. (1999). Online distance education: Historical perspective and practical application. Universal Publishers/uPUBLISH.com.
Phillips, P., Wells, J., Ice, P., Curtis, R. & Kennedy, R. (2007). A case study of the relationship between socio-epistemological teaching orientations and instructor perceptions of pedagogy in online environments. Electronic Journal for the Integration of Technology in Teacher Education. 6, 3-27. http://ejite.isu.edu/Volume6/phillips.pdf
Pittinsky, M. (2002). The wired tower: Perspectives on the impact of the internet on higher education. Upper Saddle River, NJ: Prentice Hall.
Richardson, J. C., & Swan, K. (2003). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68-88.
Schwartz, E., & Phillips, S. (2003). Supporting faculty in the use of technology. In D. Brown (Ed.), Developing faculty to use technology: Programs and strategies to enhance teaching (pp. 103-107). Bolton, MA: Anker.
Shea, P., Swan, K., Li, C., & Pickett, A. (2005). Developing learning community in online asynchronous college courses: The role of teaching presence. Journal of Asynchronous Learning Networks, 9(4), 59-82.
Shepherd, C., Alpert, M., & Koeller, M. (2007). Increasing the efficacy of educators teaching online. International Journal of Social Sciences, (2)3, 173-179.
Swan, K., Richardson, J. C., Ice, P., Garrison, D. R., Cleveland-Innes, M., & Arbaugh, J. B. (2008). Validating a measurement tool of presence in online communities of inquiry. eMentor, 24(2). Retrieved August 8, 2008, from http://www.e-mentor.edu.pl/artykul_v2.php?numer=24&id=543
Swan, K., & Shih, L. F. (2005). On the nature and development of social presence in online course discussions. Journal of Asynchronous Learning Networks 9(3), 115-136.
Swan-Dagen, A. & Ice, P. (2008). A voyage across the three C’s: Content literacy, computers and community of learners. Journal of Reading Education. 33 (3), 2- 8.
Wolf, P. (2004). Best practices in the training of faculty to teach online. Unpublished doctoral dissertation, University of Maryland University College, Adelphi, MD.
Zotti, R. (2005, November). Strategies for successful growth of online learning programs. Presentation at the 11th Sloan-C International Conference on Asynchronous Learning Networks, Orlando, FL.
Community of Inquiry Survey Instrument
Developed by Ben Arbaugh, Marti Cleveland-Innes, Sebastian Diaz, Randy Garrison, Phil Ice, Jennifer Richardson, Peter Shea & Karen Swan
Design & Organization
1. The instructor clearly communicated important course topics.
2. The instructor clearly communicated important course goals.
3. The instructor provided clear instructions on how to participate in course learning activities.
4. The instructor clearly communicated important due dates/time frames for learning activities.
Facilitation
5. The instructor was helpful in identifying areas of agreement and disagreement on course topics that helped me to learn.
6. The instructor was helpful in guiding the class towards understanding course topics in a way that helped me clarify my thinking.
7. The instructor helped to keep course participants engaged and participating in productive dialogue.
8. The instructor helped keep the course participants on task in a way that helped me to learn.
9. The instructor encouraged course participants to explore new concepts in this course.
10. Instructor actions reinforced the development of a sense of community among course participants.
Direct Instruction
11. The instructor helped to focus discussion on relevant issues in a way that helped me to learn.
12. The instructor provided feedback that helped me understand my strengths and weaknesses.
13. The instructor provided feedback in a timely fashion.
Affective Expression
14. Getting to know other course participants gave me a sense of belonging in the course.
15. I was able to form distinct impressions of some course participants.
16. Online or web-based communication is an excellent medium for social interaction.
Open Communication
17. I felt comfortable conversing through the online medium.
18. I felt comfortable participating in the course discussions.
19. I felt comfortable interacting with other course participants.
Group Cohesion
20. I felt comfortable disagreeing with other course participants while still maintaining a sense of trust.
21. I felt that my point of view was acknowledged by other course participants.
22. Online discussions help me to develop a sense of collaboration.
Triggering Event
23. Problems posed increased my interest in course issues.
24. Course activities piqued my curiosity.
25. I felt motivated to explore content related questions.
Exploration
26. I utilized a variety of information sources to explore problems posed in this course.
27. Brainstorming and finding relevant information helped me resolve content related questions.
28. Discussing course content with my classmates was valuable in helping me appreciate different perspectives.
Integration
29. Combining new information helped me answer questions raised in course activities.
30. Learning activities helped me construct explanations/solutions.
31. Reflection on course content and discussions helped me understand fundamental concepts in this class.
Resolution
32. I can describe ways to test and apply the knowledge created in this course.
33. I have developed solutions to course problems that can be applied in practice.
34. I can apply the knowledge created in this course to my work or other non-class related activities.
All items are rated on a 5-point Likert-type scale:
1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree
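Responses to the instrument above can be aggregated into presence subscale scores. A minimal sketch follows; the item-to-subscale groupings (teaching presence = items 1-13, social presence = items 14-22, cognitive presence = items 23-34) are assumptions based on the published CoI instrument, and the function name is hypothetical.

```python
# Hypothetical scoring sketch for the 34-item CoI survey.
# Assumed groupings: teaching presence = items 1-13,
# social presence = items 14-22, cognitive presence = items 23-34.

def score_coi(responses):
    """Average the 1-5 Likert ratings within each presence subscale.

    responses: dict mapping item number (1-34) to a rating of 1-5.
    """
    groups = {
        "teaching_presence": range(1, 14),
        "social_presence": range(14, 23),
        "cognitive_presence": range(23, 35),
    }
    return {
        name: sum(responses[i] for i in items) / len(items)
        for name, items in groups.items()
    }

# A respondent who answers "agree" (4) to every item scores 4.0 on each subscale.
sample = {i: 4 for i in range(1, 35)}
print(score_coi(sample))
```

Subscale means computed this way are the kind of CoI indicators that the hierarchical analyses proposed in the recommendations section would take as dependent variables.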