A Confirmatory Factor Analysis of a Teaching Presence Instrument in an Online Computer Applications Course


Laura McNeill
The University of Alabama
lmcneal@ccga.edu

Margaret Rice
The University of Alabama
mrice@ua.edu

Vivian Wright
The University of Alabama
vwright@ua.edu

Abstract

Academic research has consistently shown effective teacher presence to be a significant factor in student satisfaction, engagement, perceived learning, and sense of community. The need for effective teaching presence remains of significant importance, particularly with the vast growth of online courses and online degree programs. It is, therefore, also necessary to evaluate the instruments used to measure effective teaching presence. The purpose of this study was an examination of the construct validity of a survey instrument developed to assess effective online teaching presence. Data included teaching presence survey results from undergraduate students enrolled at a large research university in the United States. No demographic data were collected other than gender, which did not prove significant in the study. This study has contributed to the literature of the field by concluding that the 13-item Arbaugh Teaching Presence instrument did not measure the teaching presence construct as intended. Examination of the academic literature suggests that modifications made to the teaching presence instrument since 2003 may have compromised the structure and validity of the original teaching presence survey. Development and testing of a new teaching presence instrument, grounded in empirical research and learning outcomes, is recommended.


Introduction

Faced with limited resources, a demand for high quality courses, and a growing challenge to retain students (Meyer, 2014), educators must set aside traditional face-to-face instructional strategies (Stern, 2004) and create new methods to connect effectively with college students, particularly those in online environments (Serdyukov, 2017). Sorensen and Donovan (2017) identified lack of support and absence of instructor interaction as two common factors that contribute to poor course completion rates in higher education. The lack of teacher presence can add to feelings of student isolation and disengagement in online courses and may have a negative influence on learner performance and/or motivation (Bawa, 2016; Sorensen & Donovan, 2017).

The perception of teacher presence in online courses has been researched largely from the instructor’s perspective (Richardson, Besser, Koehler, Lim, & Strait, 2016), and to a smaller degree, from the students’ perspective (Martin, Wang, & Sadaf, 2018). While instructional best practices are often examined (Richardson et al., 2016), rarely is student engagement behavior studied in relation to effective teaching presence. For example, Martin and Bollinger (2018) asked learners to score valuable student-instructor engagement strategies but did not explore the effect of those approaches on student behavior. The lack of empirical evidence underscores the need for the examination of instruments assessing online teaching presence best practices (Kennan, Bigatel, Stockdale, & Hoewe, 2018).

Literature Review

Across the nation, colleges and universities are under intense scrutiny and pressure to recruit more students every year (Meyer, 2014), increase graduation rates (Meyer, 2014), and increase their online presence (Redmond, Abawi, Brown, & Henderson, 2018), all while managing funding cuts (Finkelstein, Conley, & Schuster, 2016), an aging faculty (Kaskie, 2017), and fewer resources (Bowen & Tobin, 2015; Boyer, 2016). While higher education institutions have had some success attracting college students to enroll in online courses with scholarships (Korn, 2018) and grants (MarksJarvis, 2016), after 6 years, fewer than half of those students complete a degree or certificate at the same institution (National Student Clearinghouse, 2017). Twelve percent of students are still enrolled in college after those same 6 years without completing a degree, and the National Student Clearinghouse (2017) reports that one in three students leaves higher education permanently.

According to the National Student Clearinghouse (2017), 4-year public colleges graduate 65% of students, with private not-for-profit institutions topping that margin by 11% and more than three-fourths of all enrollees securing a credential. Students at community colleges fare far worse, with a 38% graduation rate; the rate is identical for those students who transfer from community college to a 4-year institution (National Student Clearinghouse, 2017). The burden to increase student populations has many administrators reaching beyond typical demographic boundaries to recruit students (Jaschik, 2018) into online programs that promise flexibility (Watson, Watson, Richardson, & Loizzo, 2016), technological sophistication (Bonk & Zhang, 2006; Serdyukov, 2017), and access to courses 24/7 (Stoltz-Loike, 2017), as long as the students have an Internet connection and a reliable laptop or tablet (Foley-McCabe & Gonzalez-Flores, 2017).

With the rapid growth of online learning programs comes concern over lagging student retention. Many students simply are not prepared for the rigors of learning without an instructor's presence in the classroom, as illustrated by low achievement (Fredricks & McColskey, 2012) and lack of connection to a group of learners (Dixson, 2015). It has been concluded that instructors must implement different teaching practices to ensure those students' satisfaction in the virtual learning environment (Cole, Shelley, & Swartz, 2014). Those instructors, committed to learning excellence, are crucial to a university's success (Leroy, Palanski, & Simons, 2012; Serdyukov, 2017). Researchers have stressed the significance of deliberate techniques to boost student engagement (Meyer, 2014), which has been shown to produce a higher quality of graduate (Teacher Education Ministerial Advisory Group, 2014). As Meyer (2014) noted, engagement is connected to encouraging student outcomes, with some research suggesting a connection between student engagement and student completion. Meyer (2014) also explained that failing to note the importance of positive student engagement in online learning environments endangers the retention of students who are academically qualified but feel severed from the learning environment.

The Concept of Teaching Presence

The idea of teaching presence was first developed as part of the Community of Inquiry (CoI) framework (Garrison, Anderson, & Archer, 2001). CoI, which combines social, cognitive, and teaching presence, grew out of the work of John Dewey and was established to guide online learning research and practice (Garrison & Arbaugh, 2007). The framework contends that higher-order learning is best experienced within a "community of inquiry" composed of teachers and learners (Arbaugh et al., 2008). While both social and cognitive learner interactions are necessary to create an effective online learning community (Garrison, Anderson, & Archer, 2000), teaching presence is deemed the binding element in the CoI, as communications and exchanges, on their own, are not enough to ensure that online learning occurs (Garrison et al., 2000). Garrison and Arbaugh (2007) further depicted teaching presence as the "design, facilitation, and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes" (p. 163).

According to CoI researchers, teaching presence greatly influences online learning, as it connects students and instructors separated by distances and time zones (Garrison et al., 2001) and affects perceived learning and student satisfaction (Garrison & Arbaugh, 2007). The Community of Inquiry framework, according to Zhang, Lin, Zhan, and Ren (2016), classified teachers as facilitators, with their responsibilities including online course design, schedule, and structure, as well as conducting educational instruction and lectures. Garrison, Cleveland-Innes, and Fung (2010) explained that under CoI, teachers need to facilitate the following relationships: student-instructor, student-peer, and student-material. To enhance the student-instructor connection, it is necessary for teachers to provide timely feedback to students, observe online social activities, and deliver instructions (Zhang et al., 2016).

The Three Components of Teaching Presence

Arbaugh and Hwang (2006) contended that meaningful learning experiences require learners to demonstrate critical thinking and participate in social aspects of an educational environment to be successful; however, these interactions also require clearly defined parameters and a focused direction, and consequently, the need for teaching presence. As identified in Anderson, Rourke, Garrison, and Archer (2001), teaching presence contains three components: instructional design and organization, facilitating discourse, and direct instruction.

Instructional Design and Organization

Under the first component of teaching presence, Garrison et al. (2000) emphasized structure, including timeline, protocol, and format (Anderson et al., 2001). It is described as the clear planning and consistent design of the online course organization (Anderson et al., 2001), most of which occurs before the course begins (Arbaugh et al., 2008). According to Arbaugh et al. (2008), of teaching presence's three components, this one has the greatest likelihood of being accomplished entirely by the instructor.

Facilitating Discourse

The second teaching presence component identified by Anderson et al. (2001) can be defined as the instructor and learners engaging with each other and actually working in concert (Arbaugh et al., 2008) to provide a vibrant and viable online learning environment. Arbaugh and Hwang (2006) explained facilitating discourse as how students interact with the information provided in the course, as well as with the instructors and learners, to reach consensus and understanding.

Direct Instruction

The third teaching presence component requires the instructor to be a content expert. In this component, the instructor provides academic leadership through the sharing of subject matter expertise (Arbaugh et al., 2008). Arbaugh et al. (2008) emphasized that a subject matter expert, and not a facilitator, must serve in this role because of the need to diagnose comments for accurate understanding, inject sources of information, direct discussions in useful directions, and scaffold learner knowledge to raise it to a new level.

Previous Research on the Teaching Presence Instrument

While the three Community of Inquiry model components of social, cognitive, and teaching presence were initially only studied using content analysis methodology (Anderson et al., 2001; Garrison et al., 2000; Jefferies, Grodzinsky, & Griffin, 2003), an instrument was later piloted, modified, and utilized in subsequent studies shown in Table 1.

Table 1

Studies Involving the Teaching Presence Component of the Community of Inquiry Framework

Shea, Li, and Pickett (2006) later explored the factor structure of the teaching presence instrument as part of a 42-item Community of Inquiry (CoI) survey of online undergraduate learning communities, analyzing both Rovai's Classroom Community Index and the teaching presence survey. Seventeen of 20 items loaded successfully into only two of the three teaching presence components, direct instruction and instructional design and organization. Arbaugh and Hwang (2006) later examined teaching presence in online MBA courses. Their study revealed that 16 of the 20 teaching presence items loaded successfully into the three teaching presence components: direct instruction, instructional design and organization, and facilitating discourse. Arbaugh et al. (2008) simultaneously examined all components of a 34-item Community of Inquiry (CoI) framework, testing the construct validity of the social, cognitive, and teaching presence sections with online graduate students; the teaching presence section contained 13 items. Zhang et al. (2016) utilized the 13-item teaching presence instrument (see Appendix A) identified by Arbaugh et al. (2008) to measure teaching presence and student engagement behaviors as identified by Chi and Wylie's (2014) Interactive-Constructive-Active-Passive (ICAP) framework. The Zhang et al. (2016) study surveyed teaching professionals in China.

Setting of the Study


The current study was conducted at a 4-year public research university in the United States. Students surveyed in the study were enrolled in Computer Technology Applications (CTA), offered online through the university's College of Education. Five online sections of CTA were offered during the Fall 2018 semester. All CTA course material was offered through the Blackboard Learning Management System (LMS). The course introduces students to computer applications relating to problem solving, critical thinking, instruction, data management, and web page development. Students completed 25 assignments. The number of modules, assignments, papers, and projects was identical in each section of the course.

Participants

The participants in the study were 122 undergraduate students classified as freshmen, sophomores, or juniors enrolled in a Computer Technology Applications course at a 4-year research university located in the United States. Though the course was offered in an asynchronous, online format with no in-person meetings between the instructor and students, the students were classified as on campus for tuition purposes. While the course was not restricted to education majors, the majority of the students who took the course were education majors.

Survey Instrument

For the current study, students completed the 13-item Arbaugh Teaching Presence (ATP) instrument, which reflects the three components of teaching presence proposed by Anderson et al. (2001): instructional design and organization, facilitating discourse, and direct instruction. The 13-item ATP instrument uses a 5-point Likert-type scale ranging from 1 (Strongly Disagree) to 5 (Strongly Agree) to record student responses.

The 13-item ATP instrument was derived from a 28-item teaching presence instrument created by Anderson et al. (2001) and Shea, Fredericksen, Pickett, and Pelz (2003). Shea, Pickett, and Pelz (2003) used the 28-item instrument in two undergraduate student studies; however, the researchers did not utilize quantitative validation to examine the results (Miller, Hahs-Vaughn, & Zygouris-Coe, 2014). Shea et al. (2006) later explored a modified, 20-item version of the original 28-item teaching presence instrument using principal component analysis (PCA). PCA revealed a two-factor model (instructional design and organization and directed facilitation, with Cronbach's alpha values of .97 and .93, respectively), instead of the original three-factor model proposed by Anderson et al. (2001). Arbaugh and Hwang (2006) also examined the structure of the 20-item version of the instrument using confirmatory factor analysis, reporting Cronbach's alpha values of .90, .94, and .89 for instructional design and organization, facilitating discourse, and direct instruction, respectively.

Principal component analysis was later conducted by Arbaugh et al. (2008) on the three elements of the Community of Inquiry (CoI) framework: social presence, cognitive presence, and teaching presence. The data from the Arbaugh et al. (2008) study supported the construct validity of the three CoI elements. The study produced a Cronbach's alpha of .94 for the teaching presence element in the CoI framework.

Zhang et al. (2016) utilized the 13-item teaching presence instrument from the Arbaugh et al. (2008) study to examine the impact of teaching presence on online engagement behaviors with middle-school teachers in China. Factor analysis in the Zhang et al. (2016) study produced a Cronbach's alpha of .98. The eigenvalue for the one factor extracted was 10.42 (Zhang et al., 2016). Using the ICAP (interactive > constructive > active > passive) framework, Zhang et al. (2016) examined the impact of teaching presence on students' overt, observable engagement behaviors. Teaching presence had a significant impact on the interactive and constructive modes of the ICAP framework but did not influence the active and passive modes (Zhang et al., 2016).

Research Question

The following research question guided this study: Does the Arbaugh Teaching Presence instrument measure the teaching presence construct as intended?

Data Collection

The 13-item ATP instrument was administered in the final two weeks of the Fall 2018 semester to students enrolled in five online sections of CTA (n=122). This study used an anonymous online survey for data collection. The intent was to gather post-course data to determine students’ perceptions of teaching presence in the online course. Qualtrics survey software was used for data collection.

Data Analysis

Of the enrolled population (n=122), 89 CTA students responded, yielding a 73% response rate. For the purposes of this research, the Likert-type scale responses were treated as interval ratio variables (Cooper & Schindler, 2014). Utilizing SPSS version 25 for statistical calculation, exploratory factor analysis was conducted on untransformed variables. Variables were kept untransformed because there was no need to change their distribution (an assumption of normality was not relied upon for this analysis). Principal component analysis (PCA) was the chosen method of factor extraction. Orthogonal rotation was selected using Varimax with Kaiser normalization. Rotation converged in 10 iterations.

Findings

As the goal of the research was to assess the construct validity of the ATP instrument, a cross-sectional quantitative study was designed. Further, a non-experimental design was chosen, as the research goals did not require any manipulation of participant behavior. The 13-item ATP instrument was administered during Fall 2018 to students at a research university in the United States. Of the student population (n=122), 91 students completed the ATP instrument, yielding a response rate of 75%. Two of the students' scores (1.5%) were rejected because the student consent form was not completed, which resulted in 89 participants for a 73% response rate. Responses were scored using a Likert-type scale ranging from 1 (Strongly Disagree) to 5 (Strongly Agree).

Mean responses for the 13 items ranged from 4.42 (Instructor actions reinforced the development of a sense of community among course participants) to 4.69 (The instructor clearly communicated important due dates/time frames for learning activities). Standard deviations were highest for Item 5 (s = 1.00) (The instructor was helpful in identifying areas of agreement and disagreement on course topics that helped me to learn), and lowest for Item 1 (s = 0.80) (The instructor clearly communicated important course topics). When considering all respondents' ratings, the 13 items collectively yielded a mean score of 4.56 (s = 0.94).

Research Question Findings

To answer the research question, which asked whether the ATP instrument measured the teaching presence construct as intended, statistical analyses were conducted in SPSS version 25. Exploratory factor analysis (EFA) was utilized to examine the construct validity of the ATP instrument. For the purposes of this research, all Likert scale responses were treated as interval ratio variables (Cooper & Schindler, 2014).

First, it was determined that the dataset was suitable for exploratory factor analysis (EFA): all 13 items correlated at least .3 with at least one other item, suggesting reasonable factorability. Principal component analysis (PCA) was performed to extract variables from the correlation matrix. PCA allowed for a more comprehensive analysis of variance, revealing significant detail related to the nature of the factors. Since the ATP instrument, as a sub-component of the Community of Inquiry (CoI) survey, is well-researched and utilized in academic research (Arbaugh et al., 2008; Garrison et al., 2001, 2010; Shea & Bidjerano, 2010; Zhang et al., 2016), no latent variables were sought in the analysis. Varimax rotation with Kaiser normalization was selected as an aid in data reduction (Field, 2014). Rotation converged in 10 iterations.

The Kaiser-Meyer-Olkin measure of sampling adequacy was .95, above the recommended value of .6. The scree plot (Appendix C) and the rotated component matrix (Table 2) indicated that three meaningful factors exist. The component loadings were acceptable, with at least three variables per factor loading above .540. The component plot in rotated space (Figure 1) shows the component loadings in three-dimensional space.

It was expected, based on previous teaching presence research (Shea, Fredericksen et al., 2003; Shea, Pickett et al., 2003), that the 13 items in the ATP instrument would load as follows. Four items were expected to load into Instructional Design and Organization (Communicated Course Topics, Clarified Course Goals, Clarified Learning Activities, Communicated Due Dates), which Anderson et al. (2001) described as the structure of the course, largely organized before the course begins. Six items were expected to load into Facilitated Discussion (Interpreted Course Topics, Clarified Course Topics, Directed Engagement, Kept Students on Task, Encouraged Exploration of Concepts, Reinforced Community). The descriptors for this element included the participants in the course engaging with each other equally, seeking to reach consensus and understanding, setting the climate for learning, reinforcing student contributions, and identifying areas of agreement and disagreement (Shea, Fredericksen et al., 2003). Three items were expected to load into Directed Instruction (Focused Discussion, Clarified Areas to Improve, Provided Timely Feedback). Teaching presence's directed instruction descriptors included the instructor leading the class as a subject matter expert, presenting content and questions, focusing and summarizing the discussion, and confirming understanding (Shea, Fredericksen et al., 2003).

Table 2
Rotated Component Matrix Resulting From Factor Analysis

Figure 1. Component plot in rotated space. This figure represents a graphical view of the factor loadings associated with each item.

The exploratory factor analysis revealed actual factor loadings different from the expected factor loadings (Table 3). In the actual factor loading, 5 of the 13 items (38%) matched the expected factor loading. The Instructional Design and Organization category matched on three of four items (75%), the Facilitated Discussion category matched on one of six items (17%), and the Directed Instruction category matched on one of three items (33%). It was therefore determined that the ATP instrument poorly measured the teaching presence construct as intended.

Table 3

Expected Versus Actual Factor Loadings


Evaluation of Findings

Before performing exploratory factor analysis to analyze the construct validity of the ATP instrument, the researchers first reviewed the ATP instrument results from the CTA students (n=89). When considering all respondents' ratings, the 13 items collectively yielded a mean score of 4.56 (s = 0.94). These results indicated the CTA students perceived a high degree of teaching presence for the Fall 2018 semester, suggesting that the CTA teachers were viewed as communicating clearly with students in the course, guiding understanding helpfully, and reinforcing a sense of community among participants.

These results can be interpreted to support previous academic research stating that teaching presence greatly influences online learning, as it connects students and instructors separated by distances and time zones (Garrison et al., 2001) and affects perceived learning and student satisfaction (Garrison & Arbaugh, 2007; Lowenthal, 2009). The Qualtrics results related to communication of important topics, goals, and learning activities also support the Shea and Bidjerano (2010) conclusion that medium to high teaching contact and engagement with students resulted in improved learning outcomes for online learners versus face-to-face students. As demonstrated by Miller et al. (2014), as students' perception of teaching presence increases, so do students' opinions of course and instructor satisfaction.

Conclusions

It is the opinion of the researchers that development and testing of a new teaching presence instrument is needed for use in online learning environments. The results of the current study indicate that significant changes made to the wording and structure of the original 28-item teaching presence instrument (Shea, Fredericksen et al., 2003) may have compromised the tool's integrity and structure. The original instrument was modified by Shea, Swan, Li, and Pickett (2005), Shea et al. (2006), Arbaugh and Hwang (2006), and Arbaugh et al. (2008). The original instrument is based on the teaching presence section of the Community of Inquiry (CoI) framework (Garrison, Anderson, & Archer, 2000). In the CoI, teaching presence contains three components: instructional design and organization, facilitating discourse, and direct instruction (Anderson et al., 2001).

The current study explored several challenges and issues identified in the academic literature. These issues include a lack of agreement on the factors and dynamics that demonstrate effective teaching presence (Kennan et al., 2018; Kennette & Redd, 2015). Despite the growth of Quality Matters (2018) and tools like the National Survey of Student Engagement (2019), which underscore the importance of teaching presence, many instructors, facilitators, and administrators are not clear on its definition and components. Many community college and undergraduate students are also unable to distinguish among the components of teaching presence in surveys (Arbaugh, 2007; Garrison et al., 2010). This brings into question the efficacy of student-reported teaching presence instruments, which has implications for student engagement, support, and retention. Without a shared understanding of the significance, behaviors, and best practices of teaching presence, it is a challenge for educators to create an impactful, effective, and successful online experience for students.

Implications

The significance of teacher presence in online learning has been reliably demonstrated in the academic literature (Garrison et al., 2010; Hung & Chou, 2015; Wicks, Craft, Mason, Gritter, & Bolding, 2015); however, further research and additional examination of current instruments assessing effective online teaching presence is necessary. In the literature, researchers examining the teaching presence instrument have pushed for a focus on outcomes versus processes and on empirical research versus research based on student perceptions (Miller et al., 2014; Rourke & Kanuka, 2009). In addressing empirical research versus student perceptions, Rourke and Kanuka (2009) criticized the prevalence and use of student self-reported surveys versus descriptive statistics. The creation of a more objective teaching presence instrument based on pragmatic and practical measures of teaching presence, beyond the traditional student-reported surveys or the examination of discussion forums, would be useful for online instructors, academic researchers, and program administrators.

While the growth of Quality Matters (2018) and the existence of tools like the National Survey of Student Engagement (2019) and various national assessments underscore the importance of teaching presence, the results of this study indicate an existing gap among educators and facilitators on the definition and components that make up the concept of teaching presence. It would be worthwhile for educators to share a common understanding of the three components of teaching presence (Instructional Design and Organization, Facilitated Discussion, and Directed Instruction), so teachers understand principles that combine theory, academic research, and the skills required for successful online teaching. To provide students a strong sense of teaching presence and a supportive online community, it is necessary for teachers to undergo training on the significance of and best practices for teaching presence in order to create a successful online experience for students. Simply training teachers in online learning, LMS management, and technical skills is not enough.

Another implication involves past discrepancies over the strength of the three components of teaching presence (Instructional Design and Organization, Facilitated Discussion, and Directed Instruction). Further study should be undertaken to determine the importance, impact, and connection between the three areas that comprise the teaching presence construct. If the three components are studied and it is determined that one or more greatly influence student engagement, satisfaction, and learning outcomes, it would further define areas of needed focus and skill development for teachers who teach in online learning environments.

Recommendations for Future Research

There are a number of recommendations for future research based upon the review of the related literature, results, conclusions, and limitations of this study. Future research specific to the ATP instrument could include replication of the exploratory factor analysis, as well as confirmatory factor analysis and structural equation modeling; these methods could also be applied if researchers determined that modifying the 13-item instrument's wording might reveal additional insights regarding teaching presence. Utilizing a larger and wider sample for the ATP instrument, as well as incorporating demographic information, would possibly produce meaningful academic results. In addition, examining the ATP instrument across different courses and majors may yield valuable insights, as would a multi-institutional study. Employing mixed methods may also strengthen the existing research, as would a study employing only qualitative methods, allowing for deeper research into the specific teaching presence components of Instructional Design and Organization, Facilitated Discussion, and Directed Instruction. As such, each of the three components of teaching presence could be studied individually with undergraduate and/or graduate students.

Future qualitative research could focus solely on learners, teachers, or both sets of participants. In terms of a focus on individual teachers, quantitative and qualitative information specific to teaching practices and communication style and the influence of both on teaching presence could provide significant insights for future educators. The use of multiple disciplines and institutions, as well as quantitative and qualitative methods to extract results, could prove useful in the academic and practical considerations of teaching presence and its influence on the success, satisfaction, retention, and engagement of students.

This study has contributed to the academic literature by concluding that the 13-item ATP instrument did not measure the teaching presence construct as intended. This research contributes to the practice of the field by recommending the development and testing of a new teaching presence instrument. Even though the study outcome was not as expected, the research results serve as the groundwork for future studies focusing on or related to teaching presence.

References

Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5(2), 1-17.

Arbaugh, J. B. (2007). An empirical verification of the community of inquiry framework. Journal of Asynchronous Learning Networks, 11. doi: 10.24059/olj.v11i1.1738.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the community of inquiry framework using a multi-institutional sample. Internet and Higher Education, 11, 133-136.

Arbaugh, J. B., & Hwang, A. (2006). Does "teaching presence" exist in online MBA courses? Internet and Higher Education, 9, 9-21.

Bawa, P. (2016). Retention in online courses: Exploring issues and solutions—A literature review. SAGE Open, 6(1), 1-11.

Bonk, C. J., & Zhang, K. (2006). Introducing the R2D2 model: Online learning for the diverse learners of this world. Distance Education, 27(2), 249-264.

Bowen, W. G., & Tobin, E. M. (2015). Locus of authority: The evolution of faculty roles in the governance of higher education. Princeton, NJ: Princeton University Press.

Boyer, E. (2016). Scholarship reconsidered: Priorities of the professoriate. San Francisco: Jossey-Bass.

Chi, M. T. H., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist, 49(4), 219-243.

Cole, M. T., Shelley, D. J., & Swartz, L. B. (2014). Online instruction, e-learning, and student satisfaction: A three-year study. International Review of Research in Open & Distance Learning, 15(6), 111-131.

Cooper, D. R., & Schindler, P. S. (2014). Business research methods (12th ed.). Boston, MA: McGraw-Hill.

Dixson, M. D. (2015). Measuring student engagement in the online course: The online student engagement scale (OSE). Online Learning, 19(4).

Finkelstein, M. J., Conley, V. M., & Schuster, J. H. (2016). The faculty factor: Reassessing the American academy in a turbulent time. Baltimore, MD: Johns Hopkins University Press.

Foley-McCabe, M., & Gonzalez-Flores, P. (2017). Series: Essentials of online learning: A standards-based guide. New York: Routledge.

Fredricks, J. A., & McColskey, W. (2012). The measurement of student engagement: A comparative analysis of various methods and student self-report instruments. In S. L. Christensen, A. L. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 763-782). New York: Springer.

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105.

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking and computer conferencing: A model and tool to assess cognitive presence. American Journal of Distance Education, 15(1), 7-23.

Garrison, R., & Arbaugh, J. B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. Internet and Higher Education, 10, 157-172.

Garrison, R., Cleveland-Innes, M., & Fung, T. S. (2010). Exploring causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework. Internet and Higher Education, 13(1-2), 31-36.

Hung, M., & Chou, C. (2015). Students' perceptions of instructors' roles in blended and online learning environments: A comparative study. Computers & Education, 81, 315-325.

Jaschik, S. (2018). Where colleges recruit . . . and where they don’t. Inside Higher Ed. Retrieved from https://www.insidehighered.com/admissions/article/2018/04/16/study-analyzes-where-colleges-recruit-and-where-they-dont

Kaskie, B. (2017). The academy is gaining in place: Assessing alternatives for modifying higher education. The Gerontologist, 57(5), 816-823.

Kennan, S., Bigatel, P., Stockdale, S., & Hoewe, J. (2018). The (lack of) influence of age and class standing on preferred teaching behaviors for online students. Online Learning Journal, 22(1), 163-181.

Kennette, L. N., & Redd, B. R. (2015). Instructor presence helps bridge the gap between online and on-campus learning. College Quarterly, 18(4).

Leroy, H., Palanski, M. E., & Simons, T. (2012). Authentic leadership and behavioral integrity as drivers of follower commitment and performance. Journal of Business Ethics, 107, 255-264. 

MarksJarvis, G. (2016). College financial aid shifting to the affluent, study says. Chicago Tribune. Retrieved from http://www.chicagotribune.com/business/ct-college-gouge-poor-marksjarvis-0320-biz-20160318-column.html

Martin, F., & Bollinger, D. U. (2018). Engagement matters: Student perceptions on the importance of engagement strategies in the online learning environment. Online Learning Journal, 22(1), 205-222.

Martin, F., Wang, C., & Sadaf, A. (2018). Student perception of helpfulness of facilitation strategies that enhance instructor presence, connectedness, engagement, and learning in online courses. The Internet and Higher Education, 37, 52-65.

Meyer, K. (2014). Student engagement online: What works and why. ASHE Higher Education Report. Jossey-Bass.

Miller, M. G., Hahs-Vaughn, D. L., & Zygouris-Coe, V. (2014). A confirmatory factor analysis of teaching presence within online professional development. Journal of Asynchronous Learning Networks, 18(1).

National Student Clearinghouse. (2017). Persistence & retention. Retrieved from https://nscresearchcenter.org/snapshotreport28-first-year-persistence-and-retention/

National Survey of Student Engagement. (2019). Retrieved from http://nsse.indiana.edu/

Quality Matters. (2018). Retrieved from https://www.qualitymatters.org/

Redmond, P., Abawi, L., Brown, A., & Henderson, R. (2018). An online engagement framework for higher education. Online Learning Journal, 22(1), 183-203.

Richardson, J. C., Besser, E., Koehler, A., Lim, J., & Strait, M. (2016). Instructors’ perceptions of instructor presence in online learning environments. International Review of Research in Open and Distributed Learning, 17(4), 82-97.

Rourke, L., & Kanuka, H. (2009). Learning in communities of inquiry: A review of the literature. Journal of Distance Education, 23(1), 19-48. Retrieved from https://eric.ed.gov/?id=EJ836030

Serdyukov, P. (2017) Innovation in education: What works, what doesn’t, and what to do about it? Journal of Research in Innovative Teaching & Learning, 10(1), 4-33.

Shea, P., & Bidjerano, T. (2010). Learning presence: Towards a theory of self-efficacy, self-regulation, and the development of a communities of inquiry in online and blended learning environments. Computers & Education, 55, 1721-1731.

Shea, P., Fredericksen, E., Pickett, A., & Pelz, W. (2003). A preliminary investigation of teaching presence in the SUNY Learning Network. Elements of Quality Online Education: Practice and Direction (Vol. 4). Needham, MA: Sloan-C.

Shea, P., Pickett, A., & Pelz, W. (2003). A follow-up investigation of “teaching presence” in the SUNY Learning Network. Journal of Asynchronous Learning Networks, 7, 61-80.

Shea, P., Li, C. S., & Pickett, A. (2006). A study of teaching presence and student sense of learning community in fully online and web-enhanced college courses. Internet and Higher Education, 9, 175-190.

Shea, P., Swan, K., Li, C. S., & Pickett, A. (2005). Developing learning community in online asynchronous college courses: The role of teaching presence. Journal of Asynchronous Learning Networks, 9, 59-82.

Sorensen, C., & Donovan, J. (2017). An examination of factors that impact the retention of online students at a for-profit university. Online Learning, 21(3), 206-221. doi: 10.24059/olj.v21i3.935

Stoltz-Loike, M. (2017). 4 reasons online learning works well for working adults. USNews.com. Retrieved April 22, 2018, from https://www.usnews.com/education/online-learning-lessons/articles/2017-02-10/4-reasons-online-learning-works-well-for-working-adults

Teacher Education Ministerial Advisory Group Issues Paper. (2014). Teacher education ministerial advisory group. Canberra, Australia: Department of Education and Training.

Watson, S. L., Watson, W. R., Richardson, J., & Loizzo, J. (2016). Instructor’s use of social presence, teaching presence, and attitudinal dissonance: A case study of an attitudinal change MOOC. The International Review of Research in Open and Distributed Learning, 17(3).

Wicks, D., Craft, B., Mason, G., Gritter, K., & Bolding, K. (2015). An investigation into the community of inquiry of blended classrooms by a faculty learning community. The Internet and Higher Education, 25, 53-62.

Zhang, H., Lin, L., Zhan, Y., & Ren, Y. (2016). The impact of teaching presence on online engagement behaviors. Journal of Educational Computing Research, 54(7), 887-900.

APPENDIX A

ARBAUGH TEACHING PRESENCE MEASURES INSTRUMENT

5-point Likert-type scale

1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree

  1. The instructor clearly communicated important course topics.
  2. The instructor clearly communicated important course goals.
  3. The instructor provided clear instructions on how to participate in course learning activities.
  4. The instructor clearly communicated important due dates/time frames for learning activities.
  5. The instructor was helpful in identifying areas of agreement and disagreement on course topics that helped me to learn.
  6. The instructor was helpful in guiding the class towards understanding course topics in a way that helped me clarify my thinking.
  7. The instructor helped keep course participants engaged and participating in productive dialogues.
  8. The instructor helped keep the course participants on task in a way that helped me learn.
  9. The instructor encouraged course participants to explore new concepts in this course.
  10. Instructor actions reinforced the development of a sense of community among course participants.
  11. The instructor helped focus discussion on relevant issues in a way that helped me learn.
  12. The instructor provided feedback that helped me understand my strengths and weaknesses relative to the course’s goals and objectives.
  13. The instructor provided feedback in a timely fashion.

APPENDIX B

REPRODUCED CORRELATIONS



APPENDIX C

SCREE PLOT


Online Journal of Distance Learning Administration, Volume XXII, Number 4, Winter 2019
University of West Georgia, Distance Education Center