Evaluation of Instructional Design Capabilities of Asynchronous and Synchronous Instruction


Kristi N. Garrett
Atlanta Technical College
kgarrett@atlantatech.edu

Angela D. Benson
The University of Alabama
abenson@ua.edu

Abstract

From a quantitative perspective, this study examined the instructional design knowledge of higher education instructors and others in the instructional design/technology arena who are members of a global education-based Internet forum. Results showed a significant difference in opinions between genders, with males more inclined to incorporate instructional technology into their asynchronous and synchronous teaching environments. Based on the results, providing training for gender-specific groups could foster a more collaborative learning environment. Whether male or female, designing and developing quality instruction for use in online and face-to-face environments is paramount to giving students an engaging learning experience.

Introduction

Instructional design dates back to World War II, when training was developed for the military services (Reiser, 2001b).  In the education arena of the early 1970s, instructional design evolved as a heuristic instructional strategy that encouraged the learner to independently explore and discover solutions (Hirumi, 2002).  Today, the face of education is steadily evolving to accommodate the busy schedules of current and potential learners.  This greatly increases the need for qualified instructors who can not only facilitate, but also design and develop curriculum with instructional technology tools for use in traditional and distance education courses.

Instructional design models have become more systematic over the years by including specific phases, such as analysis, design, and evaluation, used to develop curriculum.  In addition to the educational arena's focus on instructional design, the military developed the Interservice Procedures for Instructional Systems Development (IPISD) for developing and delivering training, a model that continues to influence business processes and other training (Hirumi, 2002).  This paper evaluates the ability of higher education staff and instructors to effectively design courses for use in asynchronous and synchronous environments.

A properly designed learning environment will be "learner-centered, knowledge-centered, and community-centered" (Snyder, 2009, p. 49).   For most adults, the learning process interconnects the "mind, body, spirit, emotions, and society" (Merriam, 2008, p. 97), in which experiences are stored and available for reflection during the learning process.   It is through this interconnecting process that students learn better when they can associate experiences or real-world scenarios with new learning.   This reflection process promotes critical thinking, which increases higher-order thinking skills.  Learning can be promoted by forming communities that encourage interaction with a variety of perspectives, and by using examples and scenarios to convey meaning with the purpose of increasing knowledge and understanding (Merriam, 2008).

Instructional Design Preparation

Survey statistics obtained from Dan Heffron (2010), a Kforce Government Solutions statistician for the National Center for Education Statistics, indicate that in the last (1998-1999) survey poll, only 22.1% of institutional funds allocated for improvement training were actually used for that purpose.  This percentage represents roughly 215,784 of the 976,400 faculty members with instructional duties (Heffron, 2010).  With the increase in technology tools used for instructional purposes, it is probable that more funding would be allocated for professional development.  This would be an incentive to encourage faculty to advance their pedagogical techniques, while making learning more engaging for the millennial learner.  Survey questions pertaining to professional development were removed from subsequent surveys because data analysis did not reveal significant beneficial evidence (Heffron, 2010).

More learners are enrolling in distance education courses, which increases the need for faculty members with the skills to develop and teach with instructional technology (Orr, Williams, & Pennington, 2009).  This demand for the use of instructional technology also increases faculty workload, which in turn challenges institutions to compensate faculty for course development and teaching tasks (Orr et al., 2009).  With the increase in job tasks, faculty motivation is an important factor to consider when addressing the demands of instructional technology.  This motivational pressure arises because many faculty members begin to feel threatened by their lack of technical expertise.  Another pressure is the lack of institutional support for instructional preparation of content and technology (Orr et al., 2009).

Instructional Design

Gustafson and Branch (1997) state that existing instructional design models are dynamic, allowing new models and processes to evolve and conceptualize the desired outcome of courses or curriculum.  This gives instructional designers opportunities to develop new technology-enhanced pedagogy for instructional purposes (Gustafson & Branch, 1997).  Instructional technology that is designed using a reputable framework results in more positive student perspectives toward independent and self-paced learning (Vernadakis, Giannousi, Derri, Kellis, & Kioumourtzoglou, 2010).

Instructional design methods should take into consideration the various learning styles to stimulate higher order thinking in addition to promoting an interactive learning environment (Hirumi, 2002).  According to Merrill (2002), learning is more effective when the instruction given 1) involves prior experiences, 2) gives a demonstration of the content in practice, 3) allows skills to be practiced by the learner, and 4) merges the newly gained knowledge with real world scenarios. 

Organization of content is a key factor that learners consider important for the development of successful synchronous environments and the support of student learning (Harrington, Staffo, & Wright, 2006).  Reiser (2001a) states that the use of instructional technologies promotes interaction among the instructor, learner, and content.  Instructional activities comprise the following combinations:  1) the content and the learner, 2) the instructor and the learner, and 3) the learners, collectively, as a community (Cochran-Smith & Lytle, 1999; Reiser, 2001a).    According to the California State University Committee for Online Instruction (2009), effective instructional activities are evaluated across six categories: redesign based on student feedback, creative teaching methods with technology, student learning evaluation, design and delivery of instruction, organization and design of online content, and available support and resources for the learner.

Methodology

A self-assessment survey was developed to gain insight into the beliefs and skills of instructional technology professors, instructional designers, and faculty with a technology interest who use instructional technology tools in asynchronous and synchronous environments.  The survey (see Appendix) on instructional technology beliefs and skill sets contained twenty-five items based on a 4-point Likert scale (Strongly Disagree, Disagree, Agree, and Strongly Agree), along with four demographic items.  The four demographic items measured education level, occupation, gender, and years of experience using a learning management system (LMS).  The survey was distributed via email to a global online professional learning community of professors, instructional designers, and a combination of future professors, instructional designers, and other interested individuals as volunteer participants.  There were a total of 52 anonymous survey responses in the initial data set.  After examining the data for errors, one survey response was deleted, leaving a total of 51 usable survey responses, consisting of 31 females and 20 males, in the final data set.  Years of experience using an LMS ranged from 0 to 20.

Participants holding a Master's or Doctorate degree represented 98% of the sample; the remaining 2% held a 4-year college degree.  Based on the responses pertaining to occupation, 21.6% were classified as traditional classroom instructors/professors, 19.6% as distance education instructors/professors, 51% as curriculum instructional designers, and 7.8% as in occupations not related to instructional technology.  Table 1 illustrates the number and percentage of respondents by occupation.  Reliability and validity of the data analysis were determined by Cronbach's alpha, item-to-total correlation, factor analysis, and the standard error of measurement.  In addition, statistical significance was determined by t-test and chi-square.  An alpha level (α) of .05 was used for statistical analysis of this instrument.  The survey was used to answer the following two questions:

1. Is there a difference in instructional design beliefs and skills across genders?
2. Is there a difference in instructional design beliefs and skills across years of experience with use of learning management systems (LMSs)?

Table 1

Frequency of Occupation

Occupation                                    Frequency   Percent   Valid Percent   Cumulative Percent
Traditional Classroom Instructor/Professor        11        21.6        21.6              21.6
Distance Education Instructor/Professor           10        19.6        19.6              41.2
Curriculum Instructional Designer                 26        51.0        51.0              92.2
Not Instructional Technology related               4         7.8         7.8             100.0
Total                                             51       100.0       100.0

Results

The reliability of the instrument was determined with an initial Cronbach's alpha of .917 (Appendix) and item-to-total correlations ranging from .117 to .900.  After removing items #1, 6, and 7, whose item-to-total correlations fell below .300, a revised instrument of 22 items was obtained with a final Cronbach's alpha of .939 (Appendix).  Among the remaining 22 items (Appendix), the lowest item-to-total correlation was .334 and the highest was .923, implying that most items made a substantial contribution to the total instrument.  High item-to-total correlations support the internal consistency reliability of the instrument (Cronk, 1999).  Using the total mean and standard deviation (Appendix), the standard error of measurement was calculated to be 1.98.
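The reliability statistics reported here can be sketched in code. The following is a minimal illustration (not the authors' actual analysis, which was presumably run in a statistics package) of Cronbach's alpha and the standard error of measurement, SEM = SD_total × √(1 − reliability). The item scores and the total-score SD of 8.0 used below are hypothetical, chosen only to show that an alpha of .939 and an SD near 8 yield an SEM close to the reported 1.98:

```python
import math

def cronbach_alpha(items):
    """Cronbach's alpha for item scores given as one list per item
    (each inner list holds one item's scores across all respondents)."""
    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents
    def svar(xs):                       # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_var_sum = sum(svar(col) for col in items)
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / svar(totals))

def sem(sd_total, reliability):
    """Standard error of measurement: SD_total * sqrt(1 - reliability)."""
    return sd_total * math.sqrt(1 - reliability)
```

For example, `sem(8.0, 0.939)` is approximately 1.98, consistent with the value reported for this instrument if the total-score SD was near 8.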

An exploratory factor analysis was conducted on the survey responses using principal component analysis with a varimax rotation.  This factor analysis method was used to extract the maximum variance from the data set while reducing component complexity, and is useful in establishing the construct validity of the underlying dimensions of the variables.  There were a total of 3 factors with initial eigenvalues of 1.00 or greater (Appendix).  The first factor explained 57% of the variance, and all 3 extracted factors together explained 76% of the variance.  The scree plot in Figure 1 verifies the extraction of the 3 factors.  According to Kim and Mueller (1978), a scree plot uses eigenvalues and groups the factors according to the point where the eigenvalues level off and begin to form a straight line.
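The extraction rule used here — retain components whose eigenvalues of the item correlation matrix are at least 1.00 (the Kaiser criterion) — can be sketched as follows. This is an illustrative reconstruction on simulated one-factor data, not the study's data set, and the varimax rotation step is omitted:

```python
import numpy as np

def kaiser_retained(data):
    """Eigenvalues of the item correlation matrix (sorted descending),
    the number of components retained under the Kaiser criterion
    (eigenvalue >= 1), and the proportion of variance each explains."""
    corr = np.corrcoef(data, rowvar=False)           # items in columns
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
    retained = int(np.sum(eigvals >= 1.0))
    explained = eigvals / eigvals.sum()
    return eigvals, retained, explained

# Simulated responses: 100 respondents, 6 items driven by one common factor
rng = np.random.default_rng(42)
factor = rng.normal(size=(100, 1))
data = factor + 0.5 * rng.normal(size=(100, 6))
eigvals, retained, explained = kaiser_retained(data)
```

With strongly correlated items like these, the first eigenvalue dominates and only one component clears the Kaiser cutoff; the study's 3-factor result reflects the same rule applied to its 22-item correlation matrix.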


Figure 1: Scree plot

The 3-factor solution was the best simple structure retained from the principal component analysis.  Table 2 contains the item loadings per factor component; all loadings were .534 or greater in absolute value.  The dimensions identified by each factor are as follows:  I) LMS Proficiency, II) Instructional Design, and III) Multimedia and Distance Education.  Analysis was done using a t-test and chi-square, with an alpha level (α) of .05.

Table 2

Factor Loading Rotated Component Matrix

Item      I       II      III
Q14     .979
Q12     .979
Q16     .957
Q13     .957
Q18     .923
Q17     .905
Q21     .901
Q15     .892
Q11     .814
Q20     .752
Q2      .548
Q10     .534
Q4              .832
Q8              .795
Q5              .781
Q9              .732
Q19                     .640
Q3                     -.605


For question 1, a significant difference in instructional design beliefs and skills was found between genders when the t-test was performed, t(49) = -2.019, p = .049.  A contributing factor could be that the males surveyed generally pursued STEM (Science, Technology, Engineering, and Math) disciplines, giving them a greater understanding of technology and therefore greater proficiency when implementing it in instructional strategies.  This reinforces the need for qualified instructors of both genders who can not only facilitate, but also design and develop curriculum with instructional technology tools for use in on-ground and online courses.  Additional analysis was performed using chi-square to determine whether a difference of opinion was significant between each item and gender.  The chi-square analysis revealed no significant item-level difference of opinion based on gender; both genders hold similar opinions about their instructional design capabilities to deliver distance education.  This can be attributed to the fact that 51% of the participants are in the curriculum instructional design profession and have more experience designing and developing with instructional technology tools.  Given this occupation, it is implied that many of the respondents have an equivalent educational background and/or professional development training.  In regard to instructional design professional development, there may be a need for homogeneous training groups that encourage same-gender participants to engage more willingly in the learning community by reducing the possibility of gender inferiority within a heterogeneous group.  Gender inferiority could be a contributing factor in the negative perception of the value of instructional design training.
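The independent-samples t-test reported above compares mean scale scores across the two gender groups; with 31 females and 20 males, the degrees of freedom are 31 + 20 − 2 = 49, matching the reported t(49). A pooled-variance version can be sketched as follows (the group scores in the usage note are hypothetical, not the study's data):

```python
import math

def independent_t(a, b):
    """Student's t statistic (pooled variance) and degrees of freedom
    for two independent samples given as lists of scores."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    se = math.sqrt(sp2 * (1 / na + 1 / nb))         # standard error of the difference
    return (ma - mb) / se, na + nb - 2
```

For example, `independent_t(female_scores, male_scores)` returns the t statistic and df; the p-value would then come from the t distribution with those degrees of freedom (e.g., via `scipy.stats.t.sf`).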

For question 2, there was no significant difference in instructional design beliefs and skills across years of experience with learning management systems (LMSs) when a t-test analysis was performed, t(49) = .447, p = .657.  A contributing factor could be that most respondents had an equivalent understanding of LMSs and how they can be used to deliver instructional methods that enhance learning.  Another factor could be that the respondents had more confidence in their instructional design capabilities with LMSs.  The chi-square results did indicate a significant difference of opinion, based on the hypothesized demographic, on whether teaching traditional (on-campus) courses requires more preparation time than teaching distance education courses.  Interestingly, responses based on years of LMS experience differed significantly on whether courses developed using a modeling framework could transition from a traditional (on-campus) course into a distance education course, with p = .036, which is below α.  A reason for this difference could be that these respondents gained their skills from hands-on experience using an instructional design model.  Responses in the 0-6 years of LMS experience category were 12 out of 30 (40%) in disagreement and 18 out of 30 (60%) in agreement.  Another reason could be that these respondents have more in-depth theoretical exposure to instructional design for asynchronous teaching than the respondents in the 7-20 years category.
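The chi-square comparison described here can be set up from the reported counts (0-6 years: 12 disagree, 18 agree; 7-20 years: 16 disagree, 5 agree) as a 2 × 2 test of independence. The sketch below computes only the Pearson statistic and degrees of freedom; the p-value would come from the chi-square distribution with 1 df, and the paper's reported p = .036 may reflect a different item categorization than this simple 2 × 2 layout:

```python
def chi_square(table):
    """Pearson chi-square statistic and degrees of freedom for an
    r x c contingency table given as a list of rows of observed counts."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    grand = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_tot[i] * col_tot[j] / grand  # under independence
            stat += (obs - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Rows: LMS experience 0-6 years, 7-20 years; columns: disagree, agree
stat, df = chi_square([[12, 18], [16, 5]])
```

With these counts the statistic exceeds 3.84, the .05 critical value for 1 degree of freedom, so the disagreement pattern between the two experience groups is significant.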

Responses in the 7-20 years of LMS experience category were 16 out of 21 (76%) in disagreement and 5 out of 21 (24%) in agreement.  From a contrary perspective, these respondents could be biased against learning new technology.  They could also believe that instructional design models are perplexing for any form of curriculum design and thus do not merit the effort to learn without some form of compensation or employee performance benefit.  Therefore, for respondents with limited experience using instructional design to develop asynchronous environments, the use of instructional design models does not increase their belief that those models are effective when transitioning a traditional on-campus course into a distance education course.  The respondents in this category have more years of experience using an LMS, yet their experience may have been gained informally, without any instructional design training.  It is important to note that respondents could have had limited knowledge of the instructional design concepts used in the instrument, along with the perspective that instructional design applies only to the online environment.  As a result, participants could have inadvertently misinterpreted some of the instrument items.

From these results, it can be concluded that respondents have some differences of opinion on their instructional design capabilities based on gender and years of experience using an LMS.   This reiterates the need for skilled instructors (i.e., with personal and professional instructional knowledge) who can not only facilitate, but also design and develop curriculum with instructional technology tools for use in both asynchronous and synchronous environments.  According to Yarbrough (2001), content and "lived" (Cochran-Smith & Lytle, 1999) experience with the subject matter are the determining factors for the success of both technology-enhanced and traditional methods of instruction.  Therefore, more effective student outcomes result from instructors with more experience and knowledge of the subject matter, in addition to experience using technology for educational purposes (Yarbrough, 2001).  Whether a traditional instructional design model or another model is used, the goal of instructional design for distance education classes should be to enrich teaching for improved student learning.

Conclusion

The participant responses indicated a significant gender difference in opinion on instructional design capabilities.  To provide quality instruction, all instructors, regardless of gender, should be able to design and develop curriculum for face-to-face and online classes.  Quality instructional design preparation is paramount for providing learners with quality content that contributes to an effective learning experience.

Future studies could more closely examine 1) instructor knowledge and beliefs in using other instructional technologies, such as simulations and virtual worlds, between traditional classroom instructors and distance education instructors, and 2) the difference of opinion on instructional design skill sets using technology tools between experienced instructors and novice instructors.

Appendix

Section 1: Based on your knowledge of using instructional technologies, please answer the following (response scale: Strongly Disagree, Disagree, Agree, Strongly Agree):

  1. Teaching traditional (on-campus) courses are more likely to require more time preparation than teaching distance education courses.
  2. The skills required to teach distance education courses are different than those required to teach traditional (on-campus) courses.
  3. Distance education provides just as effective learning as traditional (on-campus) education.
  4. In the next 5 years, the need for instructional designers will increase as colleges and universities implement more distance education programs.
  5. In the next 5 years, the need for instructional designer training will increase as colleges and universities implement more distance education programs.
  6. Traditional (on-campus) professors can transition their courses using instructional design methodologies into online curriculum with limited instructional technology skills.
  7. Instructional technology skills to design and develop courses in Learning Management Systems (LMSs) are a skill set that all professors should obtain.
  8. Instructional design training would better prepare professors for using the basic ADDIE (Analyze, Design, Develop, Implement, & Evaluate) model to develop instructional technology based courses using Learning Management Systems (LMSs).
  9. If effectively developed using a modeling framework such as the ADDIE model, any college course can be transitioned from a traditional (on-campus) course into a distance education course.
  10. In order for professors to be successful at designing and developing distance education courses using instructional technologies, he/she must have a desire to acquire the necessary skills in order to develop effective courses.
  11. I can create a high quality syllabus within a Learning Management System.
  12. I can post announcements that will display upon the learner entering a LMS.
  13. I can generate emails using the email feature within a LMS.
  14. I can create a discussion forum within a Learning Management System (LMS).
  15. I can create a learning assessment (i.e., quiz) within a Learning Management System (LMS).
  16. I can load course materials, such as, PowerPoint slides and/or lecture notes within a Learning Management System (LMS).
  17. I can setup the grade book feature to display learner progression within a Learning Management System (LMS).
  18. I can set due dates for assignments that automatically restrict late submission within a Learning Management System (LMS).
  19. I can create multimedia tutorials with tools (such as Camtasia, Captivate, etc.) within a Learning Management System (LMS).
  20. I can load multimedia tutorials (such as Camtasia, Captivate, etc.) within a Learning Management System (LMS).
  21. I can create learning modules for each course topic within a Learning Management System (LMS).

Section 2:  Please answer each of the demographic related items below.

  1. What is the highest level of education you have completed?
    1. Some College
    2. 2-year College Degree
    3. 4-year College Degree
    4. Master's Degree
    5. Doctoral Degree
  2. Please indicate your occupation.
    1. Traditional Classroom Instructor/Professor
    2. Distance Education Instructor/Professor
    3. Curriculum Instructional Designer
    4. Not Instructional Technology related
  3. Please indicate the number of years you have developed and implemented curriculum using a Learning Management System (enter numbers only; for example, if two years is your answer, you would enter "2" without the quotation marks).   ________
  4. Select your gender:  o Male           o Female



References

California State University Committee for Online Instruction (2009). Rubric for online instruction. Retrieved from https://www.csuchico.edu/eoi/documents/rubricpdf

Cifuentes, L., Davis, T., & Clark, S. (1996, April). From sages to guides: A professional development study.  Paper presented at the Annual Meeting of the American Educational Research Association.

Cochran-Smith, M. & Lytle, S. L. (1999). Relationships of knowledge and practice: Teacher learning in communities. Review of Research in Education, 24, 249-305.

Darling-Hammond, L. (2010). Teacher education and the American future. Journal of Teacher Education, 61(35). doi: 10.1177/0022487109348024

Freeman, D. (2002). The hidden side of the work: Teacher knowledge and learning to teach. Language Teaching, 35, 1-13.

Gibbs, G., & Coffey, M. (2004). The impact of training of university teachers on their teaching skills, their approach to teaching and the approach to learning of their students. Active Learning in Higher Education, 5, 87–100.

Gustafson, K. L. & Branch, R. M. (1997). Revisioning models of instructional development. Educational Technology Research and Development, 45(3), 73-89.

Harrington, T., Staffo, M., & Wright, V. H. (2006). Faculty uses of and attitudes toward a course management system in improving instruction. Journal of Interactive Online Learning, 5(2), 178-190.

Heffron, D. (2010). Telephone interview with Kforce Government Solutions statistician for the National Center for Education Statistics, Dan Heffron / Interviewed by author.  Retrieved from http://nces.ed.gov/

Hein, G. E. (1996). Constructivist learning theory.  Retrieved from http://www.exploratorium.edu/IFI/resources/constructivistlearning.html

Hirumi, A. (2002). The design and sequencing of e-learning interactions: A grounded approach. International Journal on E-Learning, 19-27.

Lieberman, A. & Mace, D. H. P. (2008). Teacher learning: The key to educational reform. Journal of Teacher Education 59(226). doi: 10.1177/0022487108317020

Merriam, S. (2008). Adult learning theory for the twenty-first century. New Directions for Adult and Continuing Education, 119, 93-98. doi: 10.1002/ace

Merrill, M. D. (2002). First principles of instruction. Educational Technology Research & Development, 50(3), 43-59.

Modla, V. B., & Wake, D. G. (2007). Modeling active engagement and technology integration: Learning to teach literacy skills and strategies. Journal on Excellence in College Teaching, 18(1), 123-146.

Montgomery, D. C. (2010). A modern framework for achieving enterprise excellence. International Journal of Lean Six Sigma, 1(1), 56-65. doi: 10.1108/20401461011033167

Orr, R., Williams, M. R., & Pennington, K. (2009). Institutional effort to support faculty in online teaching. Innovation in Higher Education, 34, 257-268. doi:  10.1007/s10755-009-9111-6

Polly, D. & Hannafin, M. J. (2010). Reexamining technology’s role in learner-centered professional development. Educational Technology Research & Development, 58, 557-571. doi: 10.1007/s11423-009-9146-5

Reiser, R. (2001a).  A history of instructional design and technology: Part I: A history of instructional media. Educational Technology Research and Development, 49(1), 53-64.

Reiser, R. (2001b).  A history of instructional design and technology: Part II: A history of instructional design. Educational Technology Research and Development, 49(2), 57-67.

Ruey, S. (2009). A case study of constructivist instructional strategies for adult online learning. British Journal of Educational Technology, 41(5), 706-720.

Shiue, Y. (2007). Investigating the sources of teachers’ instructional technology use through the decomposed theory of planned behavior. Journal of Educational Computing Research, 36(4), 425-453.

Snyder, M. (2009). Instructional-Design theory to guide the creation of online learning communities for adults. TechTrends, 53(1), 45-57.

Stes, A., Coertjens, L., & Van Petegem, P. (2010). Instructional development for teachers in higher education: Impact on teaching approach. Higher Education, 60(2), 187-204.

Vernadakis, N., Giannousi, M., Derri, V., Kellis, I., & Kioumourtzoglou, E. (2010). Athens 2004 team leaders’ attitudes toward the educational multimedia application “Leonidas”. Educational Technology & Society, 13(1), 208-219.

Yang, S. C. & Liu, S. F. (2004).  Case study of online workshop for the professional development of teachers.  Computers in Human Behavior, 20, 733–761.

Yarbrough, D. N. (2001). A comparative analysis of student satisfaction and learning in a computer-assisted environment versus a lecture environment. Journal on Excellence in College Teaching, 12(1), 129-147.


Online Journal of Distance Learning Administration, Volume XX, Number 3, Fall 2017
University of West Georgia, Distance Education Center