Quality Matters: Collaborative Program Planning at a State Level



Kay Shattuck, D.Ed.
Director of Distance Learning Programs
Blackboard Administrator
Carroll Community College
Westminster MD
410-386-8419
kshattuck@carrollcc.edu

 

Abstract

Quality in online distance education has been a hot topic in the popular, professional, and academic literature during the past decade. This article describes an innovative response to the problem of assuring quality in sharable online courses encountered by MarylandOnline, a statewide consortium of 19 Maryland community colleges and four-year institutions. Phases of collaborative program development of the Quality Matters program, a replicable system of peer-review for quality assurance and continuous improvement for online courses, are described.

Introduction

The story of Quality Matters (QM) begins in the fall of 2002, when a small community of Maryland distance educators informally problem-solved to assure quality in the online distance learning courses that they shared and to prepare for anticipated questions from future regional accreditation team visits. What evolved over the next four years was a sophisticated, faculty-centered course review and improvement system that was embraced by MarylandOnline, funded by the U.S. Department of Education's Fund for the Improvement of Postsecondary Education (FIPSE) for further development, awakened and coalesced distance education talents from around the state, evolved with unimaginable innovative twists, was met with enthusiastic interest by colleges and universities inside and outside of Maryland, and was established as a nationally recognized subscription service under the auspices of MarylandOnline. This article describes the collaborative program development phases of QM, informed by the concepts of community of practice and distributed leadership.

A Community of Practice

Traditions of formal and informal collaboration are key considerations for understanding the development of the Quality Matters (QM) program. Indeed, the collaborative program planning that resulted in QM involved practitioners in real work situations (a community of practice) who made program decisions by collaborating, negotiating, pooling talents, and learning from experience as they proceeded to find solutions to a specific work-related problem (Cervero & Wilson, 1994a; Lave & Wenger, 1991; Boud & Walker, 1990; Schön, 1983; Cousin & Deepwell, 2005; Hung, Seng, Hedberg, & Seng, 2005; Guldberg & Pilkington, 2006).

Formal collaborative traditions.

In 1997, the Maryland Council of Community College Presidents and the Maryland Higher Education Commission (MHEC) established the Maryland Community Colleges Teleconsortium (MCCT) to provide students throughout the state with increased access to quality distance education opportunities through a system of course sharing. MCCT member presidents signed a memorandum of agreement that allowed students from the adopting (home) community college to register and pay in-county tuition for shared courses, which would appear on the students’ home college grade reports and transcripts.

In 1999, MCCT partnered with University of Maryland University College (UMUC) to create MarylandOnline (MOL), thereby opening membership to Maryland’s four-year colleges and universities. In 2001, MOL was re-formed as an independent organization, and a board of directors was created by MOL college presidents. MOL’s Board of Directors subsumed MCCT and renamed it the Distance Learning Initiatives Committee (DLIC).

Informal collaborative traditions.

Members of DLIC (formerly MCCT) had a strong history of action orientation (Cervero & Wilson, 1994b) and an entrepreneurial culture that relied on informal and formal collaboration (Hanna, 2003). Many of the DLIC members had been sharing ideas and information as early as 1994 through association with College of the Air. As hands-on practitioners (some faculty, some program coordinators), they worked with course design, student support, and teaching and faculty development issues, in addition to technology, management, and administrative issues.

Development of the Quality Matters Program

When reviewing the development of the Quality Matters program, four distinct phases emerge. Those phases relate well to Donaldson and Kozoll’s (1999) four developmental stages of collaborative relationships – emergence, evolution, implementation, and transformation.

Emergence: June 2002 – September 2003

What would become Quality Matters was launched from a seed idea among members of DLIC as they informally problem-solved quality assurance for shared online courses and anticipated future regional accreditation teams’ questions. Past initiatives from the group had resulted in the Faculty Online Technology Training Consortium (FOTTC) project from 1999-2001 and Project Synergy from 2001-2002. FOTTC was organized to provide training to those new to teaching in an online distance learning environment and resulted in the development of instructional modules. Project Synergy set up faculty teams from member community colleges to identify sharable learning objects within specific disciplines. Both projects had been funded by the Maryland Higher Education Commission (MHEC).

With these traditions of collaboration and successful statewide program development in place, Jurgen Hilke, director of distance learning at Frederick Community College, proposed developing a course review and improvement system for the shared MOL online courses. Course reviews, he suggested, would be done by faculty experienced in teaching online, thereby encouraging meaningful feedback on course improvement strategies.

Peer reviews were done at Frederick Community College, Carroll Community College, and Chesapeake College during the fall of 2002. These reviews used a checklist produced from the FOTTC materials. But from the beginning, this was seen as a course review and improvement process, not just a list of best practices; therefore, a review guide, an instructor’s worksheet, and a step-by-step guide to the whole process were developed.

Each review team consisted of the chair from the home college and two other members from the DLIC group. It was decided that at least one team member must be a subject-matter expert in the discipline from which the reviewed course was chosen and that all team members must have current online teaching experience. The instructor/course designer was considered part of the team and received the other members’ feedback. The reviews focused on the instructional quality of a particular course. Indeed, the voluntary peer review and course improvement process did not interfere with the faculty evaluation processes of individual member colleges, which were outside of the inter-institutional partnership.

In early spring 2003, Mary Wells, then Director of Distance Learning at Prince George’s Community College, led an effort for the MOL Board of Directors to prepare a funding proposal to the U.S. Department of Education's Fund for the Improvement of Postsecondary Education (FIPSE) for further development of the process. The proposal, Creating a Pathway to Credible Inter-Institutional Quality Assurance in Online Learning, became known as Quality Matters: Inter-Institutional Quality Assurance in Online Learning.

Evolution: September 2003 – June 2004

Phase Two was launched with a three-year, $509,177 grant from FIPSE. Funding allowed further development and formalization of a peer review process for course improvement (see http://www.qualitymatters.org/FIPSE.htm). The formation of the Project Management Team (PMT) and three major committees, as well as the development of the rubric and the implementation and dissemination processes, are described here.

Formation of the project management team and committees.

The benefits of having an existing community of practice were realized when MOL got the green light from FIPSE. Mary Wells (representing MOL community colleges) and Christina Sax (representing MOL four-year colleges and universities) were appointed by the MOL Board of Directors as co-directors to head up the Project Management Team (PMT). Each MOL member institution appointed administrative representatives who served as the primary link between the project and MOL institutions.

The PMT organized three committees from the cadre of Maryland distance education practitioners to develop and implement the Quality Matters program:

  1. The Tool Set Committee was responsible for developing a set of standards and protocols for a peer course review. The committee was responsible for review of nationally accepted standards of good practice in online distance education, creation of a rubric to be used by the peer reviewers, and development of other documents that would facilitate the peer review/course improvement system. The Tool Set Committee’s work was completed first and turned over to the Process Committee.
  2. The Process Committee was responsible for creating the procedures necessary to carry out the various course review activities. The major focus was on course and peer reviewer selection and certification processes. This committee was also responsible for determining how the Quality Matters activities would be extended beyond the FIPSE grant funding period. The Process Committee communicated their work to the Training Committee.
  3. The Training Committee was responsible for developing and implementing the training curriculum for peer course reviewers and faculty developers. This committee served as a resource and provided instructional design services to faculty/course developer participants in the QM process.

Development of the course review rubric.

The rubric used to guide peer reviewers in the course review is a central component of the QM process. Its development can be marked in five transitions: FOTTC (pre-FIPSE); pilot project (pre-FIPSE pilot); 2004 QM rubric; 2005 QM rubric; and 2006 QM rubric.

  1. The FOTTC (pre-FIPSE) project built training for teaching in online distance learning on a checklist adapted from Chickering and Ehrmann’s (1996) Implementing the Seven Principles: Technology as Lever and the American Council on Education’s (ACE) Guiding Principles for Distance Learning in a Learning Society (1996). Chickering and Ehrmann had considered the application of communication technologies to Chickering and Gamson’s seminal 1987 Seven Principles for Good Practice in Undergraduate Education (based on more than 50 years of educational research primarily focused on the role of the teacher), while the ACE guidelines presented a broader systems approach to distance education (Moore & Kearsley, 2005) by including learning design, learner support, organizational commitment, learning outcomes, and instructional technology. The nine 2000 FOTTC best practices included:
    1. encourage student cooperation
    2. active learning
    3. respect diverse talents and ways of learning
    4. prompt feedback
    5. emphasize time on task
    6. encourage student-faculty contact and interaction
    7. communicate high expectations
    8. let students know what to expect
    9. effective use of technology
  2. The (pre-FIPSE) pilot project used a checklist Hilke adapted from ACE’s 2001 Distance Learning Evaluation Guide. The six 2002 pilot principles included:
    1. learning objectives and outcomes
    2. learning design
    3. course information
    4. learning materials
    5. technology
    6. learner support
    Importantly, the notion of process was added during this phase of development to what can be described as “stand-alone” best practices. A Step-by-Step Review Process was put on paper, along with a Review Guide, an Instructor Worksheet, and a Team Review Questionnaire. The questionnaire introduced scoring, as each of the 56 questions was given a yes/no check. A two-fold goal was explained to the peer reviewers: first, specific feedback for course improvement was to be given to the instructor, and second, specific feedback was to be given toward improvement of the review questionnaire and process. Thus, the process was grounded in formative evaluation (Inglis, Ling, & Joosten, 1999), whereby the whole process of peer review was continually improved.
  3. FIPSE funding provided the opportunity to refocus on development of a set of standards that would guide a course review by peer reviewers. The result was the 2004 rubric, which served as a model for future iterations. The Tool Set Committee identified and consulted national standards of best practices in distance education. In late October 2003, the members of the Tool Set Committee met and reviewed eleven sets of national standards, including:
    1. Maryland Online (MOL) Course Sharing Initiative: Peer Review of Distance Learning Courses (the pre-FIPSE pilot program) checklist;
    2. Maryland’s Faculty Online Technology Training Consortium (FOTTC) checklist;
    3. American Council on Education (ACE) Distance Learning Principles;
    4. American Distance Education Consortium Guiding Principles for Distance Learning;
    5. Best Practices National Education Association/Institute for Higher Education Policy (NEA/IHEP) Quality Benchmarks;
    6. Southern Regional Educational Board Electronic Campus Principles of Good Practice and Criteria for Evaluating Online Courses;
    7. Southern Regional Educational Board Criteria for Evaluating Online Courses;
    8. Middle States Commission on Higher Education’s Interregional Guidelines for Electronically Offered Degree and Certificate Programs;
    9. Michigan Community College Virtual Learning Collaborative Online Course Development Guidelines and Rubric;
    10. Seven Principles for Good Practice (Chickering & Gamson, 1987) and Implementing the Seven Principles: Technology as Lever (Chickering & Ehrmann, 1996);
    11. Innovations in Distance Education (1995 Faculty Initiative listing/project at Penn State University).

As a result, eight general review standards were identified:

  1. Course Overview and Introduction: The overall design of the course, navigational information, as well as course, instructor and student information are made transparent to the student at the beginning of the course.
  2. Learning Objectives (Competencies): Learning objectives are clearly defined and explained. They assist the student to focus learning activities.
  3. Assessment and Measurement: Assessment strategies use established ways to measure effective learning, assess student progress by reference to stated learning objectives, and are designed as essential to the learning process.
  4. Learning Resources and Materials: Instructional materials are sufficiently comprehensive to achieve announced objectives and learning outcomes and are prepared by qualified persons competent in their field (Materials, other than standard textbooks produced by recognized publishers, are prepared by the instructors or distance educators skilled in preparing materials for distance learning.)
  5. Learner Interaction: The effective design of instructor-student interaction, meaningful student cooperation, and student-content interaction is essential to student motivation, intellectual commitment and personal development.
  6. Course Technology: To enhance student learning, course technology enriches instruction and fosters student interactivity.
  7. Learner Support: Courses are effectively supported for students through fully accessible modes of delivery, resources, and student support.
  8. Accessibility: The course is accessible to all students.

(The 2005 version is in the public domain at http://www.qualitymatters.org/Documents/Matrix%20of%20Research%20Standards%20FY0506.pdf.) Specific review standards were identified for each general review standard to guide the course review.

Importantly, the group had resolved early on that any set of standards used in creating a rubric would also be supported by existing distance education research. Thus, the academic literature was reviewed for research that supported distance education course design and teaching strategies. The first Quality Matters matrix of review standards was produced in early January 2004. Research support was presented for seven of the eight general review standards. ADA compliance was included as an awareness item for designers/teachers, although it was acknowledged that some ADA compliance issues reside at an institutional Information Technology (IT) level.

Each specific review standard was weighted as “essential” (worth 3 points), “necessary” (worth 2 points), or “important” (worth 1 point) in assuring high quality in the course under review. It was determined that, out of a possible 80 points, 68 (85 percent) would be required to merit the Quality Matters designation of a well-developed online distance learning course. Content validity for the design of the scoring system was provided by the committee members’ combined 70+ years of experience in teaching, designing, and managing distance learning programs. Indeed, content validity of the rubric was further supported by feedback during peer reviewer training sessions, by debriefings at the conclusion of peer reviews, and by feedback from Maryland Distance Learning Association conference workshop attendees. The rubric was piloted and refined for clarity during the first round of course reviews in the spring of 2004.

Development of implementation and dissemination processes.

From the beginning, Quality Matters was envisioned to be more than a rubric; it was seen as a framework to facilitate inter-institutional cooperation and training for peer review of online courses. Indeed, once the initial rubric was developed by the Tool Set Committee, the Process and Training Committees began their work of developing a replicable system of peer review for the improvement of online courses.

Criteria for course selection and for potential peer reviewers were developed by the Process Committee. All members of the committee had been very active in the FOTTC project and Project Synergy. Content validity for the development of the procedural components of the QM process was provided by the members’ combined 80+ years of experience in distance education.

The Committee worked closely with Kay Kane, the one paid staff member for the QM project. The PMT had been keen enough to realize that, while the QM project grew out of, and remained largely, a voluntary effort, some dedicated staffing was crucial. Kane coordinated the complex logistics of working with multiple institutions and faculty, established the www.qualitymatters.org website, and converted the rubric to an interactive online format (including annotations for each specific review standard). Programming for the online rubric allowed individual peer reviewers to input scores and comments, which were automatically compiled into a team report.
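To make the compilation step concrete, the following minimal sketch (in Python) merges individual reviewer scores and comments into a single team report. The data structures, field names, and standard numbers are illustrative assumptions, not a description of the actual QM application:

```python
from collections import defaultdict

# Hypothetical reviewer submissions: a score and optional comment per
# specific review standard. IDs and fields are assumed for illustration.
reviews = [
    {"reviewer": "Reviewer A", "scores": {"1.1": 3, "2.1": 0},
     "comments": {"2.1": "Objectives are not measurable."}},
    {"reviewer": "Reviewer B", "scores": {"1.1": 3, "2.1": 3},
     "comments": {}},
    {"reviewer": "Reviewer C", "scores": {"1.1": 3, "2.1": 0},
     "comments": {"2.1": "Restate objectives as outcomes."}},
]

def compile_team_report(reviews):
    """Merge each reviewer's scores and comments into one team report."""
    report = defaultdict(lambda: {"scores": [], "comments": []})
    for review in reviews:
        for standard, score in review["scores"].items():
            report[standard]["scores"].append(score)
        for standard, comment in review["comments"].items():
            report[standard]["comments"].append(f"{review['reviewer']}: {comment}")
    return dict(report)

for standard, entry in sorted(compile_team_report(reviews).items()):
    print(standard, entry["scores"], entry["comments"])
```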

The Training Committee was responsible for developing dynamic training to prepare peer reviewers to conduct reviews using the rubric. All committee members had been very active in FOTTC, Project Synergy, and OnlineEducators. As a group they had 50+ years of experience working in distance learning.

As a result of the work of the PMT and the committees, the following model of the Quality Matters process was developed:

Quality Matters is a continuous improvement model for assessing and assuring the quality of online courses. The process begins with a mature course (one that has been taught for at least two semesters) and a rubric developed from national standards and supported by existing distance education research literature. A three-member review team is organized. All team members must be actively teaching online, one member must be a content expert in the course subject, and all must complete QM peer reviewer training. The faculty/designer of the course to be reviewed is also considered part of the review team. The review is guided by the QM rubric, and points are assigned. Comments and suggestions are encouraged on all standards and are required when an essential standard (those worth 3 points) is not met. The faculty/course designer (not administrators or program directors) receives a copy of the compiled team report. In the event that a course does not meet the required 85 percent (68 of 80 points, including all fourteen 3-point essential specific standards), the problem areas can be improved immediately by the instructor/designer, or a course designer might be enlisted to assist. The expectation is that all courses under QM review meet standards either the first time reviewed or after necessary design improvements. Once the course meets QM standards, the Academic Dean of the college is informed that the reviewed course met QM standards, and the course may display the QM logo.
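The scoring rule embedded in this model can be illustrated with a short sketch. The function below checks both conditions for meeting QM standards as described above: at least 68 of the 80 possible points and all fourteen 3-point essential standards met. Names are hypothetical, and the split between 2-point and 1-point standards in the example is assumed only so the weights sum to the 80-point total:

```python
PASSING_SCORE = 68    # 85 percent of the 80 possible rubric points
ESSENTIAL_WEIGHT = 3  # weight that marks an "essential" specific standard

def meets_qm_standards(standards):
    """standards: list of (weight, met) pairs, one per specific standard.

    A course meets QM standards only if it earns at least 68 of 80 points
    AND every 3-point essential standard is met.
    """
    total = sum(weight for weight, met in standards if met)
    essentials_met = all(met for weight, met in standards
                         if weight == ESSENTIAL_WEIGHT)
    return total >= PASSING_SCORE and essentials_met

# One unmet essential standard fails the review even though the point
# total (77) exceeds the 68-point threshold.
example = [(3, True)] * 13 + [(3, False)] + [(2, True)] * 10 + [(1, True)] * 18
print(meets_qm_standards(example))  # False
```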

Implementation: June 2004 – March 2005

Full implementation of the QM process launched the third phase of program development. The first phase had evolved when a small community of practice initiated course reviews of sharable online courses. The second phase had evolved with the award of a three-year FIPSE grant and relied on an expanded community of Maryland distance learning practitioners to fine-tune and formalize a replicable peer review process. The third phase was marked by excitement and energy as QM grew exponentially, both within and, notably, outside of Maryland. This was a crucial stage in the program development process and could easily have turned into chaos.

When applying Donaldson and Kozoll’s principles of collaborative program development to QM, it is possible to see that during the first phase, a situationally unique community of practice, with shared historical reference points, worked together to provide an innovative response to the problem of assuring quality in their sharable online distance courses. During the second phase of development, leadership and vision emerged as the PMT paced their peers through the complexities of working inter-institutionally to develop a sophisticated system of peer review.

Members of the PMT, practitioners and part of the Maryland community of practice themselves, were in the delicate position of encouraging continued collaboration while keeping focused on an increasingly sophisticated program. Guiding a program from a community of practitioners to a project funded by federal dollars required finesse. This was skillfully done by members of the PMT as they negotiated interests (Cervero & Wilson, 1994b) and employed a distributed leadership stance of “holding on and letting go” (MacBeath, 2005, p. 354). Jurgen Hilke reported that the most important thing he did was “to let go” (personal communication, April 30, 2007); Mary Wells noted that, “it was amazing, but at just the right point the right person would show up with a timely solution and volunteer to take it on” (personal communication, May 3, 2007). Members of the PMT strategically enabled refinement and extension of the process while maintaining focus on the commitment to the project funded by FIPSE.

In 2004, the PMT called for action research proposals for studies to address the impact of QM on student learning. Eight student impact research projects were funded and revealed positive impacts. It was also during this phase of program development that members of the PMT began presenting at national conferences on the QM process. Colleges, universities, and professional distance education partnerships outside of Maryland began approaching Quality Matters about implementing or adopting the QM process for their own use.

Transformation: March 2005 – July 2006

A focus on program continuation marked the fourth phase of program development for Quality Matters. Previous years had resulted in successful program development, implementation, and continual improvement; but, at this point, the complex program was possible only because of FIPSE funding. Impressive results supported the continuation of the project. By the end of FIPSE funding, 103 courses had been reviewed, and 709 peer reviewers from 158 institutions in 28 states had been trained. Data collected revealed a positive professional development impact for both peer reviewers and faculty course developers, over and above a positive impact on the reviewed courses.

Interest in the QM process continued to come from all directions – from institutions across the country, from continuing education programs, from accreditation boards – and it became obvious that what had been developed was in demand. The previous year the PMT had been asked numerous times about the possibility of extending the process to hybrid courses. As the FIPSE grant was drawing to a close, the MOL Board of Directors investigated the possibility of establishing a non-profit QM program. The result was the establishment of Quality Matters as a subscription service supported by MOL.

Discussion

The Quality Matters project evolved in four years from a seed idea among a community of practitioners to a sophisticated, replicable program of peer review and course improvement for the assurance of quality in online learning. The seed idea came from members of a small group of distance educators in Maryland community colleges who had developed a culture of engagement and continuous, coordinated problem solving around issues arising from course sharing.

The development of QM fell into the four phases described above, which relate well to what Donaldson and Kozoll identified as the four developmental stages of collaborative relationships – emergence, evolution, implementation, and transformation. The concept of emergence, informed by community of practice and collaborative program development, has been used to describe the initial problem-solving collaboration of the DLIC members. The concept of evolution includes direction setting, maintenance, and growth as key activities. The evolution phase of QM began with funding from FIPSE. The PMT generated excitement and helped establish a dynamic learning organization (Morgan, 1997). Donaldson and Kozoll’s notion of implementation describes Phase Three. They noted that during this phase not only is the program “offered,” but the work expands to include relationship building with external partners and stakeholders, and operational agreements (p. 21). During the implementation phase of QM, the materials and processes developed in the evolution phase were put into practice. The transformation phase of QM occurred in response to the energy created by the program and to sustainability issues.

Based on a spirit of community of practice and traditions of sharing ideas, leadership emerged as distance educators in Maryland surfaced as conveners, champions, and strategy makers (Donaldson & Kozoll). These roles were not restricted to an individual but flowed fluidly throughout the development of the program. The convener and champion roles were shared by the community of practice members of DLIC and later with members of the PMT. The strategy maker role describes the work of the distance learning practitioners who made up the PMT. But throughout the project, the PMT acted in a facilitator role among peers, employing distributed leadership by understanding when to “hold on and [when to] let go” (MacBeath, p. 354).

Future Research

The program developed by MOL to address issues of quality assurance in sharable online courses grew from the seed idea of a small community of practice and retained its original focus of being grounded in nationally accepted standards of good practice and supported by existing distance education literature. Three directions for future research can be identified: (1) student learning, (2) professional development, and (3) validation of the rubric standards. Student learning research might include further study of student learning and retention in courses that have been reviewed, especially in areas that required redesign to meet QM standards. Professional development research might examine the impact on traditional classroom instruction after an instructor participates in the QM process, as well as attitudes toward, and the development of, a team approach to designing courses. Future research on the course improvement/review rubric should include additional validation of the essential standards (the 3-point standards).

Conclusion

This article describes the collaborative program development that evolved when a community of distance educators in Maryland problem-solved to assure quality in their sharable online distance learning courses. Development of the program was further assisted by the Project Management Team’s application of a distributed leadership perspective. Funding by FIPSE allowed for development and dissemination of a sophisticated quality improvement/review process, as well as a faculty-centered professional development program. The QM program continues as a not-for-profit subscription service (www.qualitymatters.org).


References

Argyris, C. (1993). Education for leading-learning. Organizational Dynamics, 21, 5-17.

Argyris, C. & Schön, D. A. (1978). Organizational learning: A theory of action perspective. Reading MA: Addison-Wesley.

Cervero, R. M. & Wilson, A. L. (1994a). The politics of responsibility: A theory of program planning practice for adult education. Adult Education Quarterly, 45(1), 249-268.

Cervero, R. M. & Wilson, A. L. (1994b). Planning responsibly for adult education: A guide to negotiating power and interests. San Francisco: Jossey-Bass Publishers.

Chickering, A. W. & Ehrmann, S. C. (1996). Implementing the seven principles: Technology as lever. Available at http://www.aahe.org/Bulletin/sevenprinciples1987.htm

Chickering, A. W. & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. American Association for Higher Education Bulletin, 39(7), 3-7.

Cousin, G. & Deepwell, F. (2005). Designs for network learning: A community of practice perspective. Studies in Higher Education, 30(1), 57-66.

Donaldson, J. F. & Kozoll, C. E. (1999). Collaborative program planning: Principles, practices, and strategies. Malabar, FL: Krieger Publishing Company.

Guldberg, K. & Pilkington, R. (2006). A community of practice approach to the development of non-traditional learners through networked learning. Journal of Computer Assisted Learning, 22, 159-171.

Hanna, D. E. (2003). Organizational models in higher education, past and future. In M. G. Moore & W. G. Anderson (Eds.), Handbook of distance education (pp. 67-78). Mahwah, NJ: Lawrence Erlbaum Associates.

Huber, G. (1991). Organizational learning: The contributing processes and the literature. Organization Science, 2(1), 88-115.

Inglis, A., Ling, P., & Joosten, V. (1999). Delivering digitally: Managing the transition to the knowledge media. London: Kogan Page.

Marsick, V. J. (1988). Learning in the workplace: The case for reflectivity and critical reflectivity. Adult Education Quarterly, 38(4), 187-198.

Marsick, V. J. & Watkins, K. E. (1992). Continuous learning in the workplace. The reflective practitioner, 9-11.

MacBeath, J. (2005). Leadership as distributed: a matter of practice. School Leadership and Management, 25(4), 349-366.

McConnell, D. (2002). Action research and distributed problem-based learning in continuing professional education. Distance Education, 23(1), 59-83.

Moore, M. G. & Kearsley, G. (2005). Distance education: A systems view (2nd ed.). Belmont, CA: Wadsworth Publishing Company.

Morgan, G. (1997). Images of organization (2nd ed.). Thousand Oaks, CA: Sage Publications.

Rayner, S. & Gunter, H. (2005). Rethinking leadership: perspectives on remodeling practice. Educational Review, 57(2), 151-161.

Saba, F. (2003). Distance education theory, methodology, and epistemology: A pragmatic paradigm. In M.G. Moore & W. G. Anderson (Eds.), Handbook of distance education (pp. 3-20). Mahwah, NJ: Lawrence Erlbaum Associates, Publishers.

Schön, D. A. (1990). Educating the reflective practitioner: Toward a new design for teaching and learning in the professions. San Francisco: Jossey-Bass Publishers.

Schön, D. A. (1983). The reflective practitioner: How professionals think in action. New York: Basic Books.

Timperley, H. S. (2005). Distributed leadership: developing theory from practice. Journal of Curriculum Studies, 37(4), 395-420.

Zhu, E. & Baylen, D. M. (2005). From learning community to community learning: pedagogy, technology and interactivity. Educational Media International, 42(3), 251-268.


Online Journal of Distance Learning Administration, Volume X, Number III, Fall 2007
University of West Georgia, Distance Education Center