An Organizational Development Framework for Assessing Readiness and Capacity for Expanding Online Education

Anthony A. Piña
Sullivan University


In this article, a popular model for organizational development is utilized as a framework for assessing the organizational readiness and capacity of educational institutions whose leaders wish to establish or expand their online/distance education programs. Examples of institutionalization factors to consider and alternative models for assessing readiness and capacity are also provided.


Online/distance education is widely recognized as a way for higher education institutions to increase their student enrollment and graduation levels, and to provide flexibility for non-traditional students, whose employment, health, family or other circumstances are not conducive to a traditional academic schedule (Clinefelter & Aslanian, 2015). Throughout the new millennium, online education has been the fastest growing segment of higher education, even during recent years of historic declines in overall U.S. higher education enrollments (Allen, Seaman, Poulin, & Straut, 2016). A recent survey of chief academic officers at 2,807 colleges and universities indicated that more than 70% consider online education to be critical to their institution’s long-term strategy, compared to less than 50% in 2002 (Allen & Seaman, 2015). The National Center for Education Statistics reports that over 70% of degree-granting institutions offer distance education--the number increasing to 95% for institutions with 5,000 students or more (U.S. Department of Education, 2016).

Online Development Not Always Successful

While the growth of online education enrollments has slowed in the past few years, its importance for the future of colleges and universities has continued to increase, with many institutions establishing new online programs or scaling up their existing programs (Allen, Seaman, Poulin, & Straut, 2016; Clinefelter & Aslanian, 2015). However, not all online education initiatives have been successful. Highly publicized attempts to establish virtual campuses by the University of Illinois and University of California systems never gained traction and eventually failed--even after millions of dollars were spent (Derousseau, 2015; Kolowich, 2009).

Ashford University, which saw online enrollments skyrocket from less than 11,000 to 74,000 in a mere four years, had ten times as many admissions and recruitment staff as it had for advisement and retention. Of the institution’s 2,100 faculty, only seven were full-time. As a result, retention of its online students was less than 50% (WASC, 2012). Ashford’s initial bid for accreditation with the Western Association for Schools and Colleges was denied. Although Ashford was able to achieve accreditation in a subsequent evaluation by WASC, it was only after bringing in new academic leadership, hiring dozens of full-time faculty and academic advisors, and undergoing a complete restructuring of its organization, staffing and operations, costing more than ten million dollars (Fain, 2013).

The WASC visiting team had noted that Ashford’s previous leadership had failed to properly assess its capability and infrastructure for successful expansion of its online education program. Had a systematic and complete internal capacity and readiness assessment been performed, time and money could have been saved and many layoffs could have been avoided (WASC, 2012).

Assessing Organizational Readiness and Capacity

Educational leaders seeking to establish or expand their institution’s online education must look beyond simply adopting a learning management system (e.g. Blackboard, Canvas, Desire2Learn or Moodle) and having faculty create and teach online courses. Online education is systemic, involving strategic planning and the participation of multiple areas of the institution (Piña, Lowell & Harris, 2017; Shelton & Saltsman, 2005). Successful online programs require systematic organizational development and change, beginning with an assessment of the organization’s internal capacity and readiness to establish or expand its online education program.

A widely-referenced model for assessing organizations is the “Comprehensive Model for Diagnosing Organizational Systems” formulated by Thomas Cummings and Christopher Worley. The model is based on principles and theories of organizational development and change and considers organizations in terms of their inputs (i.e., the external environment), transformations or design components (i.e., the internal environment), and outputs (i.e., organizational results) (Cummings & Worley, 2015). In this article, the Comprehensive Model for Diagnosing Organizational Systems will be utilized as a framework for assessing higher education institutional capacity and readiness for establishing or expanding online education.

Establishing and Diffusing Online Learning

Successful online learning programs do not just happen; they are established, developed, grown and, at some point, institutionalized (Piña, 2016). At most colleges and universities, online courses are initiated by faculty seeking to meet the needs of their particular learners. These innovators gain the attention of other faculty, who also begin to offer their own online courses. At some point the program captures the attention of the administration, which codifies the online program with policies, procedures, resources and operations. This follows the diffusion of innovations model popularized by Everett Rogers (Rogers, 2003). In Rogers’ model, illustrated in Figure 1 below, innovations are formulated by innovators, who persuade early adopters to join in the innovation. Formal and informal communication channels, time and social systems (including opinion leaders, organizational mandates, media, and government regulations) help the innovation to spread until it becomes widely accepted by the early majority, the late majority and, ultimately, those most resistant to the innovation (the laggards).

Figure 1: Diffusion of Innovations Model (Rogers, 2003).

Institutionalization and Assessment

Adopting and implementing an innovation, while essential, does not, by itself, make that innovation a normal, regular and enduring part of the organization. Many innovations are adopted and implemented, only to be discarded later (Piña, 2008). The key to an innovation’s endurance is institutionalization (Surry & Ely, 2002). In a research study to identify the factors leading to successful institutionalization of distance learning in higher education, 30 different factors were identified and subsequently validated by 170 online education professionals from colleges and universities nationwide as indicators that a distance learning program has been successfully institutionalized and can be considered a strong one (Piña, 2008; 2016). The 30 factors are listed and defined in Table 2 below.

For institutions seeking to establish their initial online programs or seeking to expand their current online programs, it is necessary to determine whether the institution has the capacity and readiness for online learning success. The field of organizational development can provide a solution to meet this need.

Table 2: Topic Areas, Institutionalization Factors and Application Items (Piña, 2016)

Organizational Development

Organizational Development (OD) has been defined as “a system-wide process of data collection, diagnosis, action planning, intervention and evaluation aimed at: 1) enhancing congruence among organizational structure, process, strategy, people and culture; 2) developing new and creative organizational solutions and 3) developing the organization’s self-renewing capacity” (Beer, 1980, p. 1). Cummings and Worley (2015) posit that OD is different from other fields concerned with improvement and change in organizations, such as project management, operations management and management consultation.

“OD and change management both address the effective implementation of planned change. They are both concerned with the sequence of activities, the processes and the leadership that produce organizational improvements. They differ, however, in their underlying value orientation. OD’s behavioral science foundation supports values of human potential, participation and development in addition to performance and competitive advantage. Change management focuses more narrowly on values of cost, quality, and schedule. As a result, OD’s distinguishing feature is its concern with the transfer of knowledge and skill so that the organization is more able to manage change in the future” (Cummings and Worley, 2015, p. 4).

The general OD framework for planned organizational change is based on four basic activities: 1) Entering and Contracting, in which the OD professional establishes whether a situation exists that may warrant organizational development and change; 2) Diagnosing, in which the client system is studied and analyzed; 3) Planning and Implementation, where the interventions and other methods of change are formulated and applied; and 4) Evaluation and Institutionalization, where the change is evaluated for success and made permanent (Cummings & Worley, 2015). The framework is illustrated in Figure 2 below.

Figure 2: General OD Framework of Planned Change (Cummings & Worley, 2015)

A Model for Assessment

For the purposes of assessing an organization’s readiness and capacity for adoption or expansion of online learning, this article draws primarily from the OD activity of diagnosing.
“In this stage of planned change, the client system is carefully studied. Diagnosis can focus on understanding organizational problems, including their causes and consequences, or on collecting stories about the organization’s positive attributes. The diagnostic process is one of the most important activities in OD. It includes choosing an appropriate model for understanding the organization and gathering, analyzing and feeding back information to managers about the problems or opportunities that exist” (Cummings & Worley, 2015, p. 29).

The diagnostic/assessment model developed by Cummings and Worley is based on principles and theories of organizational development and change and shares commonalities with other popular OD models, such as Kotter’s Organizational Dynamics (Kotter, 2012), Galbraith’s Star Model (Galbraith, 1995), Nadler and Tushman’s Congruence Model (Nadler & Tushman, 1997) and Weisbord’s Six Box Model (Weisbord, 1978). The model is designated as the “Comprehensive Model for Diagnosing Organizational Systems” (Cummings & Worley, 2015, p. 94), shown in Figure 3 below.

Figure 3: Comprehensive Model for Diagnosing Organizational Systems (Cummings & Worley, 2015)

The OD-based diagnostic model is divided into three broad categories: inputs (the external environment), design components (the internal environment) and outputs (organizational results).

Adapting the OD Model

Just as there are multiple models for classifying higher education institutions--including public, private, for-profit, two-year, four-year, graduate, doctoral, brick-and-mortar, virtual, etc.--there are multiple ways to establish or expand online learning for different institutions (Miller, Benke, Chaloux, Ragan, Schroeder, Smutz & Swan, 2014; Piña, Lowell & Harris, 2017). Online programs vary widely in their structure, organization, mission and functions (Shelton & Saltsman, 2005). While OD is generally thought to be a process for improving organizations in business and industry, Cummings and Worley (2015) have demonstrated its successful use in health care, public schools and public-sector organizations. Below is an adaptation of the Comprehensive Model for Diagnosing Organizational Systems specifically for inputs, design components and outputs relevant to the assessment of capacity and readiness for establishing or expanding online learning at a college or university.

Inputs/External Environment

Design Components/Internal Environment


Outputs/Organizational Results

Outputs for higher education institutions adopting or expanding online learning include numbers of enrolled students, transferability of credits, student retention rates, graduation rates, employment of graduates, student satisfaction and accreditation (Shattuck, 2014).

Alternative Solutions

The OD-based Comprehensive Model for Diagnosing Organizational Systems adapted above is broad in scope, but it is not the only option for assessing organizational capacity and readiness for online learning. Over the years, professionals from inside and outside academe have proposed a variety of alternative assessment models and methods.

An early model for assessing organizational readiness for e-learning is provided by Haney (2002) and is based upon a framework and survey to identify organizational goals, needs, motivators, resources and constraints. This is done via a series of seven checklists with ten questions per checklist. The checklists are categorized as 1) Human Resources, which deals primarily with the management of employee records and processes; 2) Learning Management System, which is concerned primarily with the tracking and enrollment capabilities of the system; 3) Learners, which establishes how employees access the system; 4) Content, which covers the technicalities of how the system delivers, scores, prints and manages content; 5) Information Technology, which deals with the technical and infrastructure requirements to run the program; 6) Finance, which focuses on institutional costs of e-learning and 7) Vendor, which includes vendor services, support and quality. Advantages of Haney’s model include its consideration of the organization from multiple perspectives and the fact that the 70 survey items tend to be heavily oriented toward the learner’s experience. However, the model emphasizes technical course-related items throughout the seven checklists and, in the process, excludes important non-course items, such as the external environment (inputs), organization, strategy and culture. As a tool for institutional assessment of capacity and readiness for online learning, Haney’s model, while very useful, ends up being less comprehensive than the OD-based model.
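As an illustration, responses to Haney-style checklists could be tallied into per-category readiness scores. This is a minimal sketch only: the seven category names follow the article, but the True/False scoring scheme and the function name are assumptions made for illustration, not part of Haney’s actual instrument.

```python
# Hypothetical sketch: tallying "yes" answers on seven readiness
# checklists of ten questions each (70 items total). The scoring
# scheme is an illustrative assumption, not Haney's method.

CATEGORIES = [
    "Human Resources", "Learning Management System", "Learners",
    "Content", "Information Technology", "Finance", "Vendor",
]

def readiness_summary(responses):
    """responses maps each category to a list of 10 True/False answers.
    Returns the share of 'yes' answers per category (0.0 to 1.0)."""
    summary = {}
    for category in CATEGORIES:
        answers = responses[category]
        if len(answers) != 10:
            raise ValueError(f"{category}: expected 10 answers")
        summary[category] = sum(answers) / 10
    return summary
```

A low score in one category (say, Finance) would flag that area for closer review before launching or expanding a program.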

Bandiru and Jones (2012) propose a framework for executing distance education programs based on principles of project management, which they define as “the process of managing, allocating, and timing resources to achieve a given goal in an efficient and expeditious manner” (pp. 160-161). The framework includes a step-by-step implementation process: 1) Identification, the process of identifying the need for distance education; 2) Definition, which involves identifying goals and strategies for distance education; 3) Planning, encompassing program expectations, quality control strategies, organizational communication and the selection and delivery of technology-based instruction; 4) Organizing, where the responsibilities and tasks of individuals involved with distance education are determined and mapped; 5) Resource Allocation, the process in which people, funding, materials, equipment, facilities, and services are allocated; 6) Task Scheduling, in which the various activities (e.g. classes, resources, instructors, and bandwidth) are scheduled and deployed, typically using a software-based system; 7) Tracking, which “entails reporting and auditing the performance of the DE program to ensure that DE operations are aligned with the plans, goals, and institution standards” (p. 159); 8) Control, where decision makers review the gathered data and make decisions regarding the success of the program and 9) Phase-out, where actions are taken to improve the program. In terms of the total number of activities undertaken, the project management-based model goes farther than the OD-based model, as the former is both a diagnostic and an implementation model, while the latter is exclusively a diagnostic model, relying upon additional OD-based models for designing interventions and managing change (Cummings & Worley, 2015).
As a diagnostic tool, the project management-based model provides less guidance and is less robust than the OD-based model, although Bandiru and Jones do include a useful case study as an example of how the model can be implemented.

The e-Learning Maturity Model (Marshall, 2011) applies the continuous improvement paradigm of capability maturity models (Paulk, Curtis, Chrissis & Weber, 1993) to an online learning context. The model considers 35 different processes that are assessed within five categories: 1) Learning, which includes processes that directly impact the pedagogical aspects of e-learning; 2) Development, which includes processes surrounding the creation and maintenance of e-learning resources; 3) Support, which includes processes surrounding the support and operational management of e-learning; 4) Evaluation, which includes processes surrounding the evaluation and quality control of e-learning through its entire lifecycle; and 5) Organization, which includes processes associated with institutional planning and management (Marshall, 2011, p. 68). While the e-Learning Maturity Model does not include all of the processes that occur within the OD-based model (e.g. inputs, outputs, strategy and culture), it is a true diagnostic model and is a viable and comprehensive alternative to the OD-based model.

Another notable example for assessing online learning readiness comes from Aydin and Tasci (2005), who developed a survey instrument designed to offer a simpler approach appropriate for programs in developing countries. The survey covers four areas of evaluation: technology, innovation, people, and self-development. Its greatest strength is in assessing the capability and readiness of the people who will be developing, supporting and utilizing the distance learning program. While this could be an effective way to assess institutional capacity and readiness for institutions with access to few resources, it does not measure up well compared to the other models highlighted in this section.

Recommendations and Implementation

There are many worthy models available for assessing the capacity and readiness of an institution to establish or expand a program of online learning. In comparison to other established models and solutions, the OD-based Comprehensive Model for Diagnosing Organizational Systems developed by Cummings and Worley appears to provide a more robust, viable and inclusive assessment solution and is the model recommended by this report.

The model is best implemented via a series of interviews with representatives from key stakeholder groups, including top academic/administrative leadership, support staff, faculty and students. Qualitative data are gathered and analyzed to determine common trends and to identify whether there is a shared vision for online learning among the stakeholder groups or whether interpersonal or group intervention processes are necessary to achieve consistency and cooperation (Cummings & Worley, 2015).
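The trend analysis described above can be sketched as a simple tally of coded interview themes per stakeholder group. This is an illustrative assumption about one possible coding workflow, not a method prescribed by Cummings and Worley; the group names and themes below are hypothetical.

```python
# Hypothetical sketch: counting how often each coded theme is raised by
# each stakeholder group, as a rough first check on whether a shared
# vision for online learning exists across groups.
from collections import Counter

def theme_counts(interview_notes):
    """interview_notes: list of (group, theme) pairs coded from transcripts.
    Returns a per-group Counter of theme mentions."""
    by_group = {}
    for group, theme in interview_notes:
        by_group.setdefault(group, Counter())[theme] += 1
    return by_group

# Illustrative coded data (not from any actual assessment)
notes = [
    ("leadership", "grow enrollment"), ("faculty", "course quality"),
    ("faculty", "grow enrollment"), ("students", "flexibility"),
]
counts = theme_counts(notes)
```

Themes that appear across all groups suggest a shared vision; themes raised by only one group may signal the need for the intervention processes the model describes.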

Possible outcomes of the assessment of institutional capacity and readiness for online learning are:

Implementation of the assessment findings will be determined according to the institutional mission, the goals and priorities of the institution’s leadership and whether the overall culture of the institution will support the development or expansion of online learning (Miller, Benke, Chaloux, Ragan, Schroeder, Smutz & Swan, 2014). For institutions whose assessment concludes that they do not have the internal capacity to develop a new online education program or effectively scale their existing programs, alternative solutions do exist. One solution is to enhance the internal capacity through investments in personnel, infrastructure or technology. If, for example, the decision is made to venture anew into online learning, there will be a need to establish a process for the design and development of online courses, which will include hiring instructional designers skilled in online course development and delivery. A college or university that wishes to expand its operations and offerings to include fully online programs for students outside its geographical area will need to invest in an infrastructure of student services and technical support to accommodate the unique needs of learners who may never set foot on campus.

An alternative to performing all operations, services and support completely in-house is to outsource one or more functions to a third-party e-learning vendor, such as Learning House or Pearson Embanet, or to a local consulting or solutions firm, such as an instructional design group or I.T. solutions provider. Several colleges and universities use third-party vendors to, in essence, run their online education programs, as the vendors handle program and course development, marketing, student recruiting, technical support and reporting. The typical model for this type of arrangement involves revenue sharing, with the vendor receiving a percentage of tuition and/or fees paid by the students. An advantage of this approach is a relatively quick start-up time, as the vendor supplies the personnel, courses, materials and expertise needed to establish and operate the program.
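The revenue-sharing arithmetic described above can be sketched as follows. The enrollment count, tuition figure and 50% vendor share are purely illustrative assumptions, not data from any actual contract; real vendor shares vary by agreement.

```python
# Hypothetical sketch of a tuition revenue-sharing arrangement with a
# full-service online program vendor. All figures are illustrative.

def revenue_split(enrollments, tuition_per_student, vendor_share):
    """Return gross tuition and the vendor/institution split."""
    gross = enrollments * tuition_per_student
    vendor = gross * vendor_share
    return {"gross": gross, "vendor": vendor, "institution": gross - vendor}

split = revenue_split(enrollments=500, tuition_per_student=9_000,
                      vendor_share=0.50)
# Under these assumptions, the institution retains $2.25M of $4.5M gross.
```

Seen this way, the trade-off is explicit: the institution gives up a substantial share of tuition in exchange for the vendor-supplied start-up capacity described above.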

Leaders and faculty at a higher education institution may be wary of turning over so much control (and revenue) to third-party vendors and may have concerns that their online programs might be merely copies of programs that the vendors are providing for other colleges and universities (Riter, 2017). There may also be the fear that if the partnership with the full-service vendor were ever to be discontinued, the vendor could “pack up the program and go,” leaving the institution and its learners without any program at all.

A less invasive alternative is to use a third-party vendor to provide a more limited number of services in areas for which the institution does not have sufficient capacity, such as providing instructional design support for course development, after-hours helpdesk/technical support, admission and retention services or data analytics reporting. This hybrid approach (i.e. mostly in-house with outsourcing only in areas that the institution lacks capacity) has the advantage of giving the institution the time to build up its internal capacity, while it can still offer its own online programs—rather than someone else’s—to its learners.

All indications are that online learning will continue to increase in ubiquity across higher education. Colleges and universities that do not currently offer online courses will likely do so in the not-too-distant future. Institutions that now offer only individual online courses will likely expand later to full online certificate, diploma and degree programs. In an era of declining higher education enrollments and long-established colleges having to shut their doors, most colleges and universities are looking for strategies to increase the size of their student bodies. By using one of the models presented in this report to perform a systematic assessment of its internal capacity and readiness to establish or expand online learning, an institution may be able to avoid costly mistakes.


I wish to thank Dr. Shalom Charles Malka, who inspired the original project that became this article, and Dr. Bruce Harris, who facilitated the application of this framework in a real-world setting.



Allen, I. E., & Seaman, J. (2015). Grade level: Tracking online education in the United States. Babson, MA: Babson Survey Research Group and Quahog Research Group, LLC.

Allen, I. E., Seaman, J., Poulin, R., & Straut, T. T. (2016). Online report card: Tracking online education in the United States. Babson Park, MA: Babson Survey Research Group and Quahog Research Group, LLC.

Aydin, C. H., & Tasci, D. (2005). Measuring readiness for e-learning: Reflections from an emerging country. Educational Technology & Society 8(4), 244-257.

Bandiru, A. B., & Jones, R. R. (2012). Project management for executing distance education programs. Journal of Professional Issues in Engineering Education & Practice 158, 154-162.

Beer, M. (1980). Organization change and development: A systems view. Santa Monica, CA: Goodyear.

Clinefelter, D. L., & Aslanian, C. B. (2015). Online college students 2015: Comprehensive data on demands and preferences. Louisville, KY: The Learning House, Inc.

Cummings, T., & Worley, C. (2015). Organization development & change (10th ed.). Stamford, CT: Cengage Learning.

Derousseau, R. (2015). California’s multimillion-dollar online education flop is another blow for MOOCs. The Hechinger Report. Retrieved from:

Fain, P. (2013). If at first you don't succeed. Inside Higher Education. Retrieved from

Galbraith, J. R. (1995). Designing organizations: An executive briefing on strategy, structure, and process. New York, NY: Jossey-Bass.

Haney, D. (2002). Assessing organizational readiness for e-learning: 70 questions to ask. Performance Improvement 41(4), 10-15.

Kolowich, S. (2009). What doomed global campus? Inside Higher Ed. Retrieved from

Kotter, J. (2012). Leading change. Cambridge, MA: Harvard Business Review Press.

Marshall, S. (2011). Improving the quality of e-learning: Lessons from the eMM. Journal of Computer Assisted Learning 28, 65–78.

Miller, G., Benke, M., Chaloux, B., Ragan, L. C., Schroeder, R., Smutz, W., & Swan K. (2014). Leading the e-learning transformation of higher education. Sterling, VA: Stylus.

Nadler, D.A., & Tushman, M.L. (1997). Competing by design: The power of organizational architecture. New York: Oxford University Press.

Paulk M.C., Curtis B., Chrissis M.B. & Weber C.V. (1993). Capability maturity model, version 1.1. IEEE Software 10, 18–27.

Piña, A. A. (2008). How institutionalized is distance learning? A study of institutional role, locale and academic level. Online Journal of Distance Learning Administration 11(1), 1-15.

Piña, A. A. (2016). Institutionalization of distance learning in higher education. In A. A. Piña, & J. B. Huett (Eds.) Beyond the online course: Leadership perspectives on e-learning. Charlotte, NC: Information Age Publishing.

Piña, A. A., Lowell, V. L., & Harris, B. R. (2017). Leading and managing e-learning: What the e-learning leader needs to know. New York, NY: Springer.

Riter, P. (2017). Five myths about online program management. EDUCAUSE Review 3 (March 17). Retrieved from

Rogers, E. M. (2003). Diffusion of innovations. New York, NY: Simon & Schuster.

Shattuck, K. (2014). Assuring quality in online education: Practices and processes at the teaching, resource, and program levels. Sterling, VA: Stylus.

Shelton, K., & Saltsman, G. (2005). An administrator’s guide to online education. Charlotte, NC: Information Age Publishing.

Surry, D. W., & Ely, D. P. (2002). Adoption, diffusion, implementation, and institutionalization of educational technology. In Reiser, R. A., & Dempsey, J. V. (Eds.), Trends and issues in instructional design and technology. Upper Saddle River, NJ: Merrill/Prentice Hall. 

U.S. Department of Education (2016). IPEDS data center. Washington, DC: National Center for Education Statistics. Retrieved from

WASC (2012). Report of the WASC pathway B visit team to Ashford University. Alameda, CA: Senior College and University Commission, Western Association for Colleges and Schools.

Weisbord, M. (1978). Organizational diagnosis: Six places to look for trouble with or without a theory. Journal of Group and Organizational Management, 1(4), 430-447.  

Online Journal of Distance Learning Administration, Volume XX, Number 3, Fall 2017
University of West Georgia, Distance Education Center