From Bricks To Clicks: Building Quality K-12 Online Classes Through An Innovative Course Review Project

Kim C. Huett, Ed.S.
University of West Georgia

Jason B. Huett, Ph.D.
University of West Georgia

Ravic Ringlaben, Ed.D. 
University of West Georgia


Using an explanatory mixed methods design, this study applies the National Standards of Quality for Online Courses to measure the extent to which teachers in a blended middle school and a fully online supplemental high school are designing quality online learning environments for students. As part of fully online graduate coursework, graduate student reviewers were trained to conduct course reviews of blended or fully online courses created by teachers in one of two secondary schools in a single Georgia local education agency. In teams, the graduate student reviewers reported the synthesis of their findings and recommendations to the teacher designers and their respective administrators.


The first K-12 virtual schools were created in Canada in 1995 (Barbour & Reeves, 2009). Two years later, virtual schooling began in the United States, with the creation of the Virtual High School and the Florida Virtual School. In the 15 years between the inception of North American virtual schooling and the time of this study, the virtual school movement expanded to all nine Canadian provinces (Barbour & Stewart, 2008) and to all fifty states in the United States (Watson, Murin, Vashaw, Gemin, & Rapp, 2010).

As the field of K-12 online learning evolves, researchers (Barbour & Reeves, 2009; Picciano & Seaman, 2007; Watson, Murin, Vashaw, Gemin, & Rapp, 2010) have noted that it is difficult to accurately describe the shifting landscape due to inconsistent or nonexistent policies across states as well as disagreement over definitions of terms that describe distance and online learning at this level. Compounding this at the national level, the No Child Left Behind Act requires states to offer alternative schooling options to students attending schools that fail to make adequate yearly progress (AYP), which has forced many schools to consider alternatives to traditional site-based learning. With student populations increasing faster than new facilities can be built, combined with teacher shortages, budget cuts, increased competition from state-sponsored virtual schools and online charter schools, and rapid advances in mobile technologies, even school districts that currently make AYP are turning to online learning in record numbers. This is causing considerable debate about each local education agency's (LEA's) institutional mission as well as the future of education.

One area that remains particularly understudied is the single district program, or programs run by an individual school district or local education agency (LEA). In a 2009 report, Watson and Gemin wrote:

Online programs run by a single district, for students in that district, represent an emerging category of online learning activity. Limited data are available for district programs, but existing data points and anecdotal evidence suggest that the number of district programs is growing rapidly. These programs often combine supplemental online courses and blended (online and face-to-face) learning opportunities; some include a full-time online school option as well. (Watson & Gemin, 2009)
This study seeks to contribute to the knowledge base on single district programs by documenting the current results of an ongoing blended and online course quality project involving graduate students, referred to as graduate student reviewers, at a Georgia university and teacher designers of online and blended instruction at a local Georgia education agency (LEA).

Summary of Prior Literature

The Need for Research

One of the hallmarks of a new field is the lack of research to study, support, and guide its growth (Cavanaugh, Barbour, & Clark, 2009; Rice, 2006). In efforts to consolidate what is known about phenomena in the field of K-12 online learning, a number of literature reviews have been published that offer detailed suggestions for needed research in the field (Barbour & Reeves, 2009; Cavanaugh, Barbour, & Clark, 2009; Rice, 2006). However, much of the research cited to inform decision-making in K-12 online learning comes from the field of adult distance education (Barbour & Reeves, 2009; Cavanaugh, Barbour, & Clark, 2009; Means et al., 2009; Rice, 2006). Rice (2006) warns that "care must be given when generalizing adult research to the K-12 student population . . . younger students need to be provided guidance in developing characteristics of successful distance students" (p. 440). Studies conducted in adult distance learning suggest that students who exhibit greater autonomy and responsibility are more successful distance learners (Cavanaugh et al., 2004). Whether the same holds for younger students is questionable.

Growth in the field of K-12 blended and online learning is outpacing research. In a 2007 SLOAN survey of school district administrators in the United States, Picciano and Seaman found that nearly two-thirds of all districts surveyed have students enrolled in blended or online courses, and one-fifth of administrators indicated the intention of introducing such courses in coming years. Barbour and Reeves (2009) identify five main benefits cited across the literature to justify virtual schooling: “expanding educational access, providing high-quality learning opportunities, improving student outcomes and skills, allowing for educational choice, and achieving administrative efficiency” (p. 413). While cautioning against broadly applying the findings to a K-12 audience, a recent meta-analysis suggests that blended and online learning may offer advantages over traditional face-to-face learning (Means, Toyama, Murphy, Bakia, & Jones, 2009). The reasons for the advantage seem to have less to do with the medium itself and more to do with a combination of differences in "time spent, curriculum, and pedagogy" (p. xviii). For some, online and blended formats may allow teachers to employ pedagogical alternatives to behaviorist teaching practices (Blomeyer, 2002; Means et al., 2009). For others, the push to put courses in online and blended formats stems from fear that traditional schools may lose per-pupil funding dollars to online schools that are becoming more of a choice in various parts of North America (Stone, 2008).

It seems clear that K-12 online learning is often not well understood by teachers or policymakers, and this dearth of understanding can result in the misapplication of existing, traditional education policies to online programs (Rice, 2006). Additionally, quality indicators used to measure the success of online programs are often similar to those used with conventional K-12 programs, including academic performance, seat-time, time-on-task, retention, academic achievement, and satisfaction (Ronsisvalle & Watkins, 2005). Much more research is needed, as it would appear that traditional approaches to the new medium tend to be confusing and marginally effective.

One area in which researchers (Barbour & Reeves, 2009) have recommended more study is that of online course design. According to Oliver, Kellogg, Townsend, and Brady (2010), “Many virtual schools are involved in their own course development efforts, yet little research is available detailing the needs of online course designers, particularly nontraditional designers such as K-12 teachers” (p. 73). Often, where teachers are expected not only to teach an online course but also to design it, they are limited in terms of time and resources (Kranch, 2008). In her suggestions for future research, Rice (2006) recommends that researchers "develop valid and reliable tools for identifying interactive qualities in course design and instruction" (p. 442).

The Need for Professional Development

Parallel to the need for more research on building and assessing quality online courses is the need for professional development to train pre-service and in-service teachers in the design and delivery of blended and online courses (Annetta & Shymansky, 2008). According to The Sloan Consortium, K-12 online learning in the United States showed growth of 47% between 2007 and 2008 (Picciano & Seaman, 2009). Christensen, Horn, and Johnson (2008) predict that as the world flips to student-centered instruction “…by 2019, about 50 percent of high school courses will be delivered online” (p. 98). Combine this with the current focus on teacher quality and its impact on student achievement, and the online delivery medium adds a completely new dimension to the issue of training quality teachers. Moreover, there is little agreement about standards for teacher training in online environments, and there is currently no nationally recognized endorsement or certificate for online teaching.

Most professional development in this area is occurring locally at schools on an “as needed” basis, with only a small percentage of online teachers receiving training at the university level (Dawley, Rice, & Hinck, 2010). According to Lee and Hirumi (2004), the surest way to frustrate potential online educators is to not provide them with adequate training and assistance. Such a lack of professional development directly impacts course quality as well: “If an institution asks educators to teach online, but does not provide adequate training and support, the quality of online course materials and delivery may be jeopardized” (p. 536). This in turn impacts student outcomes.

The National Education Association (NEA) recognizes this challenge and acknowledges in its 2010-11 Handbook’s Policy Statement Regarding Distance Education, Section A, 7(b), that:

Although licensure in the subject matter being taught is a necessary condition for any teacher, it is not a sufficient condition for a teacher involved in distance education. Teachers who provide distance education should in addition be skilled in learning theories, technologies, and teaching pedagogies appropriate for the online environment. Moreover, because of the rapidly changing technology, these skills should be continuously updated through professional development. (p. 407)
Researchers (Davis, Roblyer, Charania, Ferdig, Harms, Compton, & Cho, 2007; Rice, 2006) are calling for updated pre-service and in-service training to ensure teachers’ fitness to teach in online and blended modes, and there are movements underway to develop models and programs for training online educators. For example, through the federally-funded Teacher Education Goes Into Virtual Schooling Project (TEGIVS), researchers have developed a model teacher educator preparation program through which pre-service teachers gain experiences in the roles of facilitator, teacher, and designer.  The curriculum developed through this project is currently being used in several universities in the United States.

Another initiative (VHS, 2011) to establish a standard of excellence for blended and online teachers is the five-course Certificate in Online Teaching and Learning offered through a partnership between Virtual High School and Plymouth State University.  Additionally, the states of Georgia and Idaho have authorized “online teaching” endorsements which in-service teachers may obtain through professional development and university study (Dawley, Rice, & Hinck, 2010; Georgia Professional Standards Commission, 2010). Unfortunately, as laudable as these efforts might be, they are not enough to keep up with growing demand for professional development in this area. Response to this rapidly evolving delivery medium has been slow and “there is a need for a wide-scale call to integrate online teaching requirements into teacher development across all levels, and to explore new models of collaborative teacher professional development” (Dawley, Rice, & Hinck, 2010, p. 11). Clearly, as LEAs grow their blended and online offerings, the need for pre-service and in-service teachers to have professional development in these areas becomes paramount.  This study attempts to address this need through the development of a collaborative course review partnership model to create a quality control and feedback system for K-12 courses.

 Research Questions

The project under study was born of the dual needs of administrators from a Georgia LEA and a university professor of online graduate coursework at a regional university in Georgia. In project design meetings, the administrators of both LEA schools expressed the need for feedback from external reviewers on the quality of the online courses provided to students. In the same meetings, the professor expressed the need to provide graduate students with more authentic K-12 online professional development experiences as the students completed a graduate course of study in Instructional Technology.
The study sought to answer two research questions:

1. To what extent are teachers in the blended middle school designing quality online learning environments for students as measured by the National Standards of Quality for Online Courses instrument?
2. To what extent are teachers in the online supplemental high school program designing quality online learning environments for students as measured by the National Standards of Quality for Online Courses instrument?



Method

This study employs an explanatory mixed methods design. Mixed methods design draws from “both quantitative and/or qualitative data in a single study in which the data are collected concurrently or sequentially, are given a priority, and involve the integration of the data at one or more stages in the process of research” (Creswell, Clark, Gutmann, & Hanson, 2003, p. 212). In explanatory mixed methods research, quantitative data are collected first, and these data provide the main focus of the results (Mertler & Charles, 2011); the qualitative data are collected to support the explanation of the quantitative data.


The project was directed by university faculty during two semesters: data were collected during a 16-week spring semester, followed by a 6-week summer semester. In the spring semester, graduate student enrollments in a single 3-credit graduate course enabled the reviewing of seven online or blended courses being used by LEA teachers and students during the 2009-2010 school year. A minimum of three students reviewed each course. In the summer semester, enrollments allowed for the reviewing of four online or blended courses from the same school year.


The online and blended courses reviewed by the graduate student reviewers were designed by teacher designers working in two Georgia schools from a single local education agency (LEA). In 2009, the LEA had a total student population of 39,782 (Georgia Department of Education, 2007). The two schools under study used the Angel learning management system (LMS) to deliver online and blended instruction, but each school followed a different model of online instruction.

Students attending the online supplemental high school enroll in courses in addition to the courses they attend at one of the LEA’s traditional high schools. The online high school serves students in grades nine through twelve, and is not degree-granting, making it necessary for students to be enrolled in another degree-granting school in the LEA. In other words, students at the online supplemental high school were dual-enrolled.
Unlike the online high school, the blended middle school is a traditional face-to-face school serving students in grades six through eight. Students follow the 180-day school calendar and attend in person. The nature of the blended learning is not the focus of this study, although it merits some description for the sake of clarity. Being a recent innovation in this middle school, the implementation of blended learning varied by teacher, as observed by one of the researchers.  Some teachers would start and end the class with students working in the Angel learning management system in a computer lab setting, while others, working in their traditional classrooms, might begin the class using face-to-face interactions and eventually move to having students work through instructional experiences using laptops at their desks. 

At the blended middle school, the schedule was designed so all students experienced at least one class period of blended learning each school day.  For example, on Tuesdays, all sixth grade language arts classes would be taught using computers: students would learn through teacher-designed experiences in the Angel learning management system during the class period.  In rare cases, some courses offered at the blended middle school employed blended learning on a daily basis, such as the agriculture science classes.  In most cases, however, students had the equivalent of one day of blended learning each week.
On publicly viewable web pages, each school expressed a rationale for the use of blended and online learning. According to the website describing the online high school, students could use the online supplemental coursework to overcome scheduling conflicts, recover credits for courses not completed, graduate more quickly and with honors, and participate in coursework despite being hospitalized or homebound (Henry County Schools, 2009, para. 1). At the blended middle school, the benefits of blended learning include using “technology as a means of organization, communication, and creativity to close the feedback loop, monitor and track student progress toward mastery targets, and to engage students in meaningful standards-based learning projects” and providing “…students and teachers ubiquitous access to technology tools that are clearly linked to their demonstration of mastery learning” (Henry County Schools, 2008, para. 3).

Upon administrator request, teachers volunteered to participate in the project by submitting their online or blended courses for review. The teacher designers were 12 instructors who taught the nine courses reviewed by the graduate student reviewers (some courses were co-taught or had multiple teachers contributing to a single course).
Graduate student reviewers were 33 graduate students enrolled in an online graduate course on distance education at a regional university in Georgia in either the spring or summer semester of 2010. Graduate student reviewers were working on either the Master’s or Specialist’s degree in Media or Instructional Technology. All graduate students who enrolled in the course participated in this study.  The majority of the graduate students in the course were practicing teachers, media specialists, and instructional technology specialists. 


In conducting course reviews, graduate student reviewers used a quantitative instrument called National Standards of Quality for Online Courses (NS) (see Appendix A). The NS instrument measures five standards: Content, Instructional Design, Student Assessment, Technology, and 21st Century Skills. Each standard area has multiple criteria, which are rated on an ordinal scale from 0 to 4.  Table 1 includes the scale used on this instrument.

Table 1
Scale used on NS Instrument

0: Absent – component is missing
1: Unsatisfactory – needs significant improvement
2: Somewhat satisfactory – needs targeted improvement
3: Satisfactory – discretionary improvement needed
4: Very satisfactory – no improvement needed

The NS instrument is available from the website of the International Association of K-12 Online Learning (iNACOL).  At the time of this study, one of the researchers inquired with an iNACOL representative about the existence of validity and reliability studies related to the instrument. The researcher was told that the instrument, which is based on an instrument designed by the Southern Regional Education Board and reflects input gathered from iNACOL members, is currently being updated (personal communication, November 8, 2010).  A new instrument may be unveiled in the near future.

To increase reliability of student scores, graduate students were provided with a “National Standards Supplemental Document” which provided an explanation of each criterion being rated on the instrument. This document was created by iNACOL and the Region 4 Education Service Center in Houston, Texas, and was used with permission.    

Before completing the NS, graduate student reviewers completed a course-specific standards alignment document (AD) aligned to state curriculum standards. This document, while important, is supplementary to the NS. The AD is a table of all of the state standards and elements pertaining to a given course. For each standard, reviewers had to decide whether or not the standard was being addressed in the course, and if it was being addressed at the appropriate level of Bloom’s Revised Taxonomy. If it was not being addressed—or was being addressed, but at a lower level of Bloom’s Revised Taxonomy than desired—graduate student reviewers offered a recommendation for a way in which the standard could be addressed satisfactorily in the online or blended environment.
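The alignment-document decision flow described above can be sketched as a simple procedure. The Bloom's-level ordering, function name, and example data below are illustrative assumptions for clarity, not artifacts from the study.

```python
# Sketch of the alignment-document (AD) decision flow: for each state
# standard, reviewers recorded whether it was addressed and at which
# level of Bloom's Revised Taxonomy, recommending changes where needed.
# All names and data here are hypothetical.
BLOOM = ["remember", "understand", "apply", "analyze", "evaluate", "create"]

def review_standard(addressed, observed_level, target_level):
    """Return the AD verdict for a single state standard."""
    if not addressed:
        return "not addressed: recommend an online/blended activity"
    if BLOOM.index(observed_level) < BLOOM.index(target_level):
        return (f"addressed below target ({observed_level} < {target_level}): "
                "recommend revising the activity")
    return "satisfactorily addressed"

print(review_standard(True, "understand", "analyze"))
print(review_standard(True, "create", "apply"))
print(review_standard(False, None, "apply"))
```

The key design point is that a standard taught below its target Bloom's level triggers a recommendation just as a missing standard does, which mirrors the reviewers' two-part judgment.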


After completing the NS (with AD support), graduate student reviewers used a discussion board to address specific questions related to the NS instrument. See Appendix B for the prompts.

The final form used to collect data was the Presentation Template.  Each graduate reviewer team had a template established on a project wiki.  The template was used to synthesize data from the discussion prompts above, and it included six main areas as seen in Table 2. 

Table 2
Presentation Template


Presentation Contents

1: Introduction

  • names of reviewers
  • course reviewed (school, subject, grade, number of semesters)
  • the date (month, year)

2: Content

  • strengths and weaknesses of the course content
  • recommended changes
  • listing of all state standards that 2 or more reviewers found missing

3: Instructional Design

  • strengths and weaknesses of the instructional design
  • levels of interaction (between students and teachers, and between students)
  • recommended changes

4: Student Assessment

  • strengths and weaknesses of student assessment
  • types and frequency of student assessment in course
  • recommended changes

5: Technology

  • strengths and weaknesses of the use of technology
  • recommended changes

6: 21st Century Skills

  • strengths and weaknesses of the incorporation of 21st Century skills
  • recommended changes

Pre-project preparations.

At the beginning of each semester of the project, the LEA administrators selected online and blended courses to be submitted into the review process.  A LEA instructional technology specialist coordinated with university faculty to determine the number of courses needed. The instructional technology specialist created guest logins (with student-level rights) to the selected LEA courses.  

University faculty built a wiki to coordinate the project.  The wiki allowed LEA administrators and teacher designers to communicate with graduate student reviewers during and after the project.  It allowed project administrators to capture and archive feedback for later retrieval. 

Within the university course, the graduate student reviewers spent the first third of the semester learning about instructional design in online environments. Graduate student reviewers read reports and articles related to K-12 online and blended learning and a seminal textbook on distance education. These readings were supported by relevant professor-designed discussions. Students designed and built an online learning module in the WebCT/Blackboard learning management system and peer-evaluated each other’s learning modules using the National Standards of Quality for Online Courses instrument.

In preparation for conducting course reviews, graduate student reviewers were informed of the courses available for review and were asked to rank their preferences based on their own teaching experience and educational background. In many cases, graduate student reviewers were assigned to courses in the field they were certified to teach, or a closely related field. In some cases, such as in the instance of agricultural science, it was difficult to assemble a team of graduate student reviewers with content knowledge and teaching experience in the field.

Between three and four graduate student reviewers were assigned to each course. Each course represented approximately one semester (one-half course) of content, although there was some variability in this area when looking at the middle school courses. 

In the case of the blended middle school courses, the graduate student reviewers were examining a snapshot of these courses that represented content built between August and January or February of the school year. There was variability in the amount of material (in terms of the duration of the learning modules), although for each course the material constituted approximately one semester of content. In the case of the 6th grade Agriculture Science course, which is offered on a single-semester basis, the entire semester was present, meaning graduate student reviewers were looking at all content for the course. Table 3 illustrates which courses were reviewed across both semesters. The number of graduate student reviewers assigned to each course is indicated in parentheses.
Table 3

Courses Reviewed and Number of Graduate Student Reviewers 

Blended Middle School

Online High School

6th Science (3)
6th Math (3)
6th Agriculture Science (3)
7th Language Arts (3)
8th Social Studies (3)

English 1 (6)
English 4 (4)
US History (3)
World History (8)

A project in four phases.

The project took place over four phases.  Data from Phase 1, Phase 2, and Phase 3 are included in this study.  Phase 4 is described briefly below to provide closure and aid understanding; however, data from Phase 4 are not included in this study, as these data do not contribute to the stated research questions.

In Phase 1 of the project, graduate student reviewers individually reviewed the course they were assigned, using the National Standards of Quality for Online Courses (see Appendix A) and a course-specific standards alignment document (AD) aligned to state curriculum standards. After examining the content of the online course assigned, the graduate student reviewer posted the completed NS and AD documents to the appropriate wiki page dedicated to the course under review.

In Phase 2 of the project, graduate student reviewers read the review documents of the other reviewers in their group of three to four. They discussed their findings with one another, responding to specific prompts posted by the professor inside the project wiki. See Appendix B for the prompts.

In Phase 3 of the project, students synthesized their discussions to create a presentation for the LEA that highlighted strengths, weaknesses, and recommendations in each of the standards areas (e.g., content, instructional design, student assessment, technology, and 21st Century Skills). They used a wiki page created by the professor called the “discussion summary page” where they could draft their synthesis. Then, one member of the group pasted the information into a PowerPoint presentation and uploaded it to an online presentation tool called VoiceThread. Through VoiceThread, all members of the group were able to present multiple slides to the teacher designers and administrators at the LEA using a microphone or web camera.

In Phase 4, graduate student reviewers viewed all of the final presentations by other groups in their course (of the same semester), and they engaged in a discussion to reflect on the themes they observed across all presentations and ways to improve the project.

Data Analysis

Quantitative analysis.

Data from the NS instrument, entered by each student into an Excel spreadsheet, were merged into a single data file and imported into IBM SPSS Statistics 19 to run descriptive analyses.

In order to answer the research questions, the researchers determined the school mean, the course mean, the standard mean by school, and the standard mean by course (including standard deviation information). The school mean is the mean of responses given for all standards (all 39 variables on the NS instrument) of a single school, as rated by all graduate student reviewers.  The course mean is the mean of responses given for all standards (all 39 variables on the NS instrument) of a single course, as rated by all graduate student reviewers. The standard mean by school is the mean of responses given for a single standard (the number of variables pertaining to each standard differs) for all courses in a single school. Finally, the standard mean or median by course is the mean of responses given for a single standard for a single course. 
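As a rough illustration of the four aggregation levels just defined, the following sketch computes each mean from a flat list of reviewer ratings. The record layout and all values are hypothetical; the study's actual data were processed in Excel and SPSS.

```python
# Each record is one reviewer's rating of one NS variable:
# (school, course, standard, score). Data are illustrative only.
from statistics import mean

ratings = [
    ("middle", "6th Science", "Content", 3),
    ("middle", "6th Science", "Content", 2),
    ("middle", "6th Math", "Instructional Design", 4),
    ("middle", "6th Math", "Content", 1),
    ("high", "English 1", "Technology", 3),
    ("high", "English 1", "Technology", 4),
]

def school_mean(school):
    """Mean of all ratings, all standards, all courses in one school."""
    return mean(s for (sch, _, _, s) in ratings if sch == school)

def course_mean(course):
    """Mean of all ratings, all standards, for a single course."""
    return mean(s for (_, c, _, s) in ratings if c == course)

def standard_mean_by_school(school, standard):
    """Mean of ratings for one standard across all of a school's courses."""
    return mean(s for (sch, _, std, s) in ratings
                if sch == school and std == standard)

def standard_mean_by_course(course, standard):
    """Mean of ratings for one standard within a single course."""
    return mean(s for (_, c, std, s) in ratings
                if c == course and std == standard)

print(school_mean("middle"))
print(standard_mean_by_school("high", "Technology"))
```

Each level simply widens or narrows the filter before averaging, which is why the school mean and course mean pool all 39 variables while the standard-level means pool only the variables belonging to one standard.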

The 39 variables of the NS instrument are used to measure five standards: Content, Instructional Design, Student Assessment, Technology, and 21st Century Skills. Each of these standards has a unique number of variables: Content has nine, Instructional Design has 15, Student Assessment has seven, Technology has seven, and 21st Century Skills has one.

The researchers decided that, given the sufficient number of variables comprising each standard, it would be appropriate to calculate mean scores with accompanying standard deviation information. The exception is the 21st Century Skills standard: because it comprises a single variable and thus lacks a sufficient quantity of responses to merit the calculation of a mean, the researchers chose to calculate median scores, providing information on minimum and maximum responses. This exception applies only to the reporting of the standard mean or median by course; in calculating school mean, course mean, and standard mean by school, medians were not used.

For each of the 39 variables measured on the NS, scores may range between 0.00 and 4.00. The researchers of this study determined that a mean score of 3.00 or greater indicates quality.
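A minimal sketch of how the reporting rules above might be applied, using illustrative reviewer scores (not the study's data): multi-variable standards are summarized with a mean and standard deviation and checked against the 3.00 quality cutoff, while the single-variable 21st Century Skills standard is summarized with a median and its minimum and maximum.

```python
# Hypothetical reviewer scores for two standards of one course.
from statistics import mean, median, stdev

instructional_design = [3, 4, 2, 3, 3, 4]  # multi-variable standard
century_skills = [3, 4, 2]                 # single-variable standard

# Multi-variable standards: mean with standard deviation.
id_mean, id_sd = mean(instructional_design), stdev(instructional_design)

# Single-variable standard: median with minimum and maximum.
cs_median = median(century_skills)
cs_min, cs_max = min(century_skills), max(century_skills)

QUALITY_CUTOFF = 3.00  # a mean of 3.00 or greater is taken to indicate quality
print(f"Instructional Design: M={id_mean:.2f} (SD={id_sd:.2f}), "
      f"quality={'yes' if id_mean >= QUALITY_CUTOFF else 'no'}")
print(f"21st Century Skills: Mdn={cs_median}, range {cs_min}-{cs_max}")
```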

Qualitative analysis.

As a follow-up to quantitative data analysis, data from the audio and textual VoiceThread presentations were transcribed and coded using the constant comparative method of coding as described by Strauss and Corbin (1990).  For open coding, a priori codes of content, instructional design, student assessment, technology, and 21st Century Skills were used due to the specific nature of the template used (see Table 2 above).  Additional categories were generated based on the comments of the graduate student reviewers.  All categories for both schools were maintained on a single master list. (The process began with the idea of using two separate lists, but the categories developed for the blended middle school proved applicable for the online high school). Using both a priori and inductive codes, therefore, data were open-coded to identify categories. Axial coding was used to organize and prioritize categories. 

Because an explanatory mixed-methods design is used, the quantitative results take precedence over the qualitative: “the qualitative data and analysis are used to elaborate on, refine, or explain the quantitative findings” (Mertler & Charles, 2011, p. 320).  During the quantitative phase of data collection, the NS instrument used to collect the data of the graduate student reviewers served to define quality in online courses.  Because this instrument is used to define the way the graduate student reviewers understood and discussed quality in online and blended courses, the researchers felt that it was important that the qualitative data be analyzed in such a way that it conformed to the definitions from the NS. 

Coding was completed by one researcher and peer review was performed by a second researcher between each step of the coding process. 

Results

RQ1: Blended middle school.

To what extent are teachers in the blended middle school designing quality online learning environments for students as measured by the National Standards of Quality for Online Courses instrument?

The school mean for the blended middle school was 2.58 (SD = 1.22). Course means ranged from 2.16 to 3.10. Standard means at the school level ranged from 1.90 to 3.23. At the course level, standard means ranged from 1.46 to 3.42, and standard medians (reported for 21st Century Skills) ranged from 3.00 to 4.00, with an overall range of 2.00 to 4.00. All results for the blended middle school are reported in Table 4.
RQ2: Online high school.

To what extent are teachers in the online supplemental high school program designing quality online learning environments for students as measured by the National Standards of Quality for Online Courses instrument?

The school mean for the online high school was 2.87 (SD = 1.11). Course means ranged from 2.73 to 3.08. Standard means at the school level ranged from 2.59 to 3.07. At the course level, standard means ranged from 2.02 to 3.29, and standard medians (reported for 21st Century Skills) ranged from 2.00 to 3.00, with an overall range of 2.00 to 4.00. All results for the online high school are reported in Table 5.
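The reporting pattern above (course means, standard means by school, and standard medians for the ordinal 21st Century Skills ratings) can be reproduced with Python's statistics module. The ratings below are invented for illustration and do not reflect the study's actual data:

```python
from statistics import mean, median

# Hypothetical reviewer ratings on a 1-4 scale, keyed by (course, standard).
# These values are invented; they are not the study's data.
ratings = {
    ("English 1", "Content"): [3, 2, 3],
    ("English 1", "21st Century Skills"): [2, 2, 3],
    ("English 4", "Content"): [4, 3, 3],
    ("English 4", "21st Century Skills"): [3, 3, 4],
}

def course_mean(course):
    """Course mean: average of all ratings given within one course."""
    vals = [v for (c, _), vs in ratings.items() if c == course for v in vs]
    return round(mean(vals), 2)

def standard_mean(standard):
    """Standard mean by school: average for one standard across all courses."""
    vals = [v for (_, s), vs in ratings.items() if s == standard for v in vs]
    return round(mean(vals), 2)

def standard_median(standard):
    """Standard median, used for ordinal data such as 21st Century Skills."""
    vals = [v for (_, s), vs in ratings.items() if s == standard for v in vs]
    return median(vals)

print(course_mean("English 4"))                # → 3.33
print(standard_mean("Content"))                # → 3.0
print(standard_median("21st Century Skills"))  # → 3.0
```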


In order to ensure that the qualitative data concord with the quantitative, the researchers began the open coding process with several a priori codes: content, instructional design, student assessment, technology, and 21st Century Skills.  From there, the researcher performing the coding induced subcategories from the audio and textual data recorded in the VoiceThread presentations. All qualitative data were recorded in an Excel 2007 workbook by course and category (e.g., content, instructional design, etc.). See the Master List of Categories and Subcategories that arose from open coding in Table 6.
During axial coding, a new Excel 2007 workbook was created to organize the data by category (the a priori standards-based categories).  Data for the blended middle school were placed on five worksheets related to content, instructional design, student assessment, technology, and 21st Century Skills.  Data segments were described as “strengths” or “weaknesses.”  Then, each of the five worksheets was sorted by whether the entry was a strength or weakness, the subcategory (e.g., Alignment to State Standards), and course.  The researchers felt the division of qualitative data based on whether it reflected a noted strength or weakness was important because it fits with the evaluative nature of the NS instrument used during the quantitative phase of data collection. 
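The worksheet sort described above (group segments by a priori category, then order by strength/weakness, subcategory, and course) can be sketched in a few lines. The example segments are invented, not quotations from the study's data:

```python
from collections import defaultdict

# Hypothetical coded data segments; fields mirror the axial-coding workbook.
segments = [
    {"course": "6th Grade Math", "category": "Content",
     "subcategory": "Alignment to State Standards",
     "valence": "weakness", "note": "Some standards not posted."},
    {"course": "6th Grade Science", "category": "Content",
     "subcategory": "Alignment to State Standards",
     "valence": "strength", "note": "Units clearly tied to standards."},
]

# One "worksheet" (list) per a priori category, as in the Excel workbook.
worksheets = defaultdict(list)
for seg in segments:
    worksheets[seg["category"]].append(seg)

# Sort each worksheet by strength/weakness, then subcategory, then course.
for cat in worksheets:
    worksheets[cat].sort(
        key=lambda s: (s["valence"], s["subcategory"], s["course"]))

print([s["valence"] for s in worksheets["Content"]])  # → ['strength', 'weakness']
```

Here "strength" sorts before "weakness" alphabetically, which happens to match the reporting order used in the results sections.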

In the reporting of qualitative results, it is possible that some teams noted both strengths and weaknesses in a single area.  For example, in the area of instructional design, one middle school team liked the design of the modules in terms of the way in which content was being presented, noting this as a strength.  However, the team members wanted to see more in the way of module overviews used to introduce expectations in each module.

RQ1: Blended middle school.

To what extent are teachers in the blended middle school designing quality online learning environments for students as measured by the National Standards of Quality for Online Courses instrument?

Content strengths.

In terms of content of the blended middle school courses, four teams noted appropriate alignment to state standards. One team found the middle school course under review to be rigorous. Two teams commented on the quality of the teaching of cross-disciplinary skills such as writing.  One of these teams wrote this: 

The students write quite a bit in this science course, and that is an important skill at the middle school level—to learn how to write through the curriculum—not just in language arts and not just for pleasure.
Content weaknesses.

Three teams identified the lack of a syllabus, and recommended including one to address a number of things: online learning skills, contacting the teacher, course objectives, grading, required materials, honesty policies, and parent communication. Two teams noted some gaps in alignment to state standards.  To improve on this aspect of course building, one of these teams recommended “posting Georgia Performance Standards along with essential questions. The performance standards provide clear expectations for student learning.”

Two teams wanted changes in the way in which course content was presented: one team wanted to see less wordiness, and the other team wanted to see more “visual and auditory stimuli related to teacher instruction.”

Finally, one team noted an absence of guidance related to the teaching of netiquette for those students using discussion boards. Another team recommended more intentional teaching of source citation and rules of copyright: “We felt that a reference page could be implemented on citing sources during the lessons in which students were to prepare PowerPoint presentations.”

Table 4

Blended Middle School Course Means, Standard Means by School, and Standard Means and Medians by Course

[Columns: Content, Instructional Design, Student Assessment, Technology, 21st Century Skills, and Course Means. Rows: 6th Grade Science, 6th Grade Math, 6th Grade Ag Science, 7th Grade Lang. Arts, 8th Grade Soc. Stud., and Standard Means. Cell values not shown; one surviving entry reports a standard mean of 3.23 (SD = 0.60).]


Table 5

Online High School Course Means, Standard Means by School, and Standard Means and Medians by Course

[Columns: Content, Instructional Design, Student Assessment, Technology, 21st Century Skills, and Course Means. Rows: English 1, English 4, US History, World History, and Standard Means. Cell values not shown.]

Table 6

Master List of Categories and Subcategories

Content
    Presentation of Content
    Alignment to State Standards
    Rigor, Depth, and Breadth
    Teaching Cross-disciplinary Skills

Instructional Design
    Student Retention of Information
    Course Organization
    Student Engagement
    Instructor-student Interaction
    Student-student Interaction
    Student Choice of Work
    Higher Order and Critical Thinking
    Course Revision

Student Assessment
    Communication of Expectations: Assessment
    Appropriate Assessment
    Assessment of Special Needs
    Grading Policy

Technology
    Currency of Technology
    Appropriate Use of Technology
    Course Navigation
    Communication of Expectations: Technology

21st Century Skills
    21st Century Skills

Instructional design strengths.

In terms of course organization, four teams noted instructional design as a strength in the courses they reviewed. Graduate student reviewers across all teams appreciated consistency of design, detailed modules, and the division of these modules into appropriate units and lessons. Also, one team commented on the nature of communication in a course module:

The course requirements are clearly stated at the beginning of each unit. I would also like to talk about the opening video that the instructor did: it was awesome. We all really enjoyed it, and it stimulated my interest as I could not wait to see what was after that video. Maybe consider putting those throughout the course while introducing big units or topics, as that was a great component.
Three teams noted the presence of instructor-student interaction opportunities.  Two teams found the course to be engaging, with one team describing the course as “meaningful, interesting, challenging, and fun for students” and the other team describing a “wide variety of engaging activities that support standards.” One team appreciated the presence of student choice of work:
There was also lot of assignment choices that we thought were meaningful for this course. This is bringing in a differentiation component, and allowing students the choice on the product that they produce is always helpful. . . .the students really get more excited when they have a choice of products to produce. They feel as if they have ownership of the products and it just gets them excited to share it with the class.
One team indicated the course they reviewed was designed to support student retention of learning.  

Instructional design weaknesses.

All five teams noted that the courses they reviewed did not facilitate student-student interaction.  Across the teams, several recommendations were made in this area: clearly state the expectation for interaction and how this is graded, use the discussion boards to address open-ended questions related to course content, and have students peer-evaluate each other’s contributions.

Four teams noted weakness in course organization. These teams recommended increased communication of expectations at the beginning of each module, using such tools as checklists indicating what students would need to do to complete the module and statements of module goals and objectives. Two of these teams recommended clearer delineation of module expectations to benefit students who were absent or who later revisit the module for review.
Each unit should have an introduction and then move into the content, activities, resources and then the assessments by following a pattern. We clearly saw that the semesters and units were outlined, but only a few of them were fleshed out and the ones that were fleshed out, they didn’t look the same. We thought it would be very helpful to have an introduction for each one within a statement about the goals and objectives and then what are going to be the activities of the work and what are going to be the assessments. We recommend that the teacher try to make the unit look the same each time.  Also, because this is a blended course, we believe it would be helpful if each unit had everything online so that the students could access it.
Two teams noted the lack of opportunities for students to make choices in the way in which work was submitted. One team recommended creating “technology challenges by allowing students to use new and varied methods of proving their mastery of subjects.” There were several instances in which teams observed single areas for improvement: one team felt that course materials were too text-heavy to appeal to a wide audience of middle school students.  Another team noted the lack of higher-order thinking opportunities in the course. One team wanted to see the course under review take more advantage of inherent opportunities to address multicultural topics:
We thought this course could be a great venue to include some of the multi-cultural activities as through animals and foods you can discuss different regions of the United States.  You could also discuss different regions of the world.
Student assessment strengths.

All five teams noted areas in which assessment was appropriate, noting the presence of performance-based, ongoing and frequent, varied, consistent, and flexible assessments. One team remarked, “Our team thought that the strength of the course was project-based assessments. We thought these were fantastic, the students are really engaged with the content, they are working with you, and you know their understanding of it.” One team indicated the assessment was tailored to a variety of learners, and another team noted the course under review was effective in its use of rubrics for communication of assessment expectations.

Student assessment weaknesses.

All five teams noted areas in which assessment could be improved.  One team recommended the use of more formative assessments, including summarizing activities.  Three teams noted the opportunity to use a wider variety of assessment formats, such as discussions and short quizzes. One team observed an “excessive use of worksheets, some of which had scribbled notes” and recommended that the teacher consider ways to have students collaborate on projects: “Allow students to collaborate and create an interactive project which displays the concepts learned. However, it is acknowledged that this is a math class and that project assessments will not be applicable to all units.”

One team recommended the teacher hold the students accountable for the web-based grammar activities in the course, which were hard for the teacher to track:
The grammar games that are listed are very useful but how does the teacher know that the students have actually completed it?  This offers little opportunity to reflect upon learning.  Have students write a reflection of their learning and submit to a drop box.
One team wanted to see students gain more experience in submitting work in the online format.

In terms of communicating assessment expectations, four teams saw this as an area in need of improvement. Three of the four teams recommended the teacher provide examples of expected products.  Two of the four teams wanted the use of rubrics to communicate expectations to students. 

Finally, two teams saw an opportunity to improve the grading policy by clarifying how grades are determined. One team was unable to see a policy and stated this:
Include a clear grading policy outlined for the entire course. This could be done through a syllabus, and as you develop each unit, also include a grading policy within each unit or during the introduction. If the instructional design is consistent, then the grading policy would be easier to formulate and embed in every unit.
Technology strengths.

Four teams noted appropriate uses of technology in the courses.  One team observed the use of the Angel learning management system (LMS) facilitated student acquisition of ICT skills.  Another team remarked on the effective use of web resources such as virtual labs for what they termed “hands-on” experiences and Brainpop for hooking the learner at the beginning of a lesson.  The team reviewing the language arts course saw effective use of online grammar games. In the social studies course, the review team noted the successful use of podcasting and moviemaking to report on events of the Civil War and Reconstruction. One group noted the course they reviewed was easy to navigate.

Technology weaknesses.

All five of the blended middle school review teams noted opportunities for improvement in the area of appropriate use of technology. One group recommended that the teacher designer of the course bring real farmers into the classroom through online videos. Another group noted an overreliance on Brainpop and recommended the teacher locate more websites for use in the course. The team reviewing the math course wanted to see more interactive games and instructional videos.

One team suggested the teacher designer direct the creation of a collection of student-produced teaching artifacts:
Also, the students could create grammar tutorials for this class, and the teacher could save those and use them for future classes. They could create a tutorial using a tool like VoiceThread or PowerPoint or podcasts etc. The teacher could save those and keep those to provide future students an example of how to do it and actually use it to learn grammar.
Another team observed the course might benefit from increased use of interactive technologies such as discussion boards, wikis, and blogs.

In terms of course navigation, three groups identified opportunities for improvement. One group recommended the course instructor check that all links, both internal and external, function properly.  Another group recommended that when using external links (links to outside websites), it may be preferable to have URLs open in new windows rather than inside the LMS. The third group observed duplication of links in the course, which they found to be confusing and contributing to a sense of “clutter.”

One team recommended the teacher designer clarify technology prerequisites in the course.  Another team recommended the teacher update some PowerPoints used in the course.  Finally, one team observed the need for the teacher designer to be more explicit in the delineation of modifications for special needs students. 

21st Century Skills strengths.

Two teams noted strengths in the area of 21st Century Skills. One team remarked:
We saw a definite connection of the assignment content and global awareness especially in the units of cultural science and also the lessons that touch on the Future Farmers of America.  In our course review, it was evident that many of the assignments exhibited self-directed learning skills and used a variety of skills to assess student learning.
Another team indicated the students were exposed to real-world math situations and self-directed learning experiences.    

21st Century Skills weaknesses.

Two teams noted areas in which the incorporation of 21st Century Skills could be improved: one team recommended the use of student-to-student cooperation, and the other team wanted to see an increase in the teaching of global awareness. 

RQ2: Online high school.

To what extent are teachers in the online supplemental high school program designing quality online learning environments for students as measured by the National Standards of Quality for Online Courses instrument?

Content strengths.

Two teams noted appropriate alignment to state standards, and one of these teams agreed with the teacher designer’s use of chronological ordering of history standards. Two teams observed that content was presented appropriately. In one instance, the content was deemed developmentally appropriate to its 9th grade audience.

One team found the syllabus to be sufficient in communicating course expectations. Another team noted the appropriate inclusion of teaching such cross-disciplinary skills as how to follow netiquette and how to function ethically in a digital environment.

Content weaknesses.

Two teams noted presentation of content as an area that could be improved. One team recommended breaking materials into smaller chunks while the other team urged the teacher designer to evaluate websites for age appropriateness, saying that one website was more appropriate to college-aged students than to this secondary audience.

Two teams observed an opportunity for improvement of course syllabi. One of these teams could find no syllabus in the course and recommended that a syllabus be included that “addresses all the objectives at the beginning of the course.”  The other team indicated the existing syllabus should also address course objectives and should include information about technology requirements.

One team was uncertain about alignment to state standards, and urged the teacher designer to more intentionally declare what standards are addressed in each unit. Another team questioned the rigor of the course, saying that “most lessons only have one activity.  We would like to also see more rigor in the class.”

Instructional design strengths.

In terms of course organization, three teams noted strengths. One team said, “The pacing guide provided a great overview of the course. We could see how the course was and would be organized into units and lessons.” Another team was unequivocal in its positive view of the course layout, while still another team appreciated the layout with some reservations:
The goals and the objectives of the class as a whole were very clearly stated in the beginning. Our group felt that the course was well organized with units and lessons. And of course this makes the site more navigationally friendly.  Yes, it was organized but sometimes takes too many of clicks to navigate around.
Two groups observed positive instructor-student interaction, and one of these groups noted that it took place “through the discussion forums, emails, and course announcements feature.” Two teams remarked, respectively, that the use of “excellent visuals” and “several different delivery methods” was likely to make the course more engaging. One group felt the course afforded students opportunities to use higher-order or critical thinking skills. Finally, one group observed the presence of student-student interaction in the discussion board and recommended that the teacher designer incorporate more such interaction through “Web 2.0 tools and other formats.”

Instructional design weaknesses.

Three teams saw opportunities for improvement in terms of incorporating student-student interaction. One team noticed the course under review would benefit from greater use of the Angel discussion board and wanted to see more group work. Another team recommended that the teacher build in:
. . . a learning community among the students. We’d like to see more discussion, how world history has a wealth of topics that are controversial, that would encourage and engage students if they have an opportunity to interact with each other and take more ownership of the course.
Three teams saw ways to improve higher-order or critical thinking skills. One team observed the course used too many quizzes and traditional assessments to be able to promote higher-order thinking:
The materials covered in these tests contained very low-level questioning and therefore restricted the opportunities for students to demonstrate mastery of the material. To encourage more mastery learning, we recommend an increase in thought-provoking discussions and an increase in open-ended questioning. Each of these will encourage students to analyze conditions and synthesize their own explanations to more elaborate questioning.
Three teams wanted to see improvement in the area of course organization. All three teams wanted to see goals and standards included in lessons and modules. One team recommended the use of module checklists: “There were times when I was not sure what else needed to be covered in a lesson. So, a checklist definitely would be helpful.”

Two teams recommended the design of more instructor-student interactions. One team noted the absence of multiculturalism in the course, and another urged the teacher designer to allow students more choice in assignment formats and submissions. 

Student assessment strengths.

All four teams observed strengths in the way in which assessments were used in their courses. Two teams noted the presence of ongoing assessment in the course. Two teams observed the effective incorporation of formative and summative assessments in the courses under study.  Two teams cited the immediate feedback given to students through short quizzes as being a beneficial formative assessment tool.  Two teams found the use of varied assessment in the course to be a strength. 

Two groups noted the presence of self-assessment in the course. For both teams, the Angel LMS itself provided helpful metrics on student performance, such as the number of discussion postings, student progress, grades, and date of last login.

One team observed that the course allowed for “a variety of choices for assignment submissions including multimedia projects such as PowerPoint, Audacity, VoiceThread essays, videos, MovieMaker and Flash projects.” 

Student assessment weaknesses.

Just as all four teams reviewing online high school courses noted strengths in terms of appropriate assessment, they also noted some opportunities for improvement. In fact, most of the components noted as present in some courses in the strengths section above were noted as lacking in the remaining courses. For example, one team noted the lack of variety in assessments and wanted to see more assessment formats, such as discussions, and different interaction patterns, such as those afforded by peer review designs.

Expanding the variety of assessments, one team noted, would also be beneficial for differentiating based on student needs. Another team observed that more formative assessment needed to take place before moving directly to summative assessment.

One team observed the course tests and quizzes too heavily emphasized recall and wanted to see an increased use of assessments that stimulated deeper thinking by students.

One team reviewing an English course said this:
All assessments are objective either multiple choice or matching.  There were not any projects, whether group or individual. We would like to see assessments that require critical thinking skills and also some assessments that address the writing GPS [state standard] by incorporating writing components into the assessments.
All four teams observed the need to clarify assessment expectations through use of rubrics. One group said, “We would like to see a rubric added for each assignment that shows what is expected in the assignment and helps the students while completing the assignments.” One team reviewing an English course noted that rubrics were a crucial assessment tool for students engaging in writing tasks.

Finally, one team wanted more clarification on the grading policy in terms of how late assignments would be addressed.

 Technology strengths.

Three teams identified strengths in technology use in the courses. One team noted the use of alternative instructional modes (alternatives to strictly text-based presentation), such as PowerPoint presentations, as a strength. Another team found the online text used in the course to be beneficial. The third team mentioned that technology was being used well “by virtue of the class being online.”

In terms of communicating expectations for technology use in the course, two teams found that hardware and software expectations were appropriately specified early in the course. Two teams found the course easy to navigate. 

Technology weaknesses.

All four teams noted some areas in which technology could be used better in the courses under review. Graduate student reviewers on all teams wanted to see increased use of interactive, web-based tools such as VoiceThread, wikis, and blogs. The teams also urged increased use of computer-based multimedia tools, such as podcasting technologies like Audacity and video production tools such as iMovie.

One team remarked, “The course did not make use of the maximum capabilities of the online medium and all those resources that were available were not utilized in this course.”  Another team said, “The course is somewhat bland, so we recommend that you just add more to keep the students’ interest.” A third team suggested increased technology use had the potential to increase higher-order thinking skills: “The students did not appear to create any projects using technology that would promote higher-order thinking skills.”

One team noted more clarification of required hardware and software and related skills would be necessary, and another team found course navigation needed simplification: “You had to click through about 3 or 4 links to finally get the assignment to open up.”
21st Century Skills strengths.

No strengths were noted by the teams.

21st Century Skills weaknesses.

One team noted a lack of 21st Century Skills in the course.


General Discussion

A review of the quantitative data shows that neither campus met the minimum benchmark of quality, a school mean score of 3.00 (satisfactory). However, both rated between somewhat satisfactory and satisfactory, with the blended middle school (overall mean 2.58) and the online high school (overall mean 2.87) trending toward satisfactory.

Individually, the only courses to meet the minimum benchmark of quality were 7th Grade Language Arts at the blended middle school (course mean 3.10), which earned satisfactory scores on all standards except Content, and English 4 at the online high school (course mean 3.08), which scored at or very near a mean of 3.00 on all indicators except 21st Century Skills.

Upon closer examination, the 7th Grade Language Arts course fared less well in the Content standard than it did in the other areas. Looking at the qualitative data shared about this standard, the course was observed to be missing some of the content in terms of required state standards. According to the reviewers, the course lacked vocabulary development and acquisition, technical writing, and the examination of graphs, charts and photographs, among other required content standards. These results suggest that the course is not covering all required state standards. 

However, it is possible, and we think likely, that, when seen as a whole, the course was addressing these standards. There were two issues at play: first, because only one semester of content was observed in a course that has two semesters, one could argue that it is likely the missing standards would be addressed in the other semester of the course.  Second, the nature of the blended learning environment makes it difficult to capture all evidence of teaching inside of the learning management system.  In the qualitative responses to this course, the graduate student reviewers stated, “Online student to student interaction is limited. There is also little student-student editing. We recommend that you use the discussion board for student-student interaction.”  In preparing for the project (before Phase 1 began), one of the research investigators visited the class for approximately three hours, and she saw frequent face-to-face interaction and feedback taking place among students at their tables, including collaborative group work. While interaction may have appeared to be lacking in the online piece of the course, we have to be open to the idea that it is difficult to fairly review the blended courses using the current criteria. 

In general, the blended middle school scores were lower than those of the fully online high school. Based on the qualitative analysis, these distinctions may be attributable to the inherent differences between the blended environment and the fully online environment; in the blended environment, for example, the reviewers could not see all of the course content and may therefore have rated the content lower. Moreover, some of the instructional design weaknesses noted for the blended middle school may be attributed to the fact that course content, organization, and interaction in the blended courses could not be fully observed.

For both the middle school and high school courses, the standard mean scores for technology and student assessment trended lower. Based on analysis of the qualitative data from the student reviewers, as well as follow-up interviews with LEA administration officials, the lower numbers on these particular standards may be ascribed to a few potential factors. First, reviewers tended to recommend more student choice in assessments, and their suggestions often centered on greater use of technology in assessments. Second, reviewers were consistent in calling for greater use of emerging technologies and available web-based resources in the classes. In follow-up interviews, the LEA administrators acknowledged the need to increase use of these types of emerging technologies and other resources and expounded on several reasons why the courses may have been viewed as less than satisfactory on the technology standard. The administrators explained that the LEA's acceptable use policies often prevented teachers from accessing emerging technologies and web resources. Additionally, they explained that there were limited professional development opportunities to train teachers on the emerging technologies needed to engage modern students, foster community, expand interactivity, and support other appropriate pedagogical goals in blended and online classes.

The project is currently in its second round of implementation, and we feel it has had a positive impact on both the graduate students as well as teacher designers and the students they serve. The LEA administrators are enthusiastic about continuing the project, saying the external feedback from these reviewers has motivated their teachers to continue to improve their courses in both the blended and fully online environments. The student reviewers have offered positive feedback as well, describing this project as one of the most valuable in their program of graduate study.

Overall, the results are encouraging and indicate both the blended middle school and the fully online high school are doing an effective job of creating somewhat satisfactory to satisfactory courses. Based on the data presented here, targeted adjustments in course design along with increases in professional development and emerging technology access should result in future courses in the satisfactory to very satisfactory range.

Lessons Learned from the Process So Far

Administrators concerned with the explosive growth of K-12 online learning, increased levels of scrutiny, and the strong push for accountability must clearly demonstrate efforts to develop and implement a quality control process for online and blended learning. Regardless of the specific quality control process an organization selects, this study sheds light on considerations that could benefit administrators looking to develop such a system. Creating a platform for improvement and quality control legitimizes the transition to new delivery media. In that vein, what follows is a breakdown of what worked well in this course review project, along with some suggested changes.

What worked well. 

The Wiki.

The course reviewers in this study were graduate students taking a class within a closed learning management system. In planning the first iteration of the process, the project manager developed a wiki outside of the university system that allowed the graduate student reviewers, the administrators, and the teachers to log in to a private space on the web without the barriers created by the password-protected, university-only learning management system. This worked very well. The homepage of the wiki communicated the phases and dates of the project implementation. The details of those phases were linked on pages in the wiki navigation, complete with training videos for the graduate student reviewers and an explanation of the instruments being used. The wiki also served as a social networking space for all stakeholders. The wiki supported the project in more ways than mentioned here, but the recommendation is that administrators select an accessible, protected, collaborative workspace, such as a wiki, that gives users the power to easily cooperate, communicate, participate, build knowledge, and access needed resources.

Increased cooperation and transparency.
Through participation in this process, two educational institutions cooperated to mutually beneficial ends. This required that the LEA administrators and teachers open up their online and blended courses for analysis and critique. On the university side, the administrators were allowed to enter the distance education course in which the graduate student reviewers were enrolled; they had access to everything the graduate students were doing in the course prior to the first phase of the project. All parties worked together on the wiki. Schools, whether K-12 or higher education, are often unaccustomed to such a level of cooperation and transparency. However, this openness to constructive critique and this rather intimate level of sharing and collaboration among different groups were vital to the process and were, in the authors' opinions, among the necessary components of this endeavor. In fact, increased cooperation and transparency may be cornerstones of educational reform. Anyone looking to administer a similar quality control process would be wise to explore reciprocal partnerships with local institutions of higher learning or other potential partners.

Reviewing in phases.
We feel that the design of the project phases is strong. In Phase 1, the graduate student reviewers examined the course independently and submitted their instruments to the course review page on the wiki. In Phase 2, the team of reviewers discussed their findings and shaped them into a collective, uniform review. Phase 3 involved the presentation of their findings. From an administrative perspective, we found this arrangement productive, and we suspect this is because it builds in time for the reviewers to think and work independently before coming together to discuss and debate their reviews to consensus. The experience also provided an appropriate challenge for graduate students studying online learning and resulted in constructive, immediately applicable feedback to the teachers, presented in a thoughtful yet easily digestible manner, for targeted course improvements.

Suggested Changes.

Develop a separate review process for blended courses.
During this study, we discovered that the fully online courses lend themselves better to this asynchronous review process. The blended courses were difficult to review accurately since the reviewers, working at a distance, were unable to participate in the synchronous classroom activities. It is difficult to accurately gauge what one cannot fully observe. While we have no doubt that the feedback provided to the teachers of the blended courses was valuable and has resulted in positive improvements, a separate review process that incorporates face-to-face classroom time for the reviewer is warranted for blended courses.

Stronger ties between reviewers and teachers.
In a meeting with the LEA administrators, it was agreed that we should incorporate more opportunities for the reviewers and teachers to interact directly. The reviewers had limited direct contact with the teachers, and it is presumed that strengthening this relationship could lead to a deeper understanding of the course content, design, objectives, and structure, which should result in a more accurate measure of course quality.

Expanding types of feedback.
As a result of the reviewers' work, the teachers received the completed review instruments and a 5-10 minute VoiceThread overview of course strengths and weaknesses. In the interest of diversifying the feedback, we believe the addition of video walk-throughs of the courses under review is appropriate. In a video walk-through, each reviewer could pinpoint suggested areas for improvement and communicate those suggestions more effectively, and the teacher could see, firsthand, the types of changes being suggested inside the course. This suggestion is currently being implemented in a new round of reviews, with positive reactions.

Design targeted professional development.
In future iterations of the project, it may be appropriate to include the creation of targeted professional development activities. It has become apparent that simply reviewing the courses may not be enough to effect the necessary improvements without accompanying professional development that teachers could use to improve their technological and instructional design skills in online and blended environments. For instance, if a reviewer were to recommend that a teacher incorporate podcasting into a particular lesson, the recommendation itself, while sound, is not likely to be acted upon unless it is accompanied by training on how to podcast.

References

Barbour, M., & Reeves, T. (2009). The reality of virtual schools: A review of the literature. Computers & Education, 52(2), 402-416.

Barbour, M. K., & Stewart, R. (2008). A snapshot state of the nation study: K–12 online learning in Canada. Vienna, VA: North American Council for Online Learning. Retrieved from

Bebell, D., & O'Dwyer, L.M. (2010). Educational outcomes and research from 1:1 computing settings. The Journal of Technology, Learning, and Assessment, 9(1), 5-15.

Blomeyer, R. (2002). Online learning for K–12 students: What do we know now? Naperville, IL: North Central Regional Educational Laboratory. Retrieved from

Cavanaugh, C., Barbour, M., & Clark, T. (2009). Research and practice in K-12 online learning: A review of open access literature. International Review of Research in Open and Distance Learning, 10(1).

Creswell, J. W., Clark, V. L. P., Gutmann, M. L., & Hanson, W. E. (2003). Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 209–240). Thousand Oaks, CA: Sage.

Davis, N., Roblyer, M., Charania, A., Ferdig, R., Harms, C., Compton, L., & Cho, M. (2007). Illustrating the "virtual" in virtual schooling: Challenges and strategies for creating real tools to prepare virtual teachers. Internet and Higher Education, 10(1), 27-39.

Georgia Department of Education. (2007). Report card. Retrieved from

Georgia Professional Standards Commission. (2010). Teacher certification section. Retrieved from

Henry County Schools. (2008). Luella Middle School belief and mission statement. Retrieved from

Henry County Schools. (2009). Henry County Online Academy. Retrieved from

Kranch, D. (2008). Getting it right gradually: An iterative method for online instruction development. Quarterly Review of Distance Education, 9(1), 29-34.

Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, D.C.: U.S. Department of Education. Retrieved from

Mertler, C., & Charles, C. (2011). Introduction to educational research. (7th ed.). Boston: Allyn and Bacon.

National Education Association (NEA). (2011). NEA Handbook. Retrieved from 

Palak, D., & Walls, R. (2009). Teachers' beliefs and technology practices: A mixed-methods approach. Journal of Research on Technology in Education, 41(4), 417-441.

Picciano, A., & Seaman, J. (2007). K-12 online learning: A survey of U.S. school district administrators. Retrieved from Sloan Consortium website:

Rice, K. (2006). A comprehensive look at distance education in the K-12 context. Journal of Research on Technology in Education, 38(4), 425-448.

Annetta, L., & Shymansky, J. A. (2008). A comparison of rural elementary school teacher attitudes toward three modes of distance education for science professional development. Journal of Science Teacher Education, 19(3), 255-267. doi:10.1007/s10972-008-9089-4

Stone, A. (2008). The holistic model for blended learning: A new model for K-12 district-level cyber schools. International Journal of Information and Communication Technology Education, 4(1), 56-57,60-71.

Strauss, A., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park, CA: Sage.

Virtual High School (VHS). (2011). Participation requirements and graduate credit.  Retrieved from

Watson, J., & Gemin, B. (2009). Keeping pace with K-12 online learning: An annual review of policy and practice. Retrieved from

Watson, J., Murin, A., Vashaw, L., Gemin, B., & Rapp, C. (2010). Keeping pace with K-12 online learning: An annual review of policy and practice. Retrieved from

Appendix A

NS Instrument

Online Course Review Instrument
Used for a course-reviewing partnership between Henry County Schools and the University of West Georgia

To learn more about this instrument, visit the National Standards of Quality for Online Courses online.



Course:

Grade Level:

Teacher(s) of Course:

School (HCOA or Luella):

Date of Review:

Rating Scale

0 Absent - component is missing
1 Unsatisfactory - needs significant improvement
2 Somewhat satisfactory - needs targeted improvements
3 Satisfactory - discretionary improvement needed
4 Very Satisfactory - no improvement needed

Instructions: As you move through the online course materials, score the course based on the criteria below. If you do not understand a particular criterion, hover over its cell to see a note offering additional information. These explanations come from a supplemental document (SD) created by iNACOL and Region 4 and are used with permission. When giving a score of 2 or below, please provide a specific comment to explain the score, and offer recommendations for improving the course in this area.


Standard A: Content

The course goals and objectives are measurable and clearly state what the participants will know or be able to do at the end of the course.

The course content and assignments are of sufficient rigor, depth, and breadth to teach the standards being addressed.

Information literacy and communication skills are incorporated and taught as an integral part of the curriculum.

Sufficient learning resources and materials to increase student success are available to students before the course begins.

A clear, complete course overview and syllabus are included in the course.

Course requirements are consistent with course goals, representative of the scope of the course, and clearly stated.

Information is provided to students, parents and mentors on how to communicate with the online teacher and course provider, including information on the process for these communications.

Issues associated with the use of copyrighted materials are addressed.

Academic integrity and netiquette (Internet etiquette) expectations regarding lesson activities, discussions, e-mail communications and plagiarism are clearly stated.

Assessment and assignment answers and explanations are included.

Standard B: Instructional Design

Course design reflects a clear understanding of student needs, and incorporates varied ways to learn and multiple levels of mastery of the curriculum.

The course is organized into units and lessons.

The course unit overview describes the objectives, activities and resources that frame the unit. It includes a description of the activities and assignments that are central to the unit.

Each lesson includes a lesson overview, content and activities, assignments, and assessments to provide multiple learning opportunities for students to master the content.

The course is designed to teach concepts and skills that students will retain over time.

The course instruction includes activities that engage students in active learning.

Instruction provides students with multiple learning paths to master the content, based on student needs.

The teacher engages students in learning activities that address a variety of learning styles and preferences.

The course provides opportunities for students to engage in higher-order thinking, critical-reasoning activities and thinking in increasingly complex ways.

The course reflects multicultural education and is accurate, current and free of bias.

Readability levels, written language assignments and mathematical requirements are appropriate for the course content and the students.

The course design provides opportunities for appropriate instructor-student interaction, including timely and frequent feedback about student progress.

The course provides opportunities for appropriate instructor-student and student-student interaction to foster mastery and application of the material and a plan for monitoring that interaction.

The course provides opportunities for appropriate student interaction with the content to foster mastery and application of the material.

Students have access to resources that enrich the course content.

Standard C: Student Assessment

Student evaluation strategies are consistent with course goals and objectives, representative of the scope of the course and clearly stated.

The course structure includes adequate and appropriate methods and procedures to assess students’ mastery of content.

Ongoing and frequent assessments are conducted to verify each student’s readiness for the next lesson.

Assessment strategies and tools make the student continuously aware of his/her progress in class and mastery of the content beyond letter grades.

Assessment materials provide the teacher with the flexibility to assess students in a variety of ways.

Grading rubrics and models of partially to fully completed assignments are provided to the teacher.

Grading policy and practices are easy to understand.

Standard D: Technology

The course architecture permits the online teacher to add content, activities and assessments to extend learning opportunities.

The course is easy to navigate.

The course makes maximum use of the capabilities of the online medium and makes resources available by alternative means; e.g., video, CDs and podcasts.

Hardware, Web browser and software requirements are specified.

Prerequisite skills in the use of technology are identified.

The course utilizes appropriate content-specific tools and software.

The course meets universal design principles, Section 508 standards and W3C guidelines to ensure access for all students.

Standard F: 21st Century Skills

The course intentionally emphasizes 21st century skills in the course, including using 21st century skills in the core subjects, 21st century content, learning and thinking skills, ICT literacy, self-directed learning, global awareness, and includes 21st century assessments, as identified by the Partnership for 21st Century Skills.



Course Strengths

Instructor-student interaction (Describe the type of interaction and its frequency and quality.)

Student-student interaction (Describe the type of interaction and its frequency and quality.)

Recommended Changes

Appendix B

Phase 2 Discussion Prompts

Content
Identify strengths and weaknesses of the course content. Was coverage adequate? What improvements do you recommend? List all GPSs not observed or only partially observed.

Instructional Design
Identify strengths and weaknesses of the course design, paying particular attention to levels of interaction and whether the course engages online learners through its design. Is the design sound for online learning/learners? What improvements do you recommend?

Student Assessment
Identify strengths and weaknesses of student assessment, paying particular attention to types of assessment and frequency of assessment. What improvements do you recommend?

Technology and 21st Century Skills
Identify strengths and weaknesses of the course in relation to these areas. What improvements do you recommend?

Online Journal of Distance Learning Administration, Volume XIV, Number V, Winter 2011
University of West Georgia, Distance Education Center