Evaluating Technology to Prevent Academic Integrity Violations in Online Environments

Victoria Brown
Florida Atlantic University


Protection of academic integrity in online environments can be challenging. Understanding how the technology works and the concerns about each method for monitoring online interactions can assist in the selection of the best proctoring tools. Depending on the content, the type of assessment, and the comfort level with the technology, a combination of academic integrity solutions may be necessary.


Maintaining academic integrity is challenging. Almost one-third of higher-education students have considered cheating, regardless of the teaching environment (Watson & Sottile, 2010). About 43% of community college students reported that they observed academic misconduct, and nearly 46% admitted engaging in cheating at least once (Smyth & Davis, 2003). Additionally, students reported that they were four times as likely to engage in academic misconduct in distance learning courses (Watson & Sottile, 2010). Many students believed that it was easier to cheat in online courses (Harmon, Lambrinos, & Kennedy, 2008; King, Guyette, & Piotrowski, 2009). Moreover, students thought that cheating was more likely to occur in online courses because it was easy to do; they felt that resources for violating the integrity of tests were readily available (Burnett, Smith, & Wessel, 2016). Regardless of students’ abilities, class rank, or grade point average, academic dishonesty was more likely to occur in online courses that lacked test-monitoring technology (Beck, 2014).

Verifying whether the perception about online cheating is accurate is difficult. Most reports on the prevalence of cheating in both traditional and online classrooms rely on students’ self-reporting. Comparing exams in traditional and online classes is also problematic. For example, students taking proctored online exams were more likely to score lower and spend less time than those taking nonproctored online exams (Alessio, Malay, Maurer, Bailer, & Rubin, 2017). Improved test results due to cheating are possibly offset by the increased distractions experienced by noncheating students in a proctored situation. Noncheating students appeared to experience increased stress in using the technology and missed the ability to solicit clarification on test questions, leading to differences in test results (Fask, Englander, & Wang, 2014).

The Need to Verify Students’ Identities in Online Classes

Protecting academic integrity is important in both traditional and online classes. The methods of cheating on tests in the classroom have become more sophisticated with technology. Students use printers to photocopy water bottle labels, write or type cheat notes inside the photocopied labels, and then paste the new labels onto the water bottles to be used during testing (Jenkins, 2012). Another approach is to use smartphones during exams to query the Internet for answers drawn from curated publisher test banks. Surprisingly, professor-created tests are also readily available from study-aid or cheating websites; students prepare cheat sheets that are posted online for access during the tests (Jenkins, 2012). Finally, smartphones can be used to snap pictures of test questions to share with classmates or friends taking the exams later (Jenkins, 2012).

In the United States, the Higher Education Opportunity Act requires institutions to have processes in place that ensure that a student who registers in a distance education course is the same individual participating in and completing the course. The act identified three methods to accomplish this goal: (a) secure logins and passwords, (b) proctored exams, and (c) new technologies or practices that verify the students’ identities (Bart, 2009).

Monitoring student identification within courses is important to the viability of distance learning, particularly considering the increase in financial aid fraud in distance classes (Mills, 2015). For example, large, loosely affiliated groups targeted institutions with low tuition in distance learning programs. Using identifiable information, these groups created straw students who completed multiple financial aid applications. These straw students participated in academic programs to secure the disbursements from the institutions (Baime & Mullin, 2012; Office of Inspector General, 2011). In 2011, the Office of Inspector General for the U.S. Department of Education required the Department of Education to implement a corrective action plan to address the financial fraud (Baime & Mullin, 2012). Institutions in the United States are now required to demonstrate participation in courses before distributing funds to students.

Protecting Integrity Online

The quality of online courses is often challenged due to issues of academic integrity. To ensure the quality of their distance learning courses, educational institutions are addressing how to verify the identity of students enrolled in online courses by proctoring online exams. To maintain the integrity of online assessments, faculty want to (a) ensure identification of the students turning in assignments, (b) prevent communication between students during testing, (c) restrict access to the Internet, and (d) impede the use of secondary devices while taking tests.

The Internet is very useful in enhancing students’ learning; it can also facilitate academic misconduct by students who have a predisposition to cheat. For example, groups of students can collaborate through Google Docs by creating test-question matrixes, which can be used to group-think possible answers for short-answer or essay questions. The students can then engage in a peer review of possible answers, deciding on the best responses (Krueger, 2015). Online testing compounds the challenges for instructors in securing tests. Students can engage in tactics similar to those in face-to-face classes by accessing test bank questions, preparing cheat sheets, and texting friends. Because of the autonomy in online classes, students occasionally hire someone to take their online exams (Krueger, 2015; Newton, 2015).

Lockdown Browsers

To prevent students from engaging in academic violations during testing, online proctoring services were developed to identify incidents of cheating. Lockdown browsers were the first attempt to address the challenge of ensuring the integrity of online assessments. These systems prevent students from using alternative programs on their computers during an exam and prohibit other tasks on the computer, such as accessing Internet sites and other applications, printing, and copying (Respondus, 2017).

Lockdown browsers are very effective in preventing students from using the Internet or other computer-based programs to access information. However, lockdown browsers cannot preclude students from using their books and notes or calling a friend to collaborate on tests. In addition, lockdown browsers cannot detect whether students are using a secondary device to access the Internet for potential test answers.

Video Monitoring

Faculty may add a layer of security to the lockdown browser by requiring online students to take tests on campus computers in a testing center or to use a video camera on the students’ computers. Going to a testing center creates additional barriers for nontraditional students, who may be unable to travel to the school’s campus to complete course assessments during business hours. Therefore, combining lockdown browsers with webcams to monitor student behavior grew in popularity. Using video monitoring in online test environments addresses some of the concerns about cheating. Before starting the exam, students are required to verify their identities and to scan their environment. Students can verify their identity with a government-issued photo identification card or a student card. The environmental scan usually consists of a 360-degree view of the room and of the work surface.

Proctored online tests typically use one of two methods to monitor testing behavior. One option is to use live proctors who remotely watch the students. The second option involves recording the students’ behavior during the test session; those recordings are reviewed later for possible violations. In some instances, the professors personally review the recordings or delegate this task to their teaching assistants. In other instances, the software company completes the review for an additional fee. When possible violations surface, the vendor sends the information to the professors, who then determine whether disciplinary actions are warranted. Flagged behaviors include eye movements away from the computer toward potential cheat sheets and leaving the room.

Biometric Identification

Recently, biometric software has emerged in the market as a potential solution for verifying an individual’s identity. Biometric software uses the learner’s fingerprints, face, iris, voice, signature, typing patterns, or a combination of one or more of these to confirm that the person on the other side of the computer screen is the intended recipient of the assessment. These systems use either physical or behavioral characteristics that are unique to an individual and often require the creation of a database with a sample of the bio signature that can be accessed and compared at the time of the exam (Berkey & Halfond, 2015).
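Typing patterns illustrate the behavioral side of this comparison. The sketch below is a deliberately simplified, hypothetical illustration rather than any vendor’s algorithm: it reduces a typing sample to a single mean inter-keystroke interval and compares it to an enrolled value, whereas real keystroke-dynamics systems model dwell times, digraph latencies, and many more features.

```python
from statistics import mean

def mean_interval(timestamps):
    """Mean inter-keystroke interval from a list of key-down times (ms)."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return mean(gaps)

def matches_profile(enrolled_mean, session_timestamps, tolerance=0.3):
    """True if the session's mean typing interval is within `tolerance`
    (as a fraction of the enrolled mean). A toy stand-in for real
    keystroke-dynamics matching, which uses far richer feature sets."""
    session_mean = mean_interval(session_timestamps)
    return abs(session_mean - enrolled_mean) / enrolled_mean <= tolerance
```

A session typed at roughly the enrolled cadence passes, while one typed three times slower (perhaps by a different person) is rejected; the tolerance would need tuning against real enrollment data.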

Biometric solutions alone have limitations in that (a) they do not guarantee the users’ continuous presence, and (b) they introduce concerns about safeguarding students’ personal and sensitive information. Although biometrics constitutes a robust solution to the problem of user authentication at login, these programs do not address the myriad of ways students can search for information or collaborate with one another. In addition, educational institutions need to consider the legal and moral ramifications of collecting more information that uniquely identifies students and can potentially put them at risk if that information is lost, stolen, or misused (Educause Learning Initiative, 2016).

Search Algorithm Analysis

To address these concerns about the robustness of academic integrity solutions for preventing academic misconduct in online testing, another approach for protecting these vulnerable tests was developed by a team of innovative alumni and students from the Business College at Florida Atlantic University. Built upon a patent-pending anti-cheating algorithm, the team developed software to detect cheating in real time by securing and monitoring online exam content. Operating seamlessly in the background during exams, the software does not intimidate students or interfere with test sessions, addressing concerns about the increased test anxiety created by the use of a webcam. Rather, the proctoring tool gathers behavioral data from the students’ actions on the Internet. Adding the video recording option provides additional evidence about the students’ actions at the time an academic integrity violation occurred. Because violations are drawn from an explicitly defined list, the results are resistant to false positives. The instructors have a dashboard that allows them to observe testing in real time. Each flagged violation is documented with two time-stamped videos: (a) the students’ actions on the Internet and (b) video of the students themselves.

This technology has yielded promising results. For this process to work, however, the tests need to be available to the company, which can take time when working with third-party vendors. As with other products, students are concerned about the invasion of privacy entailed in tracking their actions on the Internet.

Table 1:
Summary of Pros and Cons for Online Proctoring Approaches

Challenges to Using Online Proctoring Services

The online proctoring approach has some challenges. For one, students perceive live proctoring as an invasion of their privacy because they are taking exams in their home or work areas (Mills, 2015). Second, the online visual proctoring method places onerous restrictions upon students that do not exist in live testing situations. For example, deviations from looking directly at the screen or sitting upright can result in a flag indicating a possible violation to the faculty member. The possible testing rule violation can result in an interruption of the test by the proctor or in a distracting alert message appearing on the screen (Singer, 2015).

Another concern of many students is the theft of personally identifiable information through the online proctoring process. Online proctoring requires students to visually present identification or to use personally identifiable information to verify their identity, creating opportunities for identity theft (Singer, 2015; Skinner, 2015). So much data is gathered during the testing session that students worry about what happens to the data once the session is over (Singer, 2015). This concern has validity. The Family Educational Rights and Privacy Act (FERPA) of 1974 requires institutions to protect the academic record, including course names, grades, and the video session of a proctored exam. Educational institutions are trusting the vendors to keep this information, housed on the vendors’ servers, safe from malicious attacks, yet even the best service providers have experienced theft of data. One recommendation is to keep the video recordings and the identifiable data on the servers for only short periods of time (Kassner, Morgan, & Hayes, 2018). This makes it very important to be able to download and save videos documenting violations onto a local server, because appeals of academic integrity violations can take longer than the company is willing to store the data.

Challenges in using online proctoring exist for instructors as well. The lockdown browsers interfere with the launching of some programs or other Internet browser windows needed in some testing situations. As a direct result, faculty are limited in the types of test questions they can use. For example, students cannot create a spreadsheet, create a multimedia product, or view a specific webpage. Reviewing the yellow and red flags generated by the automated systems or the online proctors takes time, as faculty replay the video clips of the test violations to determine whether the evidence is strong enough to start the academic integrity violation process.

The Internet has also contributed to academic misconduct by providing students with options for circumventing webcam-based online proctoring and for sharing ideas about how to cheat on online tests. For technically savvy students, websites provide detailed step-by-step directions, with diagrams, on how to connect a second monitor using long cables to a second person outside the webcam view to share answers. A communication method is then established between the test taker and the second person providing answers (Hsu, 2013; Schaffhauser, 2016; Smith, 2016).

Identification of Which Types of Technology to Use

Selecting the right technology for monitoring online exams can be challenging. Costs increase with the level of security desired, and no single academic integrity solution can address all the possible needs of the various content areas. Selecting only one product for an institution can be difficult given the variety of content areas and types of tests the institution requires. In those situations, providing a range of products allows instructors the flexibility to choose the approach that is best for their situation. Below are factors that can impact the selection of a proctoring service.

Low-stakes testing requires some level of security assurance. These courses are often lower-level courses with high numbers of students. In these courses, a faculty member may want to use a lockdown browser with video monitoring and a randomized selection of exam sessions for review. For additional security, the videos can be scanned by software that flags possible violations. Because of the high numbers of students, video monitoring with live proctoring or review of the videos by an outside party can act as a deterrent but can be cost prohibitive. Using software that provides the video for faculty review can lower the cost for the institution or the students, whoever is paying the proctoring fees.

High-stakes exams are those required for certification or for courses that deal with critical skills in the field. In this situation, a product that uses live monitors is best, because live monitors can interrupt an exam when a violation is occurring. Typically, the test monitor corrects the situation, and the faculty member can review the exam to determine whether the incident should be considered a violation.

For exams that have a high probability of their test banks being published online, a proctoring tool that secures exam content and monitors Internet activity should be used to determine whether students are searching for answers to the exam. Even instructor-created exams can be posted online shortly after release to a class. To determine whether answers to an exam are available, do a Google or Bing search for the test questions shortly before and after the exam.
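A crude first pass at this before-and-after search can be automated. The sketch below is a minimal illustration under the assumption that search-result pages have already been fetched as raw text by some other means; it merely normalizes case and punctuation and checks whether each question appears verbatim in any page.

```python
import re

def normalize(text):
    """Lowercase and collapse punctuation/whitespace for loose matching."""
    return re.sub(r"[^a-z0-9]+", " ", text.lower()).strip()

def leaked_questions(questions, scraped_pages):
    """Return the questions whose normalized text appears in any page.

    `scraped_pages` is assumed to be raw text already retrieved from
    search-engine results (e.g., tutoring or homework-help sites).
    """
    pages = [normalize(p) for p in scraped_pages]
    return [q for q in questions
            if any(normalize(q) in page for page in pages)]
```

Exact-substring matching misses paraphrased questions, so a real tool would add fuzzy matching; even this sketch, however, catches the common case of questions copied wholesale onto answer sites.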

Another concern in selecting the technology used to proctor online exams is the comfort level of both faculty and students with the technology. Students may be concerned about their privacy and may not want to be observed while taking the exam (Educause Learning Initiative, 2016). Faculty may be concerned about the window that is created into the students’ private home settings. Involving faculty in the selection of the proctoring tools is helpful in determining the comfort level with the technology.

Process in Selecting a Tool

The selection of a proctoring tool requires input from two important stakeholder groups at the institution: technology support and instructors. Technology support needs to be involved because of their knowledge of the learning management system; this team will integrate the proctoring service into the learning management system and provide help desk support for students and instructors in the use of the tools.

Instructors are invaluable in the process. First, they know the content and the types of test questions used in their courses, and the proctoring software must meet their needs. Faculty typically raise several issues. They are often concerned about lockdown browsers blocking the software packages or webpages with multimedia elements that must be viewed or built. Faculty are also interested in knowing what types of test aids can be used in the room; for example, they frequently want students to be able to use blank paper to work out mathematics problems or to make limited use of notes. Computer science faculty will want to know how the system can be broken by their students. Mathematics faculty are very concerned about access to websites that walk students through the steps to solve a problem (which students can copy onto their paper), frequently with the correct answer. Some content areas cannot use video at all because of the students’ circumstances, including, for example, the risk of exposing their roles as undercover police officers.

Students can be a third group of stakeholders, although faculty often already know this group’s concerns. As the proctoring tools were implemented, some students indicated that they did not have the webcams or computers required to take the online proctored tests. Others raised privacy concerns about being recorded in their homes. To address these concerns, arrangements were made for testing to take place on campus. Interestingly, students typically declined the arrangements and instead found the needed equipment or chose to take the online exam.

The first step of this process is to form a committee with the two stakeholder groups. This group should decide which features they would like to see in an online proctoring service. The next step is to establish a set of criteria through a series of meetings. Appendix A has a series of questions that can be used to brainstorm and finalize the questions the committee may want to ask the various vendors. Once the criteria are established, announce the search for an online proctoring service through a competitive bidding process to receive the best pricing. The committee can then evaluate the various vendors against the criteria for the type of proctoring the committee is seeking.

An additional step before full implementation is to conduct a pilot test with the selected proctoring service. During this stage, the technology support team can ensure the smooth integration of the proctoring service into the learning management system, and the faculty can identify additional issues in the deployment of the tool. For example, the pilot test can reveal whether the directions to the students are clear and which types of computers, operating systems, and software work best with the proctoring service.

Alternatives to Technology

For large classes, exams continue to be the assessment of choice because of the grading challenges large classrooms pose. Constructing exams without the online test banks produced by publishers can reduce academic integrity violations. Test-bank exams can be compromised in many ways; remember, they are used by many different instructors and institutions with different goals, and one instructor may post questions for study before the exam. If this happens, the exam questions end up on tutoring websites. The best strategy is to create new exams for the class at regular intervals (Brown, 2016). This also allows the instructor to assess the points that were emphasized in the class presentations and instructional materials. Another strategy is to use the randomization test-building tool built into the learning management system to create a different version of the exam for each student (Brown, 2016). Finally, use a variety of test prompts that require critical thinking; testing beyond multiple-choice questions requires the students to explain their answers (Brown, 2016).
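The per-student randomization idea can be sketched outside any particular learning management system. The example below uses hypothetical names and is not any LMS’s actual tool: it seeds a random generator on the student ID and term so that each student receives a reproducible but distinct draw from a topic-grouped question bank.

```python
import random

def build_exam(question_bank, per_topic, student_id, term):
    """Draw a per-student exam from a bank grouped by topic.

    Seeding on (student_id, term) makes each student's version
    reproducible for later review while varying across students.
    """
    rng = random.Random(f"{student_id}:{term}")
    exam = []
    for topic in sorted(question_bank):          # stable topic order
        exam.extend(rng.sample(question_bank[topic], per_topic))
    rng.shuffle(exam)                            # also vary question order
    return exam
```

Deterministic seeding matters in practice: if a student appeals a grade, the instructor can regenerate exactly the version that student saw.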

Designing online courses to reduce the opportunities to cheat is a good strategy. One approach is to use a variety of assessments within the course. A series of low-stakes assessments or quizzes that build to an end-of-course evaluation builds students’ confidence and gives them feedback on their progress throughout the course. The low-stakes exams or activities must, however, be perceived as important enough for students to put in the effort required to score well (Cole, Bergin, & Whittaker, 2008).

For smaller classes, authentic or project-based assessments can be a good approach for addressing academic integrity. Both types of assessment use projects that allow the students to demonstrate their ability to apply their knowledge of the content while developing other skills critical to future employment. Authentic assessment is typically grounded in real-world tasks. Because the assignments are real world, the content of the projects and assessments changes with real-world events or with changes in the technology used in the content area. Projects are often used in authentic assessment as part of a series of assignments, but they can also be standalone assignments.

The chances of academic violations are reduced with authentic or project-based assignments for several reasons. Students select topics based upon their interests (Ma, Wan, & Lu, 2008), which reduces the availability of downloadable or purchasable papers. If students reuse a paper from a previous class, the instructor is likely to remember that paper. Changing various aspects of the assignment each semester allows the instructor to identify incorrectly formatted papers from a previous semester (Baron & Crooks, 2005); a copied or reused paper will not have the required elements for that semester. Authentic assessments also tend to require extensive writing with rough drafts and revisions (Baron & Crooks, 2005). The frequent writing assignments allow the instructor to learn the students’ writing styles and patterns, promoting quick identification of academic violations when submitted work changes dramatically.
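The writing-style comparison described above can be roughed out with simple stylometric features. This is an illustrative toy rather than a forensic tool: it profiles average sentence length and vocabulary richness, then flags a submission whose sentence length departs sharply from a student’s earlier work.

```python
import re

def style_profile(text):
    """Average sentence length (in words) and vocabulary richness
    (distinct words / total words) for a text sample."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    avg_len = len(words) / max(len(sentences), 1)
    richness = len(set(words)) / max(len(words), 1)
    return avg_len, richness

def style_shift(earlier, later, threshold=0.5):
    """Flag `later` if its average sentence length differs from
    `earlier` by more than `threshold` as a fraction of the baseline."""
    a, _ = style_profile(earlier)
    b, _ = style_profile(later)
    return abs(b - a) / max(a, 1e-9) > threshold
```

A sharp shift is only a prompt for the instructor to look more closely, never proof of misconduct on its own.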

How Online Proctoring Looks at a Large University

Over time at this university, online enrollment has grown to approximately one quarter of total enrollment. Because of the scope of the online enrollment, flexibility in the adoption of online proctoring services was key to the adoption of distance learning at the university. In the beginning, the lockdown browser software was the only product used. The lockdown browser, which now includes video recording and electronic scanning of the videos with flagging of possible aberrant behavior, is still used today because of its low cost: the university purchases blocks of seats and, as use expands, purchases additional seats. The service is used for low-stakes testing and for face-to-face courses that want to use electronic testing, and it is frequently paired with on-campus proctoring in computer classrooms.

As more advanced proctoring services were required in the online courses, a committee was formed to select a second-generation proctoring tool. The company selected offered video proctoring without live proctors at the time of the test; instead, a live proctor rapidly scanned the video once the exam was completed. This allowed for large classroom exams and met the need for high-demand testing at midterms and finals. The cost was lower than that of live proctored services, and the approach addressed the faculty’s primary objection: a live proctor watching during the exam.

The mathematics and science departments had a unique concern about the use of alternative devices to answer mathematics-based problems. Students were able to bypass the lockdown browsers and found ways to use mobile devices hidden in the room during live proctored sessions to access these sites. As a result, another proctoring service was selected that could identify the solicitation of answers through a second device or from friends. Another concern was students taking the exams in large groups in the library, and the science departments needed a way to gather paper-based handwritten diagrams and mathematical solutions. This resulted in a pilot test of a product very new to the market that addressed these specific concerns. The service tracked the use of second devices and could compare IP addresses to determine whether students were taking the exam at the same time and location, and faculty could watch the students’ activity during the exam. The product was pilot tested and adopted by the institution.
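The same-time, same-location check can be sketched generically. The example below is hypothetical and not the vendor’s implementation: it groups session records by IP address and flags addresses where two or more exam sessions overlap in time.

```python
from collections import defaultdict

def colocated_sessions(sessions):
    """Flag IP addresses with time-overlapping exam sessions.

    `sessions` is a list of (student_id, ip, start, end) tuples with
    numeric timestamps. Returns {ip: sorted student_ids} for each IP
    where two or more sessions overlap.
    """
    by_ip = defaultdict(list)
    for student, ip, start, end in sessions:
        by_ip[ip].append((start, end, student))
    flagged = {}
    for ip, spans in by_ip.items():
        spans.sort()
        overlapping = set()
        for i, (s1, e1, a) in enumerate(spans):
            for s2, e2, b in spans[i + 1:]:
                if s2 < e1:  # later session starts before the earlier one ends
                    overlapping.update((a, b))
        if overlapping:
            flagged[ip] = sorted(overlapping)
    return flagged
```

Shared campus networks or home routers put many legitimate students behind one address, so a flag here is a starting point for review, not evidence by itself.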

Currently, three different proctoring services are available and supported by the technology and instructional design teams. The instructional designers walk faculty through the different options available to protect academic integrity in their classes, encouraging the non-technology methods as the best strategies. Then, based upon their unique needs, the faculty select the proctoring service that best meets those needs. The added benefit is that faculty are more willing to migrate their courses into an online format.


Faculty are concerned about the ease of cheating on online exams (Mills, 2015). This concern often prevents the adoption of distance learning as a delivery option and raises questions about the quality of the courses. Online proctoring has become a viable way both to verify the students in the classes and to monitor the quality of programs. Selecting the best method for monitoring academic integrity should be based upon the content being tested and the comfort level with the technology for both students and faculty.

No one academic integrity product seems to be sufficient to address all the difficulties in protecting and maintaining academic integrity in distance learning courses. However, similar challenges exist in the face-to-face testing environments. Faculty will continue to balance the evolving ingenuity of students as they create new ways to cheat in both online and traditional environments. Students will use technology in new ways. Faculty will need to find new ways to counter those efforts with new products.


References

Alessio, H. M., Malay, N., Maurer, N. J. K., Bailer, A. J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1), 146-161.

Baime, D. S., & Mullin, C. M. (2012). Preventing abuse in federal student aid: Community college practices. Washington, DC: American Association of Community Colleges.

Baron, J., & Crooks, S. M. (2005). Academic integrity in web based distance education. TechTrends, 49(2), 40-45.

Bart, M. (2009). Understanding the HEOA’s student authentication provision for distance education programs. Faculty Focus. Retrieved from https://www.facultyfocus.com/articles/distance-learning/understanding-the-heoa-student-authentication-provision-for-distance-education-programs/

Beck, V. (2014). Testing a model to predict online cheating: Much ado about nothing. Active Learning in Higher Education, 15(1), 65-75.

Berkey, D., & Halfond, J. (2015). Cheating, student authentication, and proctoring in online programs. The New England Journal of Higher Education. Retrieved from http://www.nebhe.org/thejournal/cheating-student-authentication-and-proctoring-in-online-programs/

Brown, B. (2016). Online testing, is it fair? eLearning Magazine, 2016(2). Retrieved from http://elearnmag.acm.org/featured.cfm?aid=2893355

Burnett, A. J., Enyeart Smith, T. M., & Wessel, M. T. (2016). Use of the social cognitive theory to frame university students’ perceptions of cheating. Journal of Academic Ethics, 14(1), 49-69.

Cole, J. S., Bergin, D. A., & Whittaker, T. A. (2008). Predicting student achievement for low stakes tests with effort and task value. Contemporary Educational Psychology, 22(3), 609-624.

Educause Learning Initiative. (2016). 7 things you should know about remote proctoring. Educause. Retrieved from https://library.educause.edu/~/media/files/library/2016/5/eli7133.pdf

Fask, A., Englander, F., & Wang, Z. (2014). Do online exams facilitate cheating? An experiment designed to separate possible cheating from the effect of the online test-taking environment. Journal of Academic Ethics, 12(2), 101-112.

Harmon, O. R., Lambrinos, J., & Kennedy, P. (2008). Are online exams an invitation to cheat? Journal of Economic Education, 39(2), 116-125.

Hsu, S. (2013). How to beat online exam proctoring. Retrieved from http://infoproc.blogspot.com/2013/04/how-to-cheat-online-exam-proctoring.html

Jenkins, B. (2012). 10 cleverest ways to cheat on a test. ODDEE. Retrieved from http://www.oddee.com/item_98395.aspx

Kassner, D., Morgan, J., & Hayes, F. (2018). Adhering to FERPA while computing in the cloud and proctoring exams online. Online Learning Consortium. Retrieved from https://secure.onlinelearningconsortium.org/effective_practices/adhering-ferpa-while-computing-cloud-and-proctoring-exams-online

King, C. G., Guyette, R. W., Jr., & Piotrowski, C. (2009). Online exams and cheating: An empirical analysis of business students' views. Journal of Educators Online, 6(1), 1-11.

Krueger, K. (2015). How to catch students cheating on online tests. Mediashift. Retrieved from http://mediashift.org/2015/08/how-to-catch-students-cheating-on-online-tests/

Ma, H. J., Wan, G., & Lu, E. Y. (2008). Digital cheating and plagiarism in schools. Theory into Practice, 47(3), 197-203.

Mills, P. (2015, June). Ensuring academic integrity in higher ed. University Business. Retrieved from https://www.universitybusiness.com/article/ensuring-academic-integrity-higher-ed

My PC Channel. (2014). How to cheat on some online multiple choice tests. Retrieved from https://www.youtube.com/watch?v=kxnCLxiSQuc

Newton, D. (2015). Cheating in online classes is now big business. The Atlantic. Retrieved from http://www.theatlantic.com/education/archive/2015/11/cheating-through-online-courses/413770/

Office of Inspector General. (2011). Investigative program advisory report: Distance education fraud rings (Control No. L42L0001). Washington, DC: U.S. Department of Education. Retrieved from https://www2.ed.gov/about/offices/list/oig/invtreports/l42l0001.pdf

Respondus. (2017). Changing the world of assessments. Retrieved from http://www.respondus.com

Schaffhauser, D. (2016). How students try to bamboozle online proctors. Campus Technology. Retrieved from https://campustechnology.com/articles/2016/04/06/how-students-try-to-bamboozle-online-proctors.aspx

Singer, N. (2015). Technology: Online test-takers feel anti-cheating software’s uneasy glare. The New York Times. Retrieved from http://www.nytimes.com/2015/04/06/technology/online-test-takers-feel-anti-cheating-softwares-uneasy-glare.html?_r=0

Skinner, V. (2015). Students object to online courses recording facial features, knuckles, voice. EAG News. Retrieved from http://eagnews.org/students-object-to-online-courses-recording-facial-features-knuckles-voice/

Smith, A. (2016). Beating, cheating, and defeating online proctoring. Executive Academics. Retrieved from http://www.executiveacademics.com/#!Beating-Cheating-and-Defeating-Online-Proctoring/ck38/568c3fd40cf223ef44cb280b

Smyth, M. L., & Davis, J. R. (2003). An examination of student cheating in the two-year college. Community College Review, 31(1), 17-32.

Watson, G., & Sottile, J. (2010). Cheating in the digital age: Do students cheat more in online courses? Online Journal of Distance Learning Administration, 13(1).

Appendix A

Suggested Questions for Evaluation of Online Proctoring Services


Data Privacy and Storage

  1. Can the proctoring service produce certification that the service and storage requirements meet FERPA regulations?
  2. Where does the proctoring service store the data? (Confirm that storage is inside the United States)
  3. How long will the exams be stored? (The exams need to remain available for the grade appeal process)


Scalability

  1. Can the company scale up to handle multiple classes at peak times in the semester, such as midterms and finals?
  2. How many exams will each proctor be monitoring at one time? (for live proctoring services)
  3. Does the company have the capability to handle large classes in a short time frame?
  4. Does the student need to make an appointment? Can the student request immediate proctoring for an exam?

Integration into the Learning Management Systems

  1. Is the professor able to create the testing session within the learning management system, or is a secondary system required?
  2. Is the professor able to view a dashboard of testing sessions flagged by the system?
  3. How easy is it for the student to access the tests?
  4. What technology is the student required to have to use the software? Browser? Operating system?
  5. Can the software be downloaded and used easily by the student?
  6. How does the software work if the test is not housed in the institution’s learning management system but linked through the system to a publisher’s test bank?
  7. Does the company have 24/7 support service or does the institution provide help desk services?

Student Identification System

  1. What processes does the proctoring tool use to validate the student?
  2. Video systems
    1. If the student is displaying their ID card, can that screen be viewed separately from the test?
    2. Does the system compare the ID card picture to the student’s facial features?
  3. Biometric systems
    1. How accurate are the biometric systems used (e.g., fingerprinting or typing-pattern analysis)?
    2. Can a biometric profile be reset if the student’s characteristics change?

Test Implementation

  1. Is there a place for the student to review the testing rules and verify they understand the rules?
  2. What types of exams can be monitored by the software?
  3. For classes that require work to be handwritten, is there a process to verify the submission?
  4. If the test requires opening a program such as Excel or a web browser, can this be done outside of the lockdown browser?
  5. Can the faculty member indicate whether the student should be interrupted during the exam for not following the stated exam rules or for displaying aberrant behavior?
  6. Will the proctor stop the exam if blatant cheating occurs?

Reporting of Aberrant Behavior

  1. Is the dashboard easy to view, interpret, and use?
  2. What types of aberrant behavior will the software flag?
  3. Can the faculty members determine the aberrant behavior they want flagged?
  4. How easy is it to view and evaluate the aberrant behavior when flagged by the service?
  5. Is it possible to save the video or evidence of the aberrant behavior for later challenges to disciplinary actions?


Online Journal of Distance Learning Administration, Volume XXI, Number 1, Spring 2018
University of West Georgia, Distance Education Center