Predicting Student Performance in Web-Based Distance Education Courses Based on Survey Instruments Measuring Personality Traits and Technical Skills



Michael Hall, Ph.D., P.E.
Ivy Tech Community College
mhall@ivytech.edu

Abstract

Two common web-based surveys, “Is Online Learning Right for Me?” and “What Technical Skills Do I Need?”, were combined into a single survey instrument and given to 228 on-campus and 83 distance education students. The students were enrolled in four different classes (business, computer information services, criminal justice, and early childhood development) on three different regional campuses of a Midwestern community college. Multiple regression equations were developed with the survey scores and the type of class to determine the extent to which the survey scores predicted final semester grade percentages. Although distance education students scored significantly higher on the survey instruments than on-campus students, the survey scores explained only 8% of the observed variance in their final grade percentage. The course in which they enrolled (business, computer information services, criminal justice, or early childhood development) explained most of the variance in final grade percentage. There was no significant difference in survey scores between students who withdrew from their classes and those who remained to receive a final semester grade. Recommendations for further research are suggested.

Introduction

There is consensus that the number of students receiving an education through distance learning has continued to grow steadily (Saba, 2005). In 2004 the National Center for Education Statistics (NCES) reported that course enrollments in distance education increased from 1.7 million in 1997-98 to 3.1 million in the 2000-01 academic year. Not only are institutions offering individual courses via distance education, but the number of entire programs has increased as well. Among institutions offering any distance education courses, the NCES reported that the proportion offering degree programs completed entirely via distance education increased from 22% to 30% between 1997-98 and 2000-01. Certificate programs increased from 7% to 16% during the same period. Although various technologies can be used to transmit educational content, the dominant delivery system is the Internet coupled with browser-based technology (Compora, 2003; Howell, Williams, & Lindsay, 2003).

However, it seems reasonable to assume that not all students find the web-based distance education learning environment as attractive as a traditional face-to-face environment (Lorenzetti, 2005a). A number of researchers have suggested that the retention rate for distance education students is lower than for traditional face-to-face students (Nash, 2005; O'Brien & Reneer, 2002; Scalese, 2001). Administrators interviewed for The Chronicle of Higher Education (Carr, 2000) agreed that course completion rates in distance education courses are often 10-20 percentage points lower than in traditional classes. These lower retention rates suggest that some type of pre-assessment might be useful in identifying students who could have problems in a distance education environment.

Several researchers have discussed the need for self-assessments for potential distance education students (Biner, Bink, Huffman, & Dean, 1995; Buchanan, 1999; Lorenzetti, 2005b; Maki & Maki, 2003; Restauri, 2004; Valasek, 2001). Kidder (2004) reported that 77% of faculty surveyed considered pre-assessment of students' technological skills important to the success of on-line classes, but only 29% indicated that such pre-assessment was present at their institution; the remaining 48% regarded pre-assessment as important yet not present. However, commonly used readiness questionnaires given to students prior to the start of a course may not be useful in selecting students who will do well in and be satisfied with technology-mediated courses (Maki & Maki).

Purpose of the Study

The purpose of this study was to determine the extent to which two commonly available survey instruments, “Is Online Learning Right for Me?” and “What Technical Skills Do I Need?”, are accurate predictors of student performance in web-based distance education courses. The survey “Is Online Learning Right for Me?” was developed by the Northern Virginia Community College Extended Learning Institute. It is a 16-question, forced-response, self-scoring guide with face validity for individual traits and skills believed to contribute to potential success in an Internet-based course (Buchanan, 1999; Noah, 2001). Each multiple-choice question offers three answers, of which students select one: answer “A” is worth 3 points, “B” 2 points, and “C” 1 point. The total score is the sum of the individual question scores; the higher the total score, the greater the presumption of success in a distance education course. A copy of this survey is provided in Appendix A.

The survey “What Technical Skills Do I Need?”, provided in Appendix B, was developed by Palm Beach Community College. It is a 15-question, forced-response, self-scoring guide with face validity for computer self-efficacy. Most of the skills assessed are directly related to Internet applications such as e-mail, chat rooms, and Internet searches. It uses the same scoring scheme as the first survey: each multiple-choice question offers three answers worth 3, 2, or 1 point, and higher total scores indicate a greater presumption of success in a distance education course.

Both surveys provide one of three responses based on the total score achieved by the respondent. Tables 1 and 2 list the three responses and the total scores needed to receive the indicated response.

Table 1

“Is Online Learning Right for Me?” Survey Responses and Total Score Ranges

Abbreviated Survey Response                           Total Score Range
“You are well suited …”                               36-48
“You may succeed in an online course, but …”          28-35
“You are not well suited …”                           16-27

McMillan and Schumacher (2001) define face validity as “a judgment that the items appear to be relevant” (p. 241). An examination of the surveys (Appendices A and B) reveals that the items included have the appearance of relevancy to individual traits and skills believed necessary for success in distance education. Yet face validity is considered the weakest form of construct validity (Trochim, 2006, Face Validity section, ¶ 1).

Table 2

“What Technical Skills Do I Need?” Survey Responses and Total Score Ranges

Abbreviated Survey Response                           Total Score Range
“…you have the technical skills and knowledge”        33-45
“You may succeed in an online course, but …”          26-32
“…your present technical skills are not sufficient”   15-25
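
Because both instruments share the same mechanics, their scoring and banding logic is straightforward to express in code. The following minimal sketch (in Python; all names are illustrative and not part of either instrument) totals a respondent's A/B/C answers and maps the total to the abbreviated responses of Tables 1 and 2:

    # Minimal sketch of the surveys' self-scoring logic; names are illustrative.
    POINTS = {"A": 3, "B": 2, "C": 1}

    # Cut scores from Tables 1 and 2: (lower bound of range, abbreviated response).
    ONLINE_LEARNING_BANDS = [
        (36, "You are well suited ..."),
        (28, "You may succeed in an online course, but ..."),
        (16, "You are not well suited ..."),
    ]
    TECH_SKILLS_BANDS = [
        (33, "...you have the technical skills and knowledge"),
        (26, "You may succeed in an online course, but ..."),
        (15, "...your present technical skills are not sufficient"),
    ]

    def total_score(answers):
        """Sum the point values of a list of 'A'/'B'/'C' responses."""
        return sum(POINTS[a] for a in answers)

    def advice(score, bands):
        """Return the abbreviated response whose score range contains `score`."""
        for lower_bound, response in bands:
            if score >= lower_bound:
                return response
        raise ValueError("score is below the instrument's minimum")

    answers = ["A"] * 10 + ["B"] * 6        # a hypothetical 16-item response set
    score = total_score(answers)            # 10*3 + 6*2 = 42
    print(score, advice(score, ONLINE_LEARNING_BANDS))  # falls in the 36-48 band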

Predictive validity is the “ability to predict something it should theoretically be able to predict” (Trochim, 2006, Predictive Validity section, ¶ 1). Predictive validity is determined by establishing the relationship between scores on an assessment and some measure of success in the situation of interest. The criterion of successful future prediction makes predictive validity stronger than face validity.

This study is a predictive validity study of the two survey instruments, “Is Online Learning Right for Me?” and “What Technical Skills Do I Need?” The two research questions addressed by this study were:

1. To what extent is the survey instrument “Is Online Learning Right for Me?” an accurate predictor of student performance in web-based distance education courses?

2. To what extent is the survey instrument “What Technical Skills Do I Need?” an accurate predictor of student performance in web-based distance education courses?

Summary of Procedures

Introduction

Both surveys were combined into a single instrument. A paper-and-pencil version was distributed to students in traditional on-campus classes. A web-based version was developed for use by students enrolled in distance education classes.

The combined survey was then administered during the first two weeks of the semester to distance education and traditional on-campus students at three different regional campuses of a Midwestern community college. Students from four different programs were sampled: business (BUS), computer information services (CIS), criminal justice (CRJ), and early childhood development (ECE). These programs were selected to ensure a variety of computer skill levels.

Participants

The participants consisted of 351 community college students from three different regional campuses of a Midwestern community college. Of the 345 students reporting their gender, 51.6% were female and 48.4% were male. A total of 341 students reported their ethnicity. The ethnic breakdown was 76.5% White Caucasian, 15.0% African American, 4.4% Hispanic, 2.0% Multiracial/Other, 1.5% Pacific Islander/Asian, and 0.6% American Indian. The average age was 27.6 (SD = 9.1) with a minimum of 18 and a maximum of 65 years.

Returned Surveys

Surveys were distributed to 340 students enrolled in traditional on-campus courses and 289 students enrolled in web-based distance education classes. Valid responses were obtained from 261 traditional on-campus students and 90 web-based distance education students, for return rates of 76.8% and 31.1%, respectively.

Table 3 presents the survey return results by regional campus site. Site 1 was located within a large urban population center and had a total enrollment in excess of 14,000 full-time and part-time students. Site 2 was located near a mid-sized city and had a total enrollment of just over 7,800 students. Site 3, the smallest site, was located in a more rural setting and had a total enrollment of just under 2,600 students. Table 4 presents the survey return results by course category (BUS, CIS, CRJ, and ECE).

Table 3

Total Returned Surveys by Site

Type of Instruction       Site 1    Site 2    Site 3    Subtotals
On-campus Instruction        156        80        25          261
Web-based Instruction         63         8        19           90
Subtotals                    219        88        44          351

Table 4

Total Returned Surveys by Course

                              Academic Course
Type of Instruction       BUS    CIS    CRJ    ECE    Subtotals
On-campus Instruction      65     78     78     40          261
Web-based Instruction      17     42     14     17           90
Subtotals                  82    120     92     57          351

During the course of the semester several students opted to drop their courses. Of the original 351 students who returned surveys, 40 later dropped, leaving 311 students who received final grade percentages at the end of the semester. Table 5 lists the number of students in each course who returned a survey and received a final semester grade percentage for completing the course. Table 6 lists the means and standard deviations of the scores from the two surveys, by type of instruction, for the participants completing their courses.

Table 7 lists the percentage of participants receiving advice in the three categories listed in Table 1 for the survey “Is Online Learning Right for Me?” Table 8 lists the percentage of participants receiving advice in the three categories listed in Table 2 for the survey “What Technical Skills Do I Need?” Note that the majority of participants were advised that they possessed both the personality traits and the technical skills needed to be successful in distance education.

Table 5

Number of Students Completing Both the Survey and Their Course, by Course

                              Academic Course
Type of Instruction       BUS    CIS    CRJ    ECE    Subtotals
On-campus Instruction      57     69     68     34          228
Web-based Instruction      17     37     12     17           83
Subtotals                  74    106     80     51          311

Table 6

Means and Standard Deviations for Both Surveys by Type of Instruction

        Is Online Learning Right for Me?      What Technical Skills Do I Need?
        On-campus    Distance Education       On-campus    Distance Education
M            37.0                  40.0            38.6                  40.8
SD            3.5                   3.5             5.3                   3.7

Table 7

Percentage of Participants Receiving Advice from “Is Online Learning Right for Me?”

Abbreviated Survey Response                           On-campus    Distance Education
“You are well suited …”                                    71.5                  91.6
“You may succeed in an online course, but …”               28.1                   8.4
“You are not well suited …”                                 0.4                   0.0

Table 8

Percentage of Participants Receiving Advice from “What Technical Skills Do I Need?”

Abbreviated Survey Response                           On-campus    Distance Education
“…you have the technical skills and knowledge”             72.0                  89.2
“You may succeed in an online course, but …”               26.2                  10.8
“…your present technical skills are not sufficient”         1.8                   0.0

Statistical Analysis

This study is an evaluation of the predictive validity of the two survey instruments. To answer the two research questions, several regression models were constructed. The general form of the multiple regression equation was:

FGP = b1(Online_Learning) + b2(Tech_Skills) + b3(x3) + b4(x4) + b5(x5) + c

where:

FGP = final grade percentage
Online_Learning = score from the survey “Is Online Learning Right for Me?”
Tech_Skills = score from the survey “What Technical Skills Do I Need?”
x3, x4, x5 = three dichotomous variables representing course type (see Table 9)
b1 through b5 = regression coefficients for the corresponding independent variables
c = constant

The independent variables consisted of the scores from the two survey instruments and the type of course (BUS, CIS, CRJ, or ECE) taken by the student. The four course types were coded into the regression equation using three dichotomous variables (Table 9). The dependent variable was the student’s final grade expressed as a percentage. A total of six regression models were constructed. The first three models were generated from data obtained from on-campus students: the first used all five independent variables, the second contained only the two variables representing the survey instrument scores, and the third employed only the three dichotomous variables representing the type of course. The remaining three regression models, constructed from data obtained from the distance education students, followed the same format as the three on-campus models. In addition to the multiple regression analysis, one-way ANOVAs were conducted comparing average survey scores between students who completed their courses and those who dropped.

Table 9

Values for Dichotomous Variables x3, x4, and x5

Type of Class    Value for x3    Value for x4    Value for x5
BUS                         0               0               0
CIS                         1               0               0
CRJ                         0               1               0
ECE                         0               0               1
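
As an illustration of how such a model can be fitted, the sketch below uses Python with statsmodels and the dummy coding of Table 9. The data are randomly generated placeholders, and statsmodels is an assumed tool rather than the software actually used in this study:

    # Sketch of the complete regression model with hypothetical data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 228                                    # on-campus sample size (Table 10)
    online_learning = rng.integers(16, 49, n)  # survey scores within valid ranges
    tech_skills = rng.integers(15, 46, n)
    course = rng.choice(["BUS", "CIS", "CRJ", "ECE"], n)

    # Dummy coding from Table 9; BUS is the reference category (all zeros).
    x3 = (course == "CIS").astype(float)
    x4 = (course == "CRJ").astype(float)
    x5 = (course == "ECE").astype(float)

    fgp = rng.uniform(50, 100, n)              # placeholder final grade percentages
    X = sm.add_constant(np.column_stack([online_learning, tech_skills, x3, x4, x5]))

    model = sm.OLS(fgp, X).fit()
    print(model.rsquared_adj)                  # compare across nested models (Table 10)

Dropping columns from X reproduces the survey-scores-only and categorical-variables-only models, so the three adjusted R² values for each participant group can be compared directly.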

Statistical Results

Regression Models

The categorical variables describing the course taken by the participants explained more of the variance in final grade percentages than the two survey scores did. This can be seen by comparing the adjusted R² values for the different regression models applied to the same set of participants (Table 10).

Table 10

Comparison of Regression Model Statistics

Regression Model                                   n    Adjusted R²    F                  p
On-campus complete regression model              228            .07    F(5,222) = 4.62    < .001
On-campus survey scores only                     228           -.01    F(2,227) = .06     .94
On-campus categorical variables only             228            .08    F(3,224) = 7.71    < .001
Distance education complete regression model      83            .20    F(5,77) = 4.96     .001
Distance education survey scores only             83            .08    F(2,80) = 4.66     .01
Distance education categorical variables only     83            .11    F(3,79) = 4.45     .006

For the on-campus participants, the categorical variables accounted for essentially all of the explained variance in final grade percentage: the categorical-variables-only model (adjusted R² = .08) performed as well as the complete model (adjusted R² = .07), while the survey scores alone explained none. The categorical variables also explained the larger share of the variance for the distance education participants, accounting for approximately 11% of the observed variance in their final grade percentages.

Only 8% of the observed variance in the final grade percentage of distance education students was explained by the two survey scores. Thus, the course in which a student enrolled is more predictive of the final grade percentage than the student's scores on either of the two surveys administered.
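
These comparisons rest on adjusted R², which discounts R² for the number of predictors so that models with different numbers of variables can be compared on an equal footing: adjusted R² = 1 - (1 - R²)(n - 1)/(n - p - 1). A minimal check in Python (the raw R² shown is hypothetical, since only adjusted values are reported in Table 10):

    def adjusted_r2(r2, n, p):
        """Adjust R-squared for n observations and p predictors (standard formula)."""
        return 1 - (1 - r2) * (n - 1) / (n - p - 1)

    # Distance education, survey scores only: n = 83, p = 2 (Table 10).
    # A hypothetical raw R-squared of .10 adjusts to roughly the reported .08.
    print(f"{adjusted_r2(0.10, 83, 2):.2f}")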

Analysis of Variance (ANOVA)

A one-way ANOVA was conducted to determine if there were differences in the “Is Online Learning Right for Me?” survey scores between students who withdrew from a class and those who received a final grade. A total of 40 students chose to withdraw from their particular class out of the original sample of 351 participants. A random sample of 40 students was selected from the remaining 311 students who completed their coursework and received a grade. There was not a statistically significant difference in the “Is Online Learning Right for Me?” survey scores between the two groups, F (1,78) = 1.66, p = .20.

A second one-way ANOVA was conducted to determine if there were differences in the “What Technical Skills Do I Need?” survey scores between students who withdrew from a class and those who received a final grade. Using the same sample of 80 participants (40 completers, 40 non-completers), there was not a statistically significant difference in the “What Technical Skills Do I Need?” survey scores between the two groups, F (1,78) = .47, p = .49.
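
Both comparisons can be reproduced with any standard statistics package. A minimal sketch in Python using scipy, with randomly generated placeholder scores standing in for the unpublished raw data:

    # One-way ANOVA comparing survey scores of completers and withdrawers.
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(1)
    completers = rng.integers(16, 49, 40)    # 40 randomly sampled completers
    withdrawers = rng.integers(16, 49, 40)   # the 40 students who withdrew

    f_stat, p_value = f_oneway(completers, withdrawers)
    print(f"F(1, 78) = {f_stat:.2f}, p = {p_value:.2f}")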

Conclusions

This study suggests that the two survey instruments have little predictive value in determining a student’s final semester grade percentage, for three reasons. First, the independent variables representing the survey instruments explained only 8% of the observed variance in the final grade percentages of distance education students, less than half of the variance explained by the complete regression model for those students (adjusted R² = .20). The categorical variables (representing the course in which the student enrolled) explained more of the observed variance in final grade percentages in both models.

Second, the feedback provided by the surveys is categorical. Students do not receive a score after completing the survey; instead, they receive one of three responses. As shown in Tables 7 and 8, most participants received positive feedback regarding their traits or technical skills. Compared with numerical feedback, such categorical responses discriminate poorly among participants. Finally, the one-way ANOVAs found no significant difference in survey scores between students who withdrew from their classes and those who received a final grade percentage.

Recommendations for Further Research

The evidence suggests that the two surveys “Is Online Learning Right for Me?” and “What Technical Skills Do I Need?” have little ability to predict student performance in distance education courses. Any use of these instruments, particularly by institutions of higher education for counseling or advising, should be carefully considered. The primary value of the surveys may lie in raising awareness among students considering enrolling in a distance education course, since the items listed in the surveys reflect individual traits and technical skills generally believed necessary for success in such courses. However, the rising use of the Internet for instructional delivery, coupled with the desire to improve student retention, continues to generate a need for a viable prediction instrument for advising students considering distance education courses. Four recommendations are provided below.

Recommendation One

The first recommendation is to employ a different statistical technique. Instead of developing multiple regression models to predict final grade percentages, logistic regression or discriminant function analysis could be employed. The dependent variable, student success, would be defined as passing the course with a grade of “D” or higher, dividing participants into two groups: those who pass and those who fail. The two survey scores would serve as the independent variables.
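
A minimal sketch of this approach in Python using statsmodels follows; the data, the 60% cutoff standing in for a grade of “D,” and the library choice are all assumptions made for illustration:

    # Logistic regression of pass/fail on the two survey scores (hypothetical data).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 83
    online_learning = rng.integers(16, 49, n)
    tech_skills = rng.integers(15, 46, n)
    fgp = rng.uniform(40, 100, n)              # placeholder final grade percentages

    passed = (fgp >= 60).astype(int)           # assume "D" or higher maps to >= 60%
    X = sm.add_constant(np.column_stack([online_learning, tech_skills]))

    result = sm.Logit(passed, X).fit(disp=False)
    print(result.summary())                    # coefficients, p-values, pseudo R-squared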

Recommendation Two

Another recommendation for further research is to explore the use of other existing assessments as potential predictors of student performance in distance education courses. One set of tools that could be considered is the family of instruments measuring the Big Five personality factors. Unlike earlier trait measures derived from personality theories, the Big Five personality traits are empirically derived from statistical analysis of traits that occur together within the population (Srivastava, 2006). The Big Five personality traits, or personality dimensions, are (a) Openness to Experience, (b) Conscientiousness, (c) Extraversion, (d) Agreeableness, and (e) Neuroticism, commonly abbreviated as OCEAN.

Several inventories have been developed to measure the Big Five personality factors. These include the Big Five Inventory (BFI; Benet-Martínez & John, 1998), the International Personality Item Pool (IPIP; Pennsylvania State University, n.d.), and the NEO Five-Factor Inventory (NEO-FFI; Costa & McCrae, 1992).

Another potential assessment for further study is the Readiness for Online Learning questionnaire previously studied by Smith, Murphy, and Mahoney (2003) and Smith (2005). Smith (2005) suggests that further work on the two items that did not load distinctively on a factor may enhance the value of this instrument. An additional attraction of the Readiness for Online Learning questionnaire is its brevity (13 items); further items could be added to enhance its suitability for assessing readiness and potentially predicting performance.

Recommendation Three

Another suggestion for further research is to investigate the possibility of improving the predictive validity of the existing surveys. A factor analysis may suggest which items should be rewritten or deleted from the surveys.
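
As one possible starting point, the sketch below runs an exploratory factor analysis over hypothetical item-level responses to the 31 combined items (16 + 15) using scikit-learn; the two-factor solution and the 0.30 loading threshold are illustrative conventions, not findings of this study:

    # Exploratory factor analysis of the combined survey items (hypothetical data).
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(3)
    items = rng.integers(1, 4, size=(311, 31)).astype(float)  # respondents x items

    fa = FactorAnalysis(n_components=2)        # number of factors is a modeling choice
    fa.fit(items)

    # Items with uniformly weak loadings are candidates for rewriting or deletion.
    loadings = fa.components_.T                # shape: (31 items, 2 factors)
    weak = np.where(np.max(np.abs(loadings), axis=1) < 0.30)[0]
    print("Candidate items for revision:", weak)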

Recommendation Four

The final recommendation is to develop a completely new survey instrument.

Limitations

Efforts were made to improve the generalizability of this study. Three different community college campus sites, each with unique characteristics, were chosen. Four different programs, each with a different emphasis on the applicability of computers to their subject, were sampled. These steps were taken in an attempt to create a more heterogeneous sample. However, the generalizability of this study is constrained by the following limitations:

  1. Return rates reflect what is typically expected for in-class surveys and for Internet-based distance education surveys.
  2. Participants may not respond honestly to each survey item.
  3. The participant pool was limited to students enrolled in community college courses.
  4. The participant pool was a convenience sample drawn from students self-selecting into on-campus or distance education courses.

One limitation, however, has particular significance for this study, for the recommendations for further research, and for the implications for practice: the second limitation, which states that “participants may not respond honestly to each survey item.”

Several researchers have noted the tendency for individuals to inflate their skills in both social and intellectual domains (Dunning, Heath, & Suls, 2004; Kruger & Dunning, 1999; Strube, Lot, Le-Xuan-Hy, Oxenburg, & Deichmann, 1986). This self-inflation of skills may be one contributor to the low predictive validity of the two surveys studied in this research. Researchers pursuing any of the four recommendations made earlier will need to take this tendency into consideration when modifying or designing any self-assessment for distance education students. Paradoxically, Kruger and Dunning (1999) note that improving the skills of participants also improved their self-assessments of the same skill set.

Implications for Practice

The rising enrollments in distance education classes, coupled with lower retention of distance education students, have led researchers to discuss the need for self-assessments for potential distance education students (Biner, Bink, Huffman, & Dean, 1995; Buchanan, 1999; Lorenzetti, 2005a; Maki & Maki, 2003; Restauri, 2004; Valasek, 2001). Although the two surveys “Is Online Learning Right for Me?” and “What Technical Skills Do I Need?” are readily available and widely used, their lack of predictive validity should discourage their continued use by organizations and institutions of higher education. Whatever face validity the surveys have in terms of the conventional wisdom about the traits and skills needed for success in distance education, their lack of predictive validity should remove these instruments as candidates for self-assessments of potential distance education students.

Until a more suitable instrument is developed, institutions of higher education should utilize two processes to prepare potential distance education students for academic success and to improve retention in distance education courses. First, institutions should continue their current practices with regard to academic pre-assessment of incoming students, regardless of the method of instruction. Second, institutions should provide orientation sessions for students seeking to enroll in distance education courses. In a study of predictors of student success in online classes, Wojciechowski and Palmer (2005) found that participation in an optional orientation session prior to taking the online class had the second highest relationship to the final grade received in that class, both for the overall student population (r = .338, p < .001) and within the group of students earning a grade of “C” or better (r = .240, p = .012). In a study of 478 distance education students, 57% of those receiving a “D”, an “F”, or an incomplete (“I”) in a distance education course felt that an orientation to distance learning prior to the start of the course would have been beneficial (Nash, 2005). Restauri (2004) believes such orientation sessions should be mandatory for distance learning students.

Some institutions, such as the community college in this study, currently offer a mandatory general orientation session for all incoming students. This session is designed to introduce students to general study skills, time management skills, and services available for academic success. It may be possible to incorporate additional content related to success in distance education courses. This could include introducing the basic technical skills identified for success (Dupin-Bryant, 2004; Osika & Sharp, 2003) along with any study skills, time management skills, or services considered unique to the distance learning environment. Those institutions without such an orientation should consider implementing such sessions.


References

Benet-Martínez, V., & John, O. P. (1998). Los Cinco Grandes across cultures and ethnic groups: Multitrait-multimethod analyses of the Big Five in Spanish and English. Journal of Personality and Social Psychology, 75, 729-750. Retrieved November 26, 2007, from the Testmaster, Inc. site: http://www.testmasterinc.com/products/

Biner, P., Bink, M. L., Huffman, M. L., & Dean, R. S. (1995). Personality characteristics differentiating and predicting the achievement of televised-course students and traditional-course students. The American Journal of Distance Education, 9, 46-60.

Buchanan, E. A. (1999). Assessment measures: Pre-tests for successful distance teaching and learning? Journal of Distance Learning Administration, 2(4). Retrieved January 30, 2006, from http://www.westga.edu/~distance/buchanan24.html

Carr, S. (2000, February 11). As distance education comes of age, the challenge is keeping the students. The Chronicle of Higher Education, A39.

Compora, D. E. (2003). Current trends in distance education: An administrative model. Online Journal of Distance Learning Administration, 6(2). Retrieved May 27, 2008 from http://www.westga.edu/~distance/ojdla/summer62/compora62.html

Costa, P. T., Jr., & McCrae, R. R. (1992). Revised NEO Personality Inventory (NEO-PI-R) and NEO Five-Factor Inventory (NEO-FFI) professional manual. Odessa, FL: Psychological Assessment Resources.

Dupin-Bryant, P. A. (2004). Pre-entry variables related to retention in online distance education. American Journal of Distance Education, 18, 199-206.

Howell, S. L., Williams, P. B., & Lindsay, N. K. (2003). Thirty-two trends affecting distance education: An informed foundation for strategic planning. Online Journal of Distance Learning Administration, 6(3). Retrieved May 27, 2008 from http://www.westga.edu/~distance/ojdla/fall63/howell63.html

Kidder, M. L. (2004). Institutional elements that encourage or discourage the implementation of on-line classes in Illinois community colleges (Doctoral dissertation, Northern Illinois University, 2004). Dissertation Abstracts International-A 65 (05). (UMI No. 3132423)

Lorenzetti, J. P. (2005a). Lessons learned about student issues in online learning. Distance Education Report, 9(6), 4-5.

Lorenzetti, J. P. (2005b). Secrets of online success: Lessons learned from community colleges. Distance Education Report, 9(11), 3-6.

Maki, R. H., & Maki, W. S. (2003). Prediction of learning and satisfaction in web-based and lecture courses. Journal of Educational Computing Research, 28, 197-219.

McMillan, J. H., & Schumacher, S. (2001). Research in education: A conceptual introduction (5th ed.). New York: Addison Wesley.

Nash, R. D. (2005). Course completion rates among distance learners: Identifying possible methods to improve retention. Online Journal of Distance Learning Administration 8(4). Retrieved July 29, 2006 from http://www.westga.edu/~distance/ojdla/winter84/nash84.htm

National Center for Education Statistics. (2004). Contexts of postsecondary education: Learning opportunities: Distance education at postsecondary institutions. Retrieved September 23, 2006, from http://nces.ed.gov/programs/coe/2004/section5/indicator32.asp

Noah, C. (2001). Making the grade in distance education. (ERIC Document Reproduction Service No. EJ 639486)

O'Brien, B. S., & Reneer, A. L. (2002). Online student retention: Can it be done? Paper presented at the ED-MEDIA 2002 World Conference on Educational Media, Hypermedia & Telecommunications Proceedings, Ohio: Wright State University.

Osika, E. R., & Sharp, D. P. (2003). Minimum technical competencies for distance learning students. Journal of Research on Technology in Education, 34, 318-325.

Pennsylvania State University. (n.d.). The IPIP-NEO (International Personality Item Pool Representation of the NEO PI-R™). Retrieved November 26, 2007 from the Pennsylvania State University site: http://www.personal.psu.edu/~j5j/IPIP/

Restauri, S. L. (2004). Creating an effective online distance education program using targeted support factors. TechTrends, 48(6), 32-48.

Saba, F. (2005). Critical Issues in distance education: A report from the United States. Distance Education, 26, 255-272.

Scalese, E. R. (2001). What can a college distance education program do to increase student persistence and decrease attrition? Journal of Instructional Delivery Systems, 15(3), 16-20.

Smith, P. J. (2005). Learning preferences and readiness for online learning. Educational Psychology, 25, 3-12.

Smith, P. J., Murphy, K. L., & Mahoney, S. E. (2003). Towards identifying factors underlying readiness for online learning: An exploratory study. Distance Education, 24, 57-68.

Srivastava, S. (2006). Measuring the big five personality factors. Retrieved July 12, 2007 from http://www.uoregon.edu/~sanjay/bigfive.html

Trochim, W. M. (2006). Measurement validity types. Retrieved November 18, 2007, from the Web Center for Social Research Methods Web site: http://www.socialresearchmethods.net/kb/measval.php

Valasek, T. (2001). Student persistence in web-based courses: Identifying a profile for success. (ERIC Document Reproduction Service No. 466276)

Wojciechowski, A., & Palmer, L. B. (2005). Individual student characteristics: Can any be predictors of success in online classes? Online Journal of Distance Learning Administration, 8(2). Retrieved August 23, 2006 from http://www.westga.edu/~distance/ojdla/summer82/wojciechowski82.htm


Appendix A

Is Online Learning Right For Me?

Please circle the answer that best represents your response to each question. Answer each question as truthfully as possible.

1. My need to take this course this semester was:
            A. High – I needed it immediately for a degree, job or other important reason.
            B. Moderate – I could have taken it later or substituted another course.
            C. Low – It’s a personal interest that could have been postponed.

2. Feeling that I am a part of a class is:
            A. Not particularly important to me.
            B. Somewhat important to me.
            C. Very important to me.

3. I would classify myself as someone who:
            A. Often gets things done ahead of time.
            B. Needs reminding to get things done on time.
            C. Puts things off until the last minute.

4. Classroom discussion is:
            A. Rarely helpful to me.
            B. Sometimes helpful to me.
            C. Almost always helpful to me.

5. When an instructor hands out directions for an assignment I prefer:
            A. Figuring out the instructions for myself.
            B. Trying to follow the directions on my own, then asking for help as needed.
            C. Having the instructions explained to me.

6. I need faculty comments on my assignments:
            A. Within a few weeks, so I can review what I did.
            B. Within a few days, or I forget what I did.
            C. Right away, or I get frustrated.

7. Considering my professional and personal schedule, the amount of time I have to work on an online version of this course would be:
            A. More than 8 hours a week.
            B. 4 to 8 hours a week.
            C. Less than 4 hours a week.

8. When I am asked to use computers, or the Internet, new software, E-mail, or other technologies new to me:
            A. I look forward to exploring and learning new skills.
            B. I feel apprehensive, but try it anyway.
            C. I put it off or try to avoid it.

9. As a reader, I would classify myself as:
            A. Good – I usually understand the material without help.
            B. Average – I sometimes need help to understand the material.
            C. Slower than average.

10. My experience/level of comfort with the Internet is:
            A. Strong (I use the Internet a lot).
            B. Average (I use the Internet occasionally).
            C. Weak (I have never used the Internet before or have used it very little).

11. My experience/level of comfort with computers in general:
            A. Strong (I use computers every day).
            B. Average (I use computers when I need to).
            C. Weak (I have never used a computer before or very little).

12. For students considering taking MAT 111, PHY 101, PHY 102, DSN 221, DSN 222, or MAT 121. My level of comfort with mathematics is:
            A. Strong (I have always done well in math classes).
            B. Average (I don’t have too many problems in math classes).
            C. Weak (I find math very difficult).
            D. I’m not going to take any of those courses.

13. If I take an online class I would access the Internet through a computer:
            A. In my home.
            B. At school or at work.
            C. At another location.

14. When I need help understanding a topic or directions:
            A. I am comfortable approaching the instructor to ask for clarification.
            B. I am uncomfortable approaching the instructor, but do it anyway.
            C. I don’t approach an instructor to admit I don’t understand.

15. When faced with situations that fail to go as I planned or expected, I:
            A. Independently seek solutions to problems as they arise.
            B. Wait for instructions or assistance before tackling a problem.
            C. Am easily frustrated and tend to give up when confronting difficulties.

16. I express myself best with:
            A. Word-processed communication, including E-mail.
            B. Handwritten communication and U.S. mail.
            C. Oral, verbal communication.


Appendix B

What Technical Skills do I Need?

1. I have regular access to:
           A. A computer and the Internet at home.
           B. A computer but not the Internet at home.
           C. A computer and the Internet only at school.

2. The access speed to the Internet Service Provider (ISP) which I use is:
           A. very fast and is through a TV cable or some other high speed line.
           B. through a fast modem (56K or higher).
           C. through a slow modem (below 56K).

3. How often do you send, receive and open email attachments?
           A. I use email several times each day.
           B. I use it infrequently (once a week or less).
           C. I have never used it.

4. How often do you use bookmarks (also called Favorites)?
           A. I use them to manage the sites I visit frequently on the Internet.
           B. I use them but infrequently.
           C. I never use them.

5. How often do you use search engines to locate information on the Internet?
           A. I use them frequently and successfully.
           B. I have used them before but not often.
           C. I have never conducted an Internet search.

6. How often do you create and attach files in the email messages that you send?
           A. I create, save, and attach files to email frequently.
           B. I have emailed attachments but not very often.
           C. I have never attached a file to an email message.

7. When requested to use or save documents in a different file type, such as “RTF” (Rich Text Format) or an HTML file,
           A. I would have no difficulty.
           B. I have done it but a reminder of the process would help.
           C. I am not sure that I would know how to do that.

8. If a plug-in or other software were required for a computer,
           A. I would be able to download and install it.
           B. I have done it before, but some instructions would help.
           C. I have no idea what you are talking about or how to do such a thing.

9. If the computer system I was using had problems,
           A. I would be able to decide how to handle the problem.
           B. I think I would call a help line and be able to describe the problem.
           C. I would have no idea what to do.

10. Do you know how to use bulletin (discussion) boards?
           A. I use them with little or no difficulty.
           B. I have used them but a refresher on their use would help.
           C. I have not used them.

11. Do you know how to use chat rooms?
            A. I use them with little or no difficulty.
            B. I have used them but a refresher on their use would help.
            C. I have not used them.

12. My keyboarding skills and my ability to use word processing software are:
           A. Very good.
           B. Okay, but it takes me a while.
           C. Non-existent.

13. I would access the Internet through a computer:
           A. In my home.
           B. At school or at work.
           C. At another location.

14. When asked to print a web page:
           A. I would have no difficulty.
           B. I have done it but a reminder of the process would help.
           C. I am not sure that I would know how to do that.

15. How would you describe your ability to work with multiple windows, i.e., resizing, minimizing, closing, etc.?
           A. I can successfully manage several windows on my desktop.
           B. More than one open application or more than one window confuses me.
           C. I am not sure what the question means.



Online Journal of Distance Learning Administration, Volume XI, Number III, Fall 2008
University of West Georgia, Distance Education Center