Information Found and Not Found: What University Websites Tell Students



Katrina A. Meyer
University of Memphis
kmeyer@memphis.edu


Stephanie Jones
Texas Tech University
stephanie.j.jones@ttu.edu


Abstract

This study investigates how graduate students experience their university websites, or the institutional “virtual face.” The sample included graduate students admitted to online and blended higher education programs at Texas Tech University and the University of Memphis. A total of 42 students provided open-ended answers to questions about information they needed, could not find, or found only with much effort. Their responses paint a picture of adult students who often struggle to find basic information or services (e.g., email login, registration) on institutional websites, even though these functions are important to graduate students. Students were also asked what messages the websites produced and should produce, and who the intended audience was. The students perceive the audience to be students, but still find the messages mostly to be about marketing the institution rather than addressing their functional needs.

Introduction

Colleges and universities focus time and staff resources on developing and improving their websites in an attempt to convey important information about the institution to students. The institutional website is its “virtual face,” the face it has chosen to present to the online world, including potential and current students. Institutions, and especially distance learning administrators, depend upon their websites to recruit students and to ensure the best match of student needs and institutional resources, but do institutions ask students what they want from a website? And what might an institution learn if it did so? This process reverses the usual flow of information, from institution to student, into a flow from student to institution. This research asks the question, “What do graduate students learn from their institutions’ websites?” Put another way, it seeks to discover what students think their institution’s virtual face is saying to and about them. Distance learning administrators must be sensitive to the message the “virtual face” sends to potential students who research degree programs online.

Review of Literature

To better understand the need and justification for this research, it is essential to review communication theory as it pertains to websites, to survey the current state of research on higher education websites, and to establish the importance of these websites to students.

Communication Theory and Websites

Communication theory deals with the making and exchange of meaning. At its most mechanistic, communication is the transmitting of information from one person to another. In psychology, communication is the sending of a message to a receiver, with attention to the feelings and thoughts of the receiver as he or she contemplates the meaning of the message. Communication is therefore not a simple or perfect process, but one open to interpretation and misunderstanding on the part of both sender and receiver. It is perhaps not surprising that websites – as creations of humans attempting to communicate – are a fertile field for applying communication theory as well as for studying how and whether communication occurs.

An entire body of theory and literature has grown up around the design and evaluation of websites.  Olsina, Covella, and Rossi (2006) discuss at length the importance of and criteria for web quality. Quality varies, in their view, depending on the type of site, users’ viewpoints, and the context for use. They review ISO (from the International Organization for Standardization) standards and propose a “quality in use” measure that “can be used to validate the extent to which the software or Web application meets specific user needs” (Olsina et al., 2006, p. 116). Two of the four qualities in use characteristics are particularly pertinent for the proposed research. “Effectiveness” is the capability “to enable users to achieve specified goals with accuracy and completeness” and “Productivity” enables “users to expend appropriate amounts of resources in relation to the effectiveness achieved” (Olsina et al., 2006, p. 117). 

Chiou, Lin, and Perng (2010) conducted a review of the website evaluation literature from 1995 to 2006 and compiled an extensive list of factors that have been evaluated, from “ease of navigation” (49 studies) to “content relevance and usefulness” (44 studies). Based on their literature review, they propose a five-stage evaluation process focused on developing metrics for each site. Marcus and Gould (2000) stress the importance of attending to cultural differences when designing websites, which is especially important for businesses operating internationally. Universities, too, need to pay attention to different cultures’ interpretations of power, order, authority, and social role when designing a website intended to attract international students.

Research on Higher Education Websites

The research on institutional websites comprises only a few studies. Green (2002, 2003, 2004, 2005, 2006) has tracked the implementation of a number of online services (e.g., online course registration, course management systems) in his annual Campus Computing Project. Although extremely valuable for tracking the addition of new online services each year, identifying emerging “hot” issues, or benchmarking an institution’s progress against its Carnegie peers, the Project has not evaluated how well websites perform or how effectively they satisfy the informational needs of college students.

An exception to the lack of research is a study described by St. Sauver (2003, 2004a, 2004b). The study evaluated 172 university websites for the University of Oregon. All of the universities were members of the American Association of Universities (AAU) and/or Tier 1 or 2 doctoral universities as established by the U.S. News and World Report in 2003. While much of the study looked at technical issues (e.g., popular software products, use of cookies), it also focused on ways university websites were used such as noting the popularity of audience segmentation (which groups services of likely interest to “future students,” “current students,” and “faculty and staff,” among others) and different search functions (use of A-to-Z indices or search windows).

Gordon and Berhow (2009) conducted a content analysis of 232 university websites, specifically looking for “dialogic features” that allow the visitor to request more information, an RSS feed, an appointment, or to send an email to a particular office (such as admissions or financial aid). The number of dialogic features ranged from a low of 6.8 to a high of 53.2 features. They found that liberal arts colleges used more of these dialogic features on their websites than national doctoral universities. They also found a small correlation (r = 0.146, p < 0.05) between the number of dialogic features and student retention rate.

As for studies investigating the availability of specific types of information on higher education websites, Eduventures (2007) surveyed more than 500 adult students. In this study, only 63% of those surveyed found their search for information useful.  Eduventures (2007) concluded that higher education websites “come up short with respect to content” (¶ 2). The Eduventures report urged higher education to improve the quality and depth of content as well as search functions on home pages. This is one of the few studies that asked students about their experiences with institutional websites.

Further study of institutional websites required a methodology that did not rely on user input, but on objective measures. Meyer (2008a) developed such a methodology and used it to ask how higher education institutions were using their home pages and how well these home pages performed. The analysis was based on Gurak’s (2001) criteria for evaluating websites and it found that 34% of the links on the home pages dealt with student needs (e.g., admissions, registration, and course listings) and another 43% were classified as “functionality” (e.g., providing services to faculty and staff and providing functions aligned with operating the institution).  One of the hardest pieces of information to find was tuition and fees, which could be found within three to four clicks of the mouse for doctoral/research institutions, but one to two clicks for community colleges, and not found at all in 15% of the sample (or six of 40 institutions).  And while many home pages were well designed, others were messy and/or required users to hunt for important services and were deemed difficult for the inexperienced user.  The findings indicated that higher education’s “virtual face” may indeed be functional for insiders, but it was confusing to users who are new to higher education or the web.

Using the same methodology, Meyer (2008b) identified 20 information items that legislators and/or parents were interested in knowing about higher education institutions.  A total of 58.5% of the data sought could not be found, and 40% of the data elements found were more than three “clicks” of the mouse away from the institution’s home page. In other words, even when some of the information could be found, it took skill and persistence to find it.

This same methodology was then applied by Wilson and Meyer (2009) to find information of interest to prospective African-American and Hispanic undergraduate students. Findings indicated that about half of the sample of 40 institutions did not provide information on offices for minority students, and many other targeted services were missing from institutional websites. This can be interpreted in one of two ways: either these services were offered but could not be found, or they were not offered. Either explanation is interesting, and could also be important to a planner wishing to know how well competitors were appealing to minority students and what services successful institutions were offering. This latter question was explored in Meyer and Wilson (2010), which proposed a way for planners at other institutions to use institutional websites to explore the “competitive advantage” of degree programs of any kind, be they online or traditional.

Students’ “E-expectations” for “E-recruitment”

There is little doubt that students use institutional websites to do their research on potential colleges. Noel-Levitz (2007a) asked high school students about their preferences for investigating potential colleges and universities.  College-bound high school students were interested in information on academic programs, admissions, and financial aid (Noel-Levitz, 2007a). When the study was repeated (Noel-Levitz, 2010), one in four students reported “removing a school from their prospective list because of a bad experience on that school’s Web site” (p. 1) and 92% said they would be disappointed with a school or remove it from further consideration if they “didn’t find the information they needed on the school’s Web site” (p. 1).  Clearly, the university website is an essential part of recruiting students, and is perhaps becoming more critical as students rely more on web-based information to make decisions on where to attend college.

When Noel-Levitz (2007b) asked 1000 prospective graduate students about their e-expectations, students stated that they wanted “a connection” with the institution and faculty (p. 1). Most of these students want information about programs of study and financial aid in an electronic form (i.e., on the web or email).  In other words, “graduate programs need to put as much information as possible within a few mouse clicks of their target audience” (Noel-Levitz, 2007b, p. 1). Prospective graduate students, in ranking the importance of different types of information from 1 (not important) to 5 (extremely important), ranked “graduate program detail” as 4.77, scholarship/assistantship information as 4.62, tuition/cost/fees as 4.38, and details on the faculty as 4.03 (p. 3). These studies tend to support the assertion that an institution’s “virtual face” is essential to recruiting graduate students.

What is clear from this review of studies on institutional websites is the lack of focus on what current graduate students need to find and on what current websites convey to them about the institution. This study asks students to assess their institutional websites in order to answer the general question, “What do graduate students learn from their institutions’ websites?”

Methodology

Design

Because this research is the first of its kind to ask students what they learn from institutional websites, a qualitative approach was deemed best suited. Patton (2002, p. 14) noted that “qualitative methods typically produce a wealth of detailed information about a much smaller number of people and cases.” Researchers utilize a qualitative approach with the intent of allowing themes to emerge from the data collected (Creswell, 2003). “Qualitative inquiry is particularly oriented toward exploration, discovery, and inductive logic” (Patton, 2002, p. 55). Because this study asked students to comment on their uses of, and the messages they received from, institutional websites, an exploratory and inductive approach to analysis seemed most appropriate.

Settings

This research draws upon students admitted to and enrolled in two graduate-level programs in higher education. Because it was important to ensure that students had extensive experience with institutional websites, it was decided to focus on students enrolled in online and blended programs, and specifically on the needs of adult, graduate students. Two such programs were found. First, Texas Tech University offers four graduate-level programs: a blended Ed.D. and Ph.D. in higher education, an online Ed.D. program in higher education with a community college administration emphasis, and a blended Master of Education in Higher Education and Student Affairs. Second, the University of Memphis offers three graduate-level programs online, a Master of Science in Leadership and two Ed.D. programs, in Adult Education and in Higher Education, plus a blended master’s program in Student Personnel. While the institutions are different, they are also similar. Both are located in the southern region of the U.S., with Texas Tech University in the southwest and the University of Memphis in the southeast. Both are large, publicly supported research institutions, offering degrees at the undergraduate and graduate levels. Both are developing online programs to serve a larger regional student population.

Sample and Population

The population of graduate students in the online and blended doctoral and master’s programs at Texas Tech totaled 90 students; the final sample included responses from 22 students, for a 24.4% response rate. The population of graduate students in the online and blended programs at the University of Memphis was 85 students; the final sample included responses from 20 students, for a 23.5% response rate. Table 1 presents a profile of the sample in comparison to the population of both programs based on three data elements.
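The response rates above are simple ratio calculations; the sketch below reproduces them (the respondent and population counts come from the study, while the function itself is ours for illustration):

```python
def response_rate(respondents: int, population: int) -> float:
    """Return the response rate as a percentage, rounded to one decimal place."""
    return round(100 * respondents / population, 1)

# Counts reported in the study.
ttu_rate = response_rate(22, 90)      # Texas Tech University -> 24.4
memphis_rate = response_rate(20, 85)  # University of Memphis -> 23.5
```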

Table 1

Sample and Population of Programs (Data as of Spring 2011)

                      Texas Tech University       University of Memphis
Data Elements         Sample      Population      Sample      Population
Gender
   Female             77.3%       56.2%           42.0%       54.0%
   Male               22.7%       43.8%           58.0%       46.0%
I work
   Full time          90.9%       98.1%          100.0%       97.7%
   Part time           9.1%        1.9%            0.0%        2.3%
Student Type
   Master’s           22.7%       45.1%           10.0%        6.0%
   Doctoral           77.3%       54.9%           90.0%       94.0%
The students ranged in age from 21 to 60: 21-30 (19.5%), 31-40 (31.7%), 41-50 (24.4%), and 51-60 (24.4%), placing this sample clearly within the adult student category, with nearly half of the sample over 40. To further describe the sample as representative of working professionals, students were asked their reasons for taking courses: 40% were pursuing the degree for professional development, 25% were trying to advance in their careers, and 30% wanted to prepare for a new career. Based on the profile in Table 1, the sample was deemed relatively representative of the population of students admitted to the separate graduate programs, and these students are primarily adult, working professionals.

The Institutional Review Boards of both institutions granted approval to conduct this research.

Instrument Development

The instrument used to collect data for this study was developed by the authors. As a first step, the instrument reviewed students’ rights in research and, if they were agreeable, allowed them to proceed to the survey. If they did not indicate their understanding of, and agreement to, their rights as human subjects, the survey would not let them proceed further. Students were then asked to open the home page of their respective institution so that they could focus on that site as they considered their answers to the survey questions. Largely open-ended questions were developed, based on the research reviewed earlier, to focus students’ attention on what they could or could not find on their institution’s website and on what the site seemed to communicate to them. The six survey questions were:

1.         What information that you need should be on this home page (or one click away from this page)?
2.         What information that you needed was never found?
3.         What information that you needed was found after much effort or only through the help of another?
4.         When you look at your institution’s home page, what are its two or three main messages?
5.         Who (what audiences) are these messages for?
6.         What do you think are the two or three main messages that should be made (or displayed) on the university’s home page?

The first author created the instrument on SurveyMonkey and the second author reviewed it. A student was asked to pilot the instrument on SurveyMonkey to ensure that it worked appropriately; errors identified in the pilot were corrected.

Data Collection

Both institutions maintain email lists for all students admitted into specific degree programs, and these lists were available to the author located at the respective institution. Data were collected during the spring 2011 semester, with emails sent by the authors on January 18 inviting students to participate in the study and providing them with a link to the SurveyMonkey site. After two weeks, a follow-up email was sent to thank individuals who had completed the survey and to invite the remaining students to participate; this email also set a deadline of one week thereafter for completing the survey. A final email announced the closing of the SurveyMonkey site and thanked all of the students for their participation.

Data Analysis

For ease of analysis, the first three survey questions can be viewed as asking students to evaluate the information found (or not found) on their institutional websites. The answers were compiled and coded, and themes were developed. With themes in hand, tentative answers were crafted, and relevant literature or theory was applied to understand the answers in the context of previous literature.

The last three survey questions can be described as an attempt to capture those messages the website seems to hold for students, who those messages are for, and what the messages should be. The answers were also compiled, coded, and themes developed for each question and then across questions. With themes developed, tentative answers were crafted and relevant research or theoretical literature was consulted to understand the meaning of the students’ answers. 

To add reliability to the analysis, one author took the lead in developing codes and initial themes, and the second author checked approximately 5% of all codes and themes. The development of tentative answers to the questions was done collaboratively, with each author suggesting answers to be challenged or questioned by the other author. These conversations were held online or over email, with both authors discussing, suggesting alternative explanations, and developing meaning based on our understanding of research and theory in this field. This continuous process of questioning, discussing, and developing resolutions helped add reliability to the results and gave greater confidence in the conclusions.
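The compile-and-count step described above, in which coded responses are tallied into frequency-of-mention groups, can be sketched in a few lines. The coded responses below are hypothetical stand-ins for the study's actual codes; `collections.Counter` simply tallies how often each code appears, mirroring the number-of-mentions groupings reported in the Findings:

```python
from collections import Counter

# Hypothetical coded responses; in the study, one author assigned codes
# to each open-ended answer and the second author checked a sample.
coded_responses = [
    "admission information", "calendar", "libraries",
    "admission information", "financial aid", "calendar",
]

# Tally mentions per code, most frequent first.
mentions = Counter(coded_responses)
for code, count in mentions.most_common():
    print(f"{code}: {count}")
```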

Findings

The results are presented for each question in tabular form and are followed by general comments. This section will close with an analysis of the students’ responses across all of the questions and answers.

Information Needed, Not Found, or Found with Effort

Table 2 presents the summarized answers to the question, “What information that you need should be on this home page (or one click away from this page)?” Responses have been grouped by number of mentions to ease analysis.

Table 2

Information Needed

Needed Information                                  Number of Mentions
Admission information                                        6
Course management system                                     6
Libraries                                                    6
Calendar                                                     6
Search window                                                5
Financial aid                                                5
List of degree programs                                      4
Program requirements                                         4
Email                                                        4
List of departments                                          4
List of faculty, staff and contact information               3
Student services                                             3
Jobs                                                         3
Portal                                                       3
Program costs                                                2
Bursar’s office                                              2
Forms                                                        2
Graduate School                                              2
Registration                                                 2
Recreation and events                                        2

Items that received a single mention were:  transcripts, facts, information about current students, paying tuition, list of colleges, course schedule, syllabi, helpdesk, news, campus map, and weather closure information.

Despite the differences in the frequency of mentions, all of the information needed appears to be of one type: information that helps the student function as a student. From finding programs and getting admitted to registering for classes and using the library, all of these help the student navigate the forms, processes, and policies of being a graduate student. If there is an interesting message in these responses, it is for the developers of websites: these are the services students need to have within one click of the university’s home page, not buried beneath multiple pages of information. A good search mechanism might obviate this requirement, but students first need to know which words will yield a fruitful search.

The answers to the question, “What information that you needed was never found?” are easier to present. Twenty-five students indicated that they had always found the information they needed.  However, three students mentioned that the search results (from using the search window on the home page) were useless, which puts the importance of a good search mechanism mentioned previously in a different light. All other answers were a single mention for each of the following:  how to contact the registrar by phone, online learning courses, email login and passwords, paying tuition, the business office, faculty contact information, course syllabi, course sequencing, and how to access online courses.

For the question, “What information that you needed was found after much effort or only through the help of another?” the answers have been compiled in Table 3, grouped by frequency of mention.

Table 3

Information Found After Effort or Help

Information                                         Number of Mentions
Faculty, staff directory                                     3
Graduation requirements & forms                              3
Job openings                                                 3
Scholarships                                                 3
Calendar                                                     2
Registrar’s office                                           2
Log in to course management system                           2

Single mentions of items that were eventually found included: list of degree programs, bursar’s office, library, institutional research, graduate catalog, personal student records, course descriptions, course schedule, ID cards, application, final exam schedule, and the institution’s portal. This list of items that were hard to find but eventually found is perhaps a tribute to students helping students, or evidence of how difficult these home pages are to navigate. Students do learn to navigate to the services they need, but perhaps it need not be as difficult as it currently is.

One is struck by the basic nature (registration, email) of many of the services mentioned in response to these first three questions; how did students survive without this information? But they did stay enrolled, so they found the information they needed in some way, perhaps by asking for help, searching the site, or calling a departmental secretary. Obviously, though, it was not easy for some of them, and they seem to clearly remember their travails through the web-based maze of university pages.

Messages from the Virtual Face

The next three questions focus on the university home page as a means of communication, its messages and audiences.  Table 4 presents the students’ responses to the question, “When you look at your institution’s home page, what are its two or three main messages?”  Again, the responses are presented in groups and by the number of mentions.

Table 4

Main Messages

Message                                                             Number of Mentions
“Here’s important information”                                              19
“Here’s what’s going on or events to do” or “a lot happens here”            16
Marketing to new students or “this is the place to attend”                  13
“Who we are” (branding, school colors, mascot, logo, mission)               11
“Great things faculty and students are doing”                                8
“Research is important”                                                      5
“Giving or donating is important”                                            5
“Using social media (e.g., Facebook) to connect to students”                 5
“Diversity is valued”                                                        4
“Student services are available”                                             4
“The campus is picturesque”                                                  2

Two additional messages were mentioned only once, but are worth reporting. For one student, the site was “mostly marketing, not student-centered,” which captures the intended purpose of many of the messages above. Another student wrote that the site was “administratively organized rather than functionally organized.” It is intriguing that students are able to read the home page much as communication theory proposes: it conveys messages that they can read like a book and respond to psychologically, including an interpretation of the message. To them, these messages seem to indicate that the site does not serve them as well as they think it could (or perhaps that it is not meant to). This insight is contradicted in part by the next set of responses.

Table 5 presents the responses to the question, “Who (what audiences) are these messages for?” 

Table 5

Audiences for Home Page

Audience                                            Number of Mentions
Students                                                    31
Visitors (community, public, legislators)                   11
Parents                                                      9
Faculty and staff                                            8
Donors                                                       5
Alumni                                                       5
University administrators                                    3

Despite the overwhelming consensus that the home page is for or about students, one student wrote, “Not students.” Why might there be such a disconnect between the messages (in Table 4) and the perceived audience? Are these graduate students saying that the home page works better as a recruitment tool for prospective students than as an everyday tool for current students? Perhaps they recognize that the audience is mainly students, but not students like themselves. These questions will require additional research to unravel.

The answers to the final question, “What do you think are the two or three main messages that should be made (or displayed) on the university’s home page?” can be found in Table 6 and can be usefully contrasted with Table 4. 

Table 6

The Messages That Should Be

Message                                                                      Number of Mentions
“We are functional for students” (log in to email, apply, find
   classes, meet deadlines)                                                         16
“Who we are” (the mission, priorities, history, and accomplishments)                11
“What’s going on” (events)                                                          10
“Why you should go to college here” (recruitment)                                    8
“What is newsworthy”                                                                 7
“We make it easy to find things” (EZ contacts, search window)                        6
“We are an education-focused institution” and “Student success is
   important to us”                                                                  6
“Welcome to the university [name]”                                                   4
“We value racial diversity”                                                          4
“Students are valued” (or “we are student-friendly”)                                 4
“Faculty research and teaching is interesting”                                       3
“Support the university” (with a donation or gift)                                   2

Single mentions included: “We are accountable” (with a link to required reports to boards or accreditors), “We have solutions to problems” (as in computer bugs), “Our alumni are successful,” “Here are examples of student research,” and “The city of Memphis has a rich culture.”

We can see that the students still believe that institutions ought to communicate their newsworthy accomplishments and advertise events, to make clear to all what the institution is and stands for, and to recruit new students. These are students of higher education, so they understand the importance to institutions of addressing these needs. But they clearly feel that the number one purpose of the home page is to be easy to use and functional for them. Or, as one person wrote, the current page looks as if a “big bucket of a bunch of people dumped their information” onto it. In contrast, the website should “be a tool (like the library)” rather than a repository of disparate bits of information.

Discussion

Of course, a home page must serve many purposes, and it is difficult to make everyone happy. However, the results from this study imply that these graduate students are not happy with the current versions of their university home pages: important information is difficult to find, and they believe the home page may be sending the wrong message. They recognize the importance of the home page as a way for the university to communicate with its various internal and external publics, to recruit students, and to “toot its horn,” as one student put it. But why is it so difficult for them to find what they need in order to function as students? They need to find and enroll in courses, log into email or the course management system, call the bursar’s office, or find what tuition will cost. The information invariably is there, and they find it after some effort or help, but why must it be so hard to go from screen to screen, guessing that they have found the right link and having to try several times to get what they want? Why are these connections not clearer for students?

These graduate students are largely adult students and may not be as tech-savvy as many undergraduates, but it is unfair to blame them for this situation. As reasonably intelligent individuals, they have signed up for online or blended degree programs and are committed to being good learners and earning a degree. Addressing their difficulties in finding the services they need therefore cannot be the sole responsibility of the student; it is also the responsibility of the program, the program faculty, distance learning administrators, and the university. The faculty at the University of Memphis developed a “one-stop” resource center for online doctoral students (at http://www.memphis.edu/lead/hiad/online/edd_resources.php) with links to many of the functions students need throughout their careers, as well as aids to writing, using the library, and various policies. It needs constant updating and is not easy for novice students to find, but perhaps its usefulness will grow over time. Perhaps an introduction to the institution’s web-based services needs to be incorporated into every orientation session for new graduate students, including guides in text, video, or audio. Perhaps distance learning administrators need to create tools specifically for the adult student who struggles to find important information at the beginning of his or her studies.

What can the university do to alleviate these problems? It can incorporate the needs of current students into future redesigns of the university web pages and recognize students’ legitimate need for easy access to the programs and information they use every day. For the novice, perhaps a link to guidance on how to access this information is needed, rather than depending on students’ willingness to drill through several pages of material to find what they want. Many graduate students in professional programs like these have full-time, demanding jobs and find wasted time an unnecessary stressor. Many of these students work as administrators at colleges and universities, and even their prior knowledge of a college’s inner workings was not always sufficient to get them the information they wanted. What does it say when even experienced individuals in higher education have trouble navigating these sites?

These results indicate that all universities should make an effort to ask their students, adult and traditional-aged, undergraduate and graduate, how the website is working for them. And not only ask them once, but ask them regularly, since students’ needs may change as their composition and prior experiences change over time. After asking, universities can listen when students tell them what the messages mean to them, and revise accordingly. But they should not revise a website with only one function in view, such as recruiting, since the site has many constituents and is the virtual face of the institution to its continuing students as well as its new ones. While recruiting is no doubt an important purpose of the site, it is not the only one, and designing for a single purpose can communicate a lack of interest in, or concern for, the needs of other users.

So what can we conclude from this effort to ask our students about their experiences on our university websites? We conclude that, as program faculty, we can do better at recognizing and addressing our students’ problems with getting the information they need from these websites. Both faculty and distance education administrators can advocate that universities ask students to assess the website and then listen to what they say. We can also urge all users, whether distance learning administrators or faculty, to advocate for changes in website design. Studies such as this one confirm that our institutional websites are the “virtual face” of a college or university, and that this face communicates messages students understand. Perhaps our “virtual faces” need some work.


References

Chiou, W., Lin, C., & Perng, C. (2010). Information & Management, 47(5-6), 282-290.

Creswell, J.W. (2003). Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks, CA: Sage Publications.

Eduventures. (2007).  Optimizing school web sites as a marketing and recruitment tool (part II).  Boston, MA:  Author.

Gordon, J., & Berhow, S. (2008). University websites and dialogic features for building relationships with potential students. Public Relations Review, 35(2), 150-152.

Green, K. C. (2002). Campus portals make progress; technology budgets suffer significant cuts. Retrieved from http://www.campuscomputing.net/summaries/2002/index.html

Green, K. C. (2003). Campus Computing 2003. Encino, CA: Campus Computing Project.

Green, K. C. (2004). Tech budgets get some relief; cautious support for open source applications. Retrieved from http://www.campuscomputing.net/sites/www.campuscomputing.net/files/2004-CCP.pdf

Green, K. C. (2005). Growing concern about campus IT security; slow progress on IT disaster recovery planning. Retrieved from http://www.campuscomputing.net/sites/www.campuscomputing.net/files/2005-CCP.pdf

Green, K. C. (2006). Wireless networks reach half of college classrooms; IT security incidents decline this past year. Retrieved from http://www.campuscomputing.net/sites/www.campuscomputing.net/files/2006-CCP.pdf

Gurak, L.P. (2001).  Cyberliteracy.  New Haven, CT: Yale University Press.

Marcus, A., & Gould, E. W.  (2000, July/August).  Crosscurrents: Cultural dimensions and global web user-interface design. Interactions, 7(4).

Meyer, K.A. (2008a). The “virtual face” of institutions: What do home pages reveal about higher education? Innovative Higher Education, 33(3), 141-157.

Meyer, K.A. (2008b). The “virtual face” of institutions: Why legislators and other outsiders view higher education as aloof. The Internet and Higher Education, 11(3).

Meyer, K.A., & Wilson, J.L. (2010). The “virtual face” of planning: How to use higher education websites to assess competitive advantage. Planning for Higher Education, 38(2), 11-21.

Noel-Levitz. (2007a). Embracing diversity: Looking at freshman attitudes by race/ethnicity. Retrieved from https://www.noellevitz.com/NR/rdonlyres/FC75324B-6601-414B-9165-B5F28E790C01/0/2008FreshmenAttitudesReport.pdf

Noel-Levitz. (2007b). Advanced degrees of e-recruitment. Retrieved from https://www.noellevitz.com/documents/shared/Papers_and_Research/2007/E-Expectations%20Graduate%20Edition_0107.pdf

Noel-Levitz.  (2010).  Focusing your e-recruitment efforts to meet the expectations of college- bound students. Retrieved from https://www.noellevitz.com/NR/rdonlyres/52057241-5FD7-450E-8399-C772C1F9A3F9/0/EExpectations_FocusingYourERecruitmentEfforts_0710.pdf

Olsina, L., Covella, G., & Rossi, G. (2006).  Web quality. Retrieved from  http://www.springerlink.com/content/n2h18q8700026212/

Patton, M.Q. (2002). Qualitative research & evaluation methods (3rd ed.). Thousand Oaks, CA: Sage Publications.

St. Sauver, J. (2003, Fall). Selected elements of the 2003 university home page study. Eugene, OR: University of Oregon.

St. Sauver, J. (2004, Winter). More selected elements of the 2003 university home page study. Eugene, OR: University of Oregon.

St. Sauver, J. (2004, Spring). More selected elements of the 2003 university home page study. Eugene, OR: University of Oregon.

Wilson, J. L., & Meyer, K.A.  (2009).  Higher education websites: The “virtual face” of diversity.  Journal of Diversity in Higher Education, 2(2), 91-102.


Online Journal of Distance Learning Administration, Volume XIV, Number III, Fall 2011
University of West Georgia, Distance Education Center