Area 4 (SACS Report 2009-2010)
SACS Fifth-Year Interim Report
Area 4: The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in the following area: (Comprehensive Standard 3.3.1) (3.3.1.1 educational programs, to include student learning outcomes)
Judgment of Compliance
In Partial Compliance
Assessment is well-integrated into many of the daily operations of The University of West Georgia. The University’s strategic planning model includes individualized objectives, strategies, assessment methodologies, and decisions based on data used to support the University’s goals. Additionally, expected learning outcomes are identified for all degree programs, and data are collected to support needed program changes. However, assessment of student learning outcomes is not fully systematic or completely implemented in all degree programs. In degree programs that do not currently have well-designed and fully implemented assessment processes, plans and processes are being put in place to rectify this deficiency, as noted below.
The University defines its Academic Units as the College of Business, the College of Education, the College of Arts and Sciences, the School of Nursing, the Honors College, and the Graduate School. All academic units are under the direction of the Office of the Provost and Vice President for Academic Affairs (VPAA).
Each of these academic units annually employs assessment methodologies to evaluate the effectiveness of its degree programs and to demonstrate how these methodologies drive efforts to improve student learning outcomes. Each academic degree program is reviewed annually within its department or college, every five years by the University of West Georgia, and every seven years by the University System of Georgia. This review includes student learning outcomes assessment as well as specific program goals in terms of enrollment, recruiting, and course offerings.
While institutionalizing continuous and systematic feedback is an ongoing process, each unit currently identifies its mission, objectives, and strategies, which support the overall University mission and goals. This process includes documentation of the student learning outcome data collected and of how this information is used to make decisions for classroom improvements. It is aligned with a university-wide process of institutional effectiveness that ensures intentional and transparent budgeting and decision-making in a continual effort to enhance student learning.
Each academic college or school has distinct systems and processes that support the assessment of student learning outcomes. The plans for each academic college and school, along with samples of several degree program assessments, are included below as evidence that assessment of student learning outcomes and the implementation of improvements based on the results is spreading across the UWG campus.
Richards College of Business – Assessment Summary
The Richards College of Business offers undergraduate degrees (BBA, BS, and BA) in ten major study areas and five degrees at the graduate level.
The undergraduate BBA degree programs have six learning goals in common and each degree program major has additional learning goals specific to the major. In addition, the BS in Economics, the BA in International Economic Affairs, the BS in Economics with Secondary Education Certification, and the BS in Education in Business Education (offered in conjunction with the College of Education) all have learning goals specific to these degrees and majors. The graduate degree programs have learning goals specified for each separate degree program.
Assurance of learning (assessment) of the common learning goals for programs in the College is conducted on a yearly systematic schedule at the college level. Assessment results, recommended changes or adjustments, and subsequent results of modifications are recorded at the end of each semester. Within a given semester, upon completion of the assessment assignment(s), results are tabulated and reviewed by faculty who teach in the course(s) or area, and appropriate adjustments to curriculum, pedagogy, and evaluation methods are suggested. The faculty recommendations are then submitted to the RCOB Undergraduate Programs Committee (UPC) or Graduate Programs Committee (GPC) for review, depending on the degree level of the program goal being assessed. Upon approval from the UPC or GPC, the recommended actions of the faculty, along with any others added by the respective review committee, are implemented in the next cycle of assessment, and the results of the changes are evaluated at that time.
Assessment of the major specific learning goals at the undergraduate level is conducted by the appropriate department on a yearly basis. Results are examined and improvements are made based on the evaluation of faculty in the respective department.
Assessment of the degree programs that are offered in conjunction with the College of Education is directed and coordinated by the College of Education through faculty in the RCOB who teach in those areas.
External accrediting agencies (i.e., AACSB for the BBA, MBA, and MPAcc degree programs and NCATE for the degree programs offered in conjunction with the College of Education) focus on the assessment of programs and the continuous improvement methods employed within the college.
Program-Specific Examples
College of Education - Assessment Summary
The College of Education at the University of West Georgia offers undergraduate, master's, specialist, and doctoral programs across six departments (Curriculum and Instruction; Special Education and Speech Language Pathology; Counseling and Educational Psychology; Educational Leadership and Professional Studies; Health, Physical Education and Sport Studies; and Media and Instructional Technology), as well as in collaboration with departments in the College of Arts and Sciences and the Richards College of Business. The College of Education has been committed to programmatic assessment for the past decade and has worked to refine its assessment practices over time, as described below.
Since the last SACS visit, the College of Education has developed a more systematic system for assessing its candidates and the impact they have on P-12 student learning. From summer 2003 through spring 2009, candidates in all programs were assessed at five common points: (1) program admission, (2) in-program, (3) field experience, (4) program completion, and (5) post-graduation. In both initial and advanced preparation programs, admission and exit requirements were designed to ensure candidate competencies prior to entrance into a program and prior to recommendation for a certificate or degree. Further, candidates were expected to demonstrate adequate progress throughout their program in order to continue. Assessments included entrance test scores, GPA, work samples, field experience evaluations (completed by the university supervisor and supervising teacher), culminating experience assessments (for advanced programs), and reports on entrance and licensing exam scores (Praxis I, Praxis II, and GACE).
The assessment plan called for the Assessment Coordinator to collect the previous year's data on initial and advanced candidates and generate program-specific reports drawn from the multiple sources of data described above. These reports were to be delivered to department chairs by October of each academic year. From there, the Assessment Coordinator was responsible for summarizing the data across the unit and providing the summary to the Dean, Department Chairs, and faculty.
Technologies used to maintain this assessment system include database software, such as Access, and other software, such as Banner and CourseDen. Many of the programs archived evidence of candidate achievement using Foliotek software, which is also where the data were entered indicating whether candidates had met or not met program standards.
During AY 2009-2010, areas for improvement were identified in the assessment system. First, the ways in which candidates in programs were being assessed varied widely. Second, the data entered into Foliotek were too cumbersome to manage; Foliotek is not a data analysis program, so aggregating data was problematic. Third, rating candidates as only “met” or “not met” resulted in data that were not useful for program improvement: because virtually all candidates meet standards by the time they graduate, no discriminations emerged that could lead to specific program changes. Fourth, the assessment system was linked directly to one person, and when that person changed roles, summary reports for programs and the unit were not completed. Finally, very little input was sought on the assessment system or unit evaluation from the professional education community or the other colleges at UWG.
In AY 2009-2010, numerous improvements were implemented in the assessment system. First, during a college-wide faculty meeting at the beginning of the fall 2009 semester, the commitment to evidence-based practice was discussed, with a focus on what needed to be done to continue moving forward. Feedback from the faculty meeting, gathered via an electronic questionnaire, was analyzed, summarized, and distributed to faculty for review with a request for additional feedback. Second, the Assessment Advisory Committee, which has representatives from the College of Education, the Richards College of Business, and the College of Arts and Sciences, facilitated a comprehensive assessment system across the unit, examining existing practices and identifying gaps and weaknesses. Third, all programs worked together to create a unified assessment plan, articulated more clearly around six types of assessments that align with newer NCATE and Georgia Professional Standards Commission (PSC) requirements: (1) program admission, (2) key assessments at transition points in the program, (3) field experience, (4) impact on P-12 student learning, (5) program completion (the graduation semester), and (6) post-graduation (see Comprehensive Candidate Assessment System linked below in Supporting Documentation). Fourth, performance assessments are now scored on a four-point scale (unacceptable, acceptable, proficient, exemplary), which gives a common basis for summarizing data and provides programs with more robust results on which to base programmatic improvements. Fifth, multiple points of responsibility for the assessment system have been identified to ensure that its success is not linked to just one person. Sixth, rubrics were developed and piloted, with candidate ratings recorded in Excel software, which is easy to use and produces data that are easily downloaded into various other analysis programs.
Finally, realistic deadlines for summarizing and reporting data results were established. These summarized data are now being used to make program improvements.
The improved assessment system was piloted in the fall semester 2009, with full implementation in spring semester 2010. Program faculty met in January 2010 to assess the effectiveness of the pilot. Specifically, teams examined whether the new assessment system provided effective and useful data, and also whether the system was manageable from a workload standpoint. The primary goal was to make data-driven decisions/changes to improve programs. Program faculty provided positive feedback in the form of a questionnaire about the effectiveness of the improved assessment system.
As shown in the document linked above, candidates must meet certain requirements for admission to programs, retention in programs, and in some cases, certification or licensure. Key assessments are conducted throughout and following a candidate’s program of study.
The document linked above shows an example of a key assessment conducted in the Bachelor’s in Special Education program. This example is one of the many key assessments that are collectively evaluated to rate a candidate’s performance.
The document linked above shows an example from the School Library Media program. The document includes guidelines for candidate performance on the program’s Exit Assessment. The professional standards of the American Library Association and the American Association of School Librarians, along with descriptors of the College of Education Conceptual Framework, are included. The document also includes examples of representative evidence to be submitted by candidates and rubrics for a candidate’s performance assessment.
The document linked above shows actual data collected for the Exit Assessment described above for the School Library Media program. These data were collected as part of the overall evaluation of the candidates and the program.
School of Nursing - Assessment Summary
The faculty of the School of Nursing are committed to fulfilling the purpose of the University of West Georgia by creating a milieu for learning that fosters “educational excellence in a personal environment.” In the early 1990s, the faculty of the SON began to explore more innovative approaches to nursing education. The philosophy of the SON was developed and describes this environment as one in which the concept of “caring” is central to the practice of nursing as well as to educational practice.
The School of Nursing offers a Bachelor of Science in Nursing (BSN) degree with two tracks: Generic and RN-BSN. The generic track is for students who have never been licensed as an RN. This program prepares graduates who are eligible to apply to take the NCLEX-RN, the national licensing examination to become a Registered Nurse. The generic track admits once each year, in the summer. The RN-BSN track provides the opportunity for individuals who are already licensed as Registered Nurses to complete a BSN degree.
The School of Nursing offers a Master of Science in Nursing (MSN) degree with role options in either education or health systems leadership, and a post-master's certificate program in education and health systems leadership. The MSN program is a professional degree program requiring 36-39 semester hours of credit.
The assessment and evaluation of these BSN and MSN goals are conducted on an ongoing basis through various committees and by the School of Nursing as a whole. Methods for assessing student learning and achievement go well beyond student evaluations and include using assessment results to inform teaching practices as well as curriculum development and revision. These methods include the following:
External assessment measures of student achievement
A standardized testing and assessment program is employed as a component of the admission, progression and graduation process in the pre-licensure BSN program. The initial component of the assessment program measures students’ essential academic abilities, and the score is considered in determining eligibility for admission to nursing. Additional testing within nursing courses is used to help students identify their learning styles and assess their critical thinking skills, knowledge in the clinical nursing specialty areas, and readiness for the licensure exam (NCLEX-RN). After review of aggregate student performance on these assessments, faculty identified the need to assist students in curricular content areas as well as with test-taking strategies. Increased emphasis on health promotion and maintenance is now occurring in the Health Assessment course. The use of simulation has been increased to assist students in critical thinking in maternal and child health scenarios. Faculty are also introducing test-taking strategies in the first clinical course to assist students in mastering the ability to demonstrate their knowledge while answering objective, multiple-option exam questions written at the application and synthesis levels.
NCLEX RN performance informs curriculum
The SON has an expected BSN program outcome related to the pass rates of its graduates on the nursing licensure exam (NCLEX-RN): 80% on the first attempt and 100% within one year of graduation. Despite meeting the expected outcomes at both time intervals, faculty noted a decline in the first-attempt performance of BSN graduates to 87% (2005/2006) from 92% (2004). To improve results, the faculty incorporated questions and computer testing methods into the curriculum and developed a comprehensive approach to enhancing student performance, progression, and graduation. An NCLEX-RN testing expert was hired to provide individual and group instruction for students on test-taking strategies. In addition, both the testing expert and faculty remediated students who were not performing at the expected level on standardized assessments. The testing expert also conducted workshops to help faculty enhance test-writing skills and increase the level of difficulty of exam questions. Progression policies related to performance on standardized assessments were strengthened to encourage student preparation for these tests and, subsequently, for the licensure examination. In the final semester of the pre-licensure undergraduate program, students enroll in an NCLEX-RN preparation course and complete a comprehensive standardized assessment. Faculty review individual diagnostic reports of student test performance and direct reviews for students whose scores indicate they have less than a 95% probability of being successful on the licensing exam. Since initiation of the standardized assessment and review program, the performance of graduates has consistently improved and exceeded the expected outcome each year. Notably, since 2006, UWG NCLEX-RN first-attempt pass rates increased to 94% in 2007 (U.S. BSN rate = 86%) and 99% in 2008 (U.S. BSN rate = 88%). Of the first Newnan campus graduating class, 100% passed on the first attempt.
All UWG BSN graduates who have attempted the exam have been licensed as RNs within one year of graduation.
Student Advisory Meetings
The SON Total Plan for Evaluation identifies many forms of assessment that inform curriculum and teaching practices. Student Advisory Meetings, conducted annually by the Dean, provide an opportunity for nine cohorts of MSN, RN-BSN, and BSN students to share their thoughts about strengths, weaknesses, threats, and opportunities related to the nursing programs. The Dean conducts these sessions as an open forum and invites constructive critique to inform program and policy development. Faculty review the minutes of these meetings annually to identify trends in concerns that impact curriculum and teaching practices. This formative evaluation process provides the faculty with ongoing student feedback regarding programs of study and enables responsive and timely actions. Recently, undergraduate students voiced concerns related to inadequate pediatric clinical learning experiences and to policies related to the utilization of standardized testing in the BSN program. After review of these comments, faculty increased the number of clinical hours in pediatrics and developed additional pediatric clinical sites to enhance student learning. Faculty also revised policies related to the use of standardized test scores in nursing courses to reflect student achievement more equitably. In response to off-campus RN-BSN student feedback, an elective course will now be offered in the summer, and a communication board (“Grapevine”) for RNs was created in WebCT CourseDen.
Active participation of students on SON committees
Student perspectives on the curriculum, policies, and methods of assessment are important to the SON and obtained through their active participation on curriculum, faculty welfare, student affairs, and evaluation committees. The SON Bylaws (Faculty Handbook, pp. 16-22) identify student membership on these committees and students are selected for membership by their peers. Students regularly raise questions and provide suggestions related to course content, teaching and learning practices, policies, and methods to assess learning and satisfaction. In spring 2009, students on the Caring for Faculty Committee identified a desire to participate in the faculty search process and, as a result, were included in the interviews.
A data-driven process for reviewing and reshaping curricula
The SON employs a Total Plan for Evaluation (TPE) that provides the framework for the systematic, ongoing, and deliberative assessment of quality and effectiveness in relation to the mission, goals, and expected outcomes of the programs offered. The Evaluating the Caring Community Committee facilitates data collection on program outcomes from current and graduating students, alumni, employers, and others in the professional community. All data are reported and reviewed with SON faculty at an annual fall retreat. Areas of concern, and activities to carry out action plans for the academic year, are referred to the appropriate standing committees or other bodies for implementation and follow-up. Documentation of this review process and the action plans for each year is reflected annually in the SON minutes and identified in the TPE annual reports.
The BSN and MSN curricula are reviewed on a regular basis by the Sustaining the Caring Curriculum Committee, as described in the Total Plan for Evaluation (TPE). Review is conducted to determine whether the curriculum and program objectives remain relevant, logically organized, and reflective of current professional standards for nursing education and practice, and whether they prepare graduates to assume the roles of registered professional nurse, nurse educator, and leader. Students are actively involved in the overall evaluation of the BSN and MSN nursing programs via representation on SON committees; student representatives on the Sustaining the Caring Curriculum Committee are encouraged to participate actively in meeting discussions and to provide advisement concerning curricular decisions.
College of Arts and Sciences - Assessment Summary
The College of Arts and Sciences offers undergraduate degrees (BFA, BS, BM, and BA) in 40 major study areas and 12 degrees at the master's and doctoral levels, as well as a number of undergraduate and graduate certificates. Of these degree programs, six are fully accredited. A number of departments house programs that are affiliated with the College of Education, an NCATE-accredited unit.
The responsibility for program assessment in the College lies with the individual departments. Each program determines the types of assessment that are appropriate for the goals and outcomes of that program, and the faculty in the department analyze the data and recommend changes to the curriculum as necessary. Departments report on these assessments at various times, including program review cycles, annual reports, and in some cases, accreditation visits.
In the past year, the College has engaged in an examination of the types of assessment information that departments provide, as well as the process by which the information is provided. College administrators have identified several areas that need improvement and have implemented changes. Specifically, they have engaged in a dialogue about student learning outcomes and mapping assessment activities to those outcomes. As a result, the College has formalized the process for documenting assessment processes. Examples of how this process works are included below.
As the College moves forward, departments will document the collection, recording, and analysis of data according to the formal processes they have established. This will include the documentation of steps taken to improve student learning based on data.