Contents

Introduction
The Findings
A Key Message to Campus Leaders

This report offers a unique and comprehensive view of your students' perceptions regarding your institution. In it, you'll learn how satisfied your students are and what's most important to them, a combination that pinpoints your institution's strengths and areas in need of improvement. Specifically, you'll learn the answers to questions such as:
In essence, you have in your hands a blueprint for improving your institution's effectiveness. You can use this information to identify institutional strengths to highlight in student recruitment; to accelerate your student retention initiatives; to advance your efforts in strategic planning, self-studies for accreditation, and total quality management; and to align your budget decisions with your students' priorities. You'll also find it well worth your while to share the report's findings as encouragement and feedback to your faculty, staff, and students.

To get the most value from student satisfaction studies, we recommend comparing your students' perceptions over time. Annual surveying allows you to provide systematic feedback to your internal and external constituents on the effectiveness of all campus programs and services. You will have the information needed to assess the effectiveness of your special initiatives and to determine priorities for current student populations. Now on to the report!

About the Student Satisfaction Inventory
The Student Satisfaction Inventory measures students' satisfaction with a wide range of college experiences. Principles of consumer theory serve as the basis for the inventory's construction: students are viewed as consumers who have a choice about whether to invest in education and where to enroll, and as individuals who have definite expectations about what they want from their campus experience. From this perspective, satisfaction with college occurs when an expectation is met or exceeded by an institution.

Students rate each item in the inventory by the importance of the specific expectation as well as their satisfaction with how well that expectation is being met. A performance gap is then determined by the difference between the importance rating and the satisfaction rating. Items with large performance gaps indicate areas on campus where students perceive their expectations are not being met adequately.

Because the Student Satisfaction Inventory results in three different scores for each item, a significant amount of information is generated for institutional decision makers. Importance ratings reflect how strongly students feel about the expectation (the higher the score, the more important it is to a student, hence the stronger the expectation). Satisfaction ratings show how satisfied students are that your institution has met the expectation (the higher the score, the more satisfied the student). Performance gap scores (importance rating minus satisfaction rating) show how well you are meeting the expectation overall. A large performance gap score (e.g., 1.5) indicates that the institution is not meeting students' expectations; a small gap score (e.g., .50) or a gap score of zero indicates that the institution is meeting students' expectations; and a negative gap score (e.g., -.25) indicates that the institution is exceeding students' expectations.
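The arithmetic behind these three scores can be sketched in a few lines. The ratings below are hypothetical, invented purely for illustration; they are not SSI data.

```python
# Illustrative sketch: the performance gap for an item is its mean
# importance rating minus its mean satisfaction rating.
# All ratings here are hypothetical, on a 7-point scale.

def mean(ratings):
    return sum(ratings) / len(ratings)

# Hypothetical responses to one item from five students.
importance_ratings = [7, 6, 7, 6, 7]    # how strong the expectation is
satisfaction_ratings = [5, 4, 6, 5, 5]  # how well it is being met

importance = mean(importance_ratings)        # 6.6
satisfaction = mean(satisfaction_ratings)    # 5.0
performance_gap = importance - satisfaction  # about 1.6: expectation not met

print(round(performance_gap, 2))
```

A gap this large would place the item among those where students perceive their expectations are not being met adequately.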
In addition to the information provided by the three measurements for each item, inventory composite scales offer a "global" perspective of your students' responses. The scales provide a good overview of your institution's strengths and areas in need of improvement.

Three versions of the inventory are available: the Community, Junior and Technical College version; the 4-Year College and University version; and the 2-Year Career and Private School version. Each version captures the unique features of the type of institution for which it was developed. At the end of this report, you'll find the version of the instrument your campus used.

Student responses are compared to corresponding national groups as follows: 4-year private institutions are compared with other 4-year private institutions; 4-year public institutions with other 4-year public institutions; community, junior and technical colleges with other community, junior and technical colleges; and 2-year career and private schools with other career and private schools.
The Items

The Student Satisfaction Inventory collects student feedback on over 100 items. Included are:
The Scales

Community, Junior and Technical College Version and Career and Private School Version

For the community, junior and technical college and career and private school versions of the inventory, 70 items of expectation and 6 items that assess the institution's commitment to specific student populations are analyzed statistically and conceptually to provide the following 12 composite scales:

Academic Advising and Counseling Effectiveness assesses the comprehensiveness of your academic advising program. Academic advisors and counselors are evaluated on the basis of their knowledge, competence and personal concern for student success, as well as on their approachability.

Academic Services assesses services students utilize to achieve their academic goals. These services include the library, computer labs, tutoring and study areas.

Admissions and Financial Aid Effectiveness assesses your institution's ability to enroll students in an effective manner. This scale covers issues such as competence and knowledge of admissions counselors, as well as the effectiveness and availability of financial aid programs.

Campus Climate assesses the extent to which your institution provides experiences that promote a sense of campus pride and feelings of belonging. This scale also assesses the effectiveness of your institution's channels of communication for students.

Campus Support Services assesses the quality of your support programs and services which students utilize to make their educational experiences more meaningful and productive. This scale covers career services, orientation, child care, and special programs such as Veterans' Services and support services for displaced homemakers.

Concern for the Individual assesses your institution's commitment to treating each student as an individual. Those groups who frequently deal with students on a personal level (e.g., faculty, advisors, counselors) are included in this assessment.

Instructional Effectiveness assesses your students' academic experience, the curriculum, and the campus's overriding commitment to academic excellence. This comprehensive scale covers areas such as the variety of courses offered, the effectiveness of your faculty in and out of the classroom, and the effectiveness of your adjunct faculty and graduate teaching assistants.

Registration Effectiveness assesses issues associated with registration and billing. This scale also measures your institution's commitment to making this process as smooth and effective as possible.

Responsiveness to Diverse Populations assesses your institution's commitment to specific groups of students enrolled at your institution, e.g., under-represented populations, students with disabilities, commuters, part-time students, and older, returning learners.

Safety and Security assesses your institution's responsiveness to students' personal safety and security on your campus. This scale measures the effectiveness of both security personnel and campus facilities.

Service Excellence assesses the attitude of staff toward students, especially front-line staff. This scale pinpoints the areas of your campus where quality service and personal concern for students are rated most and least favorably.

Student Centeredness assesses your campus's efforts to convey to students that they are important to the institution. This scale measures your institution's attitude toward students and the extent to which they feel welcome and valued.

Some items on the inventory contribute to more than one scale. In addition, four items (numbers 3, 9, 53 and 68) are not included in any of the two-year scales.
The Scales

4-Year College and University Version

For the 4-year college and university version of the inventory, 73 items of expectation and 6 items that assess the institution's commitment to specific student populations are analyzed statistically and conceptually to provide the following 12 composite scales:

Academic Advising Effectiveness assesses the comprehensiveness of your academic advising program. Academic advisors are evaluated on the basis of their knowledge, competence and personal concern for student success, as well as on their approachability.

Campus Climate assesses the extent to which your institution provides experiences which promote a sense of campus pride and feelings of belonging. This scale also assesses the effectiveness of your institution's channels of communication for students.

Campus Life assesses the effectiveness of student life programs offered by your institution, covering issues ranging from athletics to residence life. This scale also assesses campus policies and procedures to determine students' perceptions of their rights and responsibilities.

Campus Support Services assesses the quality of your support programs and services which students utilize in order to make their educational experiences more meaningful and productive. This scale covers areas such as tutoring, the adequacy of the library and computer labs, and the availability of academic and career services.

Concern for the Individual assesses your institution's commitment to treating each student as an individual. Those groups who frequently deal with students on a personal level (e.g., faculty, advisors, counselors, residence hall staff) are included in this assessment.

Instructional Effectiveness assesses your students' academic experience, your curriculum, and your campus's overriding commitment to academic excellence. This comprehensive scale covers areas such as the variety of courses offered, the effectiveness of your faculty in and out of the classroom, and the effectiveness of your adjunct faculty and graduate teaching assistants.

Recruitment and Financial Aid Effectiveness assesses your institution's ability to enroll students in an effective manner. This scale covers issues such as competence and knowledge of admissions counselors, as well as the effectiveness and availability of financial aid programs.

Registration Effectiveness assesses issues associated with registration and billing. This scale also measures your institution's commitment to making this process as smooth and effective as possible.

Responsiveness to Diverse Populations assesses your institution's commitment to specific groups of students enrolled at your institution, e.g., underrepresented populations, students with disabilities, commuters, part-time students, and older, returning learners.

Safety and Security assesses your institution's responsiveness to students' personal safety and security on your campus. This scale measures the effectiveness of both security personnel and campus facilities.

Service Excellence assesses the perceived attitude of your staff toward students, especially front-line staff. This scale pinpoints the areas of your campus where quality service and personal concern for students are rated most and least favorably.

Student Centeredness assesses your campus's efforts to convey to students that they are important to your institution. This scale measures the extent to which students feel welcome and valued.

Some items on the inventory contribute to more than one scale. In addition, two items (numbers 35 and 72) are not included in any of the four-year scales.
Reliability and Validity

The Student Satisfaction Inventory is a highly reliable instrument. Both the two-year and four-year versions of the SSI show exceptionally high internal reliability: Cronbach's coefficient alpha is .97 for the set of importance scores and .98 for the set of satisfaction scores. The instrument also demonstrates good score reliability over time; the three-week test-retest reliability coefficient is .85 for importance scores and .84 for satisfaction scores.

There is also evidence to support the validity of the Student Satisfaction Inventory. Convergent validity was assessed by correlating satisfaction scores from the SSI with satisfaction scores from the College Student Satisfaction Questionnaire (CSSQ), another statistically reliable satisfaction instrument. The Pearson correlation between the two instruments (r = .71; p < .00001) is high enough to indicate that the SSI's satisfaction scores measure the same satisfaction construct as the CSSQ's scores, yet low enough to indicate that there are distinct differences between the two instruments.
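For readers curious how an internal-consistency figure such as the .97 alpha is derived, the standard formula for Cronbach's coefficient alpha can be computed as below. The ratings are invented for illustration and are not SSI data; nothing here reproduces the publisher's actual analysis.

```python
# Cronbach's coefficient alpha (standard formula):
#   alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
# where k is the number of items. Hypothetical data, for illustration only.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)  # population variance

def cronbach_alpha(item_scores):
    # item_scores: one list per item, each holding one rating per respondent
    k = len(item_scores)
    totals = [sum(col) for col in zip(*item_scores)]  # per-respondent totals
    item_var_sum = sum(variance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Three hypothetical items rated by five respondents on a 7-point scale.
items = [
    [5, 6, 7, 4, 6],
    [5, 7, 6, 4, 6],
    [4, 6, 7, 5, 6],
]
print(round(cronbach_alpha(items), 2))  # about 0.9 for this made-up data
```

Higher alpha means respondents answer related items consistently; values near .97, as reported for the SSI, indicate exceptionally high internal consistency.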
The Inventory Authors

The Student Satisfaction Inventory was developed by Laurie A. Schreiner, Ph.D., and Stephanie L. Juillerat, Ph.D., with assistance from Noel-Levitz. Dr. Schreiner is associate dean and professor of psychology at Eastern College in St. Davids, Pennsylvania, and Dr. Juillerat is assistant professor of psychology at Azusa Pacific University in Azusa, California.
A Word About Noel-Levitz

Noel-Levitz is the nation's preeminent consulting firm providing comprehensive programs and services to colleges, universities, and postsecondary systems throughout North America. Since its founding in 1984, the higher education professionals at Noel-Levitz have consulted directly with over 1,500 colleges and universities nationwide in the areas of:
Noel-Levitz has developed an array of proven tools including software programs, diagnostic tools and instruments, video-based training programs, and customized consultations, workshops and national conferences. With the Student Satisfaction Inventory, the Institutional Priorities Survey, and the Adult Student Priorities Survey (for students 25 and older), the firm brings together its many years of research and campus-based experience to enable you to get to the heart of your campus agenda. Our alliance with the USA Education family of companies has linked our content expertise to new technologies and services that together ensure top results for our clients. For more information, contact:
As you review your results, it is important to consider all of the information provided. Three areas of measurement are especially significant: importance, satisfaction, and performance gaps (the difference between importance and satisfaction). Focusing on only one area of measurement, such as performance gaps, is likely to result in overlooking areas of the campus experience that your students value most. A combination of scores provides the most dynamic information for institutions to consider when developing an action agenda. Using the matrix below helps the institution conceptualize its student satisfaction data by both retention priorities and marketing opportunities. In addition, it helps pinpoint areas where resources can be redirected from areas of low expectation to areas of high expectation.

Matrix for Prioritizing Action
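The quadrant logic behind such a matrix can be sketched as follows. The cutoff values and item scores here are invented for illustration; the report defines the quadrants conceptually, not numerically, so where a campus draws its cutoffs is an assumption.

```python
# Minimal sketch of an importance/satisfaction prioritization matrix.
# Cutoffs and scores below are assumed for illustration only.

def quadrant(importance, satisfaction, imp_cutoff, sat_cutoff):
    if importance >= imp_cutoff:
        if satisfaction < sat_cutoff:
            return "retention priority"      # highly valued, expectations unmet
        return "strength to market"          # highly valued and well delivered
    return "candidate for redirected resources"  # low expectation

# Hypothetical (importance, satisfaction) means on a 7-point scale.
items = {
    "Academic advising": (6.5, 5.2),
    "Registration":      (6.4, 6.1),
    "Campus events":     (4.1, 5.8),
}

imp_cutoff = sat_cutoff = 5.5  # assumed cutoffs, e.g., campus-wide medians
for name, (imp, sat) in items.items():
    print(name, "->", quadrant(imp, sat, imp_cutoff, sat_cutoff))
```

Items landing in the high-importance/low-satisfaction cell are the retention priorities; high-importance/high-satisfaction items are strengths to highlight in recruitment; low-importance items mark areas from which resources might be redirected.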
The national comparison scores indicated throughout the report are for institutions similar to your own. For example, if you are a 4-year private institution, your scores are compared to 4-year private institutions. The national comparison scores are specific to 4-year private institutions, 4-year public institutions, community, junior and technical colleges, or to 2-year career and private schools. Each section of the Campus Report has a distinct purpose, as described below.
Demographic Summary

The two-page Demographic Summary reveals your students' responses to 13 standard demographic items and to two optional items your institution may have defined. Frequency and percentage scores are reported for each item. To learn how the optional items were defined, please consult your institution's inventory administrator.
Scale Summary

This section of the report presents the summary scores for the 12 scales in the traditional chart format. The three areas of measurement for each scale (importance, satisfaction, and performance gap) are presented for your institution's data alongside those of the national comparison group. In addition, standard deviations (variability of responses) are presented for the satisfaction means for both your institution and the national group.
Institutional Summary

This section of the report presents all inventory data in a traditional chart format. The three areas of measurement for each scale and item (importance, satisfaction, and performance gap) are presented for your institution's data alongside those of the national comparison group. In addition, standard deviations (variability of responses) are presented for the satisfaction means for both your institution and the national group.

The last column shows the difference between your institution's satisfaction means and the national group's satisfaction means. If the mean difference is a positive number, your students are more satisfied than the students in the national comparison group; if it is negative, your students are less satisfied. The statistical significance of the difference between these means has also been calculated. The key for the levels of significance appears at the bottom of each page. The greater the number of asterisks, the greater the confidence that the difference is significant and did not occur by chance. For example, statistical significance at the .05 level indicates that there are five chances in 100 that the difference between your institution's satisfaction score and the national comparison group's satisfaction score would occur due to chance alone. The .01 level indicates a one in 100 chance, and the .001 level a one in 1,000 chance.

Means for importance and satisfaction are calculated by summing respondents' ratings and dividing by the number of respondents. The performance gap means are calculated by taking the difference between the importance rating and the satisfaction rating. Four charts are included in this section:
Please note:
Summary Items

This brief section measures overall student satisfaction with your campus by revealing the extent to which students perceive their expectations have been met, their overall level of satisfaction, and the likelihood that they would enroll again at your institution if they had it to do all over again. The means and standard deviations for both your campus and the national group are reported along with the differences between the two means.
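The mean and significance calculations described in the Institutional Summary section can be sketched as below. The ratings are invented, and the large-sample z-approximation is an assumption standing in for whatever test the report actually applies; it is shown only to make the asterisk key concrete.

```python
# Sketch: satisfaction means are summed ratings divided by the number of
# respondents; the campus-vs-national mean difference is then tested for
# significance. The z-approximation below is an illustrative assumption,
# reasonable only for large samples. Data are hypothetical.
import math
from statistics import NormalDist, mean, stdev

def mean_difference_test(campus, national):
    diff = mean(campus) - mean(national)
    se = math.sqrt(stdev(campus) ** 2 / len(campus)
                   + stdev(national) ** 2 / len(national))
    p = 2 * (1 - NormalDist().cdf(abs(diff / se)))  # two-tailed p-value
    # Asterisk key, as at the bottom of each report page.
    stars = "***" if p < .001 else "**" if p < .01 else "*" if p < .05 else ""
    return diff, p, stars

# Hypothetical satisfaction ratings for one item.
campus = [6, 5, 6, 7, 6, 5, 6, 7, 6, 6]    # campus mean 6.0
national = [5, 5, 6, 5, 4, 6, 5, 5, 6, 5]  # national mean 5.2

diff, p, stars = mean_difference_test(campus, national)
print(round(diff, 2), stars)  # difference of about 0.8, flagged "**"
```

In this made-up case the campus mean exceeds the national mean by about .8, and the difference is significant at the .01 level, so it would carry two asterisks in the report's notation.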
Comparison to Other Institutions

This section offers CCCCD's results for the 12 scales in comparison to other community colleges. For item-to-item comparisons, please email lfernandez@collin.edu.
Target Group Reports

Optional Target Group Reports, if requested by your institution, appear in one of the formats described below. These reports focus on specific groups of students on your campus. The target groups are defined by the items in the Demographic Summary section of this report.

The first Target Group Report format, the Comparative Summary Analysis, offers a quick synopsis of the scores for your chosen target group(s). At a glance, you can compare your overall campus scores with such groups as males, females, full-time, part-time, day, evening, first-year, second-year, and any other group for whom you have demographic data. Scale results are presented in alphabetical order, followed by item results in order of importance to students at your institution. For easy reference, you'll also see your overall campus scores alongside the composite national comparison group. The national comparisons are specific to institutions like yours, but not specific to the target group.

The second Target Group Report format, the Single Group Analysis, is similar to the Campus Report but focuses on only one target group (e.g., female students, full-time students, evening students, or any other group for whom you have demographic data). Like the Campus Report, this analysis includes a demographic summary, a complete review of scale and item scores, and the summary items. The national comparison group data provided is for the selected target group at similar institutions. For example, if you selected part-time students and your national comparison group is community, junior and technical colleges, the Single Group Analysis will provide national comparison data for part-time students at other community, junior and technical colleges.

Custom Target Group Report options include:
Optional Comparison Reports, if requested by your institution, appear in one of the formats described below.
Free Phone Consultation

To review your results and to discuss ideas for next steps on campus, feel free to call us at 1-800-876-1117, or e-mail julie-bryant@noellevitz.com to arrange a convenient time to meet. An on-campus executive summary consultation is also available: a Noel-Levitz consultant will present and review your data with campus constituencies. Additional fees apply; call 1-800-876-1117 for more information.
Setting Priorities and Direction

Now that you've identified the expectations of your students, you are ready to take the next critical step. To effectively impact your campus, you'll want to focus on key campus issues that have been brought to light by this report, then proceed to develop awareness, increase readiness for action planning, and ultimately design and implement your action agenda. There are a variety of ways that campuses like yours are telling us they are using the Student Satisfaction Inventory data. These include:
Institutions are telling us that sharing the information with all campus constituencies is important to begin the improvement process. It is important to balance the identified strengths and weaknesses when disseminating the information. Cabinet and trustee meetings, faculty meetings, committee sessions, the student newspaper, and the student government are all vehicles being utilized on campuses to share the data and to begin assembling feedback. Further discussion in focus group sessions is a popular method of providing additional clarification of particular items and of beginning problem-solving in targeted areas. (Campuses combining the use of the Student Satisfaction Inventory with the Institutional Priorities Survey focus first on those issues identified as priorities for action by both students and campus personnel.)