Surveys: On-Campus

The Office of Institutional Effectiveness can provide accounts through a license agreement for faculty and staff to administer surveys to the campus community. The survey tool Qualtrics was chosen for its ease of use combined with an advanced feature set. Qualtrics is a web-based survey service that allows users to easily create surveys, collect and store data, and produce reports.

Surveys intended for distribution to members of the campus community should be approved by the Office of Institutional Effectiveness. This oversight of on-campus surveys serves the following purposes:

  • Protect the rights, privacy, and safety of potential survey respondents.
  • Facilitate the development and administration of high-quality survey tools.
  • Minimize survey fatigue of campus community populations.
  • Eliminate collection of duplicate information.
  • Ensure effective dissemination of survey results within the campus community.

Survey Policy

Surveys are often administered to students, faculty, and staff to collect data for program improvements. Use of such data is encouraged, but surveys must be administered with care. Online surveys have created opportunities for duplicative efforts, excessive use of staff time, and over-surveying of campus groups. An improvement in the timing and quality of surveys will help Stritch administrators get the most out of survey data.

Survey research must be monitored to control the number and type of surveys administered, who may conduct surveys on campus, and for what purpose.

Assistance with Survey Development, Administration and Reporting

The Office of Institutional Effectiveness will provide assistance with the development, deployment, and analysis of surveys conducted by academic departments, committees, and administrative units.

Alternatives to Survey Research

We must be cognizant of survey fatigue on campus, both to avoid burdening members of the campus community with unwanted email and to maintain good response rates for important survey endeavors. It is important, therefore, to consider alternatives to survey research when possible.

Focus groups and interviews: Face-to-face techniques can be structured to provide very useful information, and they allow for follow-up questions that capture nuances difficult to learn through a traditional survey. These strategies may eliminate the need for a survey, or may at least inform and hone the questions to be included on one.

Institutional data: Existing information available from the Office of Institutional Effectiveness or other administrative offices about courses taken, athletics participation, changes of major, graduation rates, degrees earned, etc. may answer many of your questions. This information can and should be used in lieu of surveying students.

Collect other (non-survey) data: Record performance indicators that address your question or can be compared against a goal. For example, administrative departments may be able to easily track the services for which students visit the office, rather than surveying students later about the reason(s) for their visit. Tracking such data is more accurate and yields more useful information than a retrospective survey. Also consider measuring your department's goals against the level of service actually delivered, without surveying students about their experience with that service. For example, rather than asking students about their satisfaction with a service offering, determine this from the number of times students visited, the length of time students waited for assistance or an appointment, how quickly a situation was resolved, or the number of individuals who had to be contacted before it was resolved.

Alternative assessment techniques: Workshops and other instructional or service activities lend themselves well to assessments beyond satisfaction surveys. Think about the outcomes you'd like to see, then find ways to determine whether they happen. Examples:

  • Identify behaviors that link to the goals of the activity, and see whether those behaviors differ between participants and non-participants, or before and after the activity. Perhaps the number of counseling appointments increased after a workshop on coping mechanisms for depression.
  • Embed an exercise near the end of an activity that demonstrates learning of the key points. Aggregate performance on the exercise will reflect the effectiveness of the delivery technique or the information presented. This method also provides immediate feedback about whether learning is happening and the information is getting across.