Survey fatigue or over-surveying is something to recognize and guard against.
Students can feel burnt out from requests for their feedback about academic courses, campus events, and general college experiences — all on top of customer satisfaction surveys they receive from retailers.
Survey fatigue can lead to declining response rates, and data collection is only worthwhile when you gather data of sufficient quantity and quality. Account for this whenever you coordinate assessment projects.
Before I say more, it is important to identify the most appropriate data collection method according to your needs. Although surveys are familiar to staff and somewhat easy to create, they shouldn’t always be your default assessment option. There are many data collection methods that should be considered in addition to surveys. First identify what you intend to measure, then determine the best way to measure it. (My favorite method is rubrics, but see page 25 of this assessment guide for more.)
When surveying is the most appropriate method, there are a number of considerations and strategies to weigh related to design, administration, use, and deliberate engagement of stakeholders. These elements make for good survey methodology while also increasing your response rates!
Toss out nice-to-know questions and any question whose data you can’t pinpoint a specific use for.
This way, students should only receive relevant questions and not have to select N/A if we know it wouldn’t apply to them. (For example, fourth-year students only get questions relevant to their experience vs. first-years.)
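As a sketch of how skip logic might route students to only the questions that apply to them (the class years and question text here are invented for illustration; real survey platforms configure this through their own branching tools):

```python
# Hypothetical skip-logic sketch: route a respondent to question blocks
# based on their class year. Question text is invented for illustration.
COMMON = ["How satisfied are you with campus events overall?"]

FOLLOW_UPS = {
    "first-year": ["How helpful was orientation?", "Have you met your advisor?"],
    "fourth-year": ["How prepared do you feel for graduation?"],
}

def questions_for(class_year: str) -> list[str]:
    """Return only the questions relevant to this student, so they never see N/A items."""
    return COMMON + FOLLOW_UPS.get(class_year, [])
```

The payoff is that a fourth-year student never sees (or has to skip past) first-year questions, and vice versa.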
As long as you collect a common identifier (like a student’s email or ID number), you could obtain data like demographics, class year, and major from student information systems.
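A minimal sketch of that join, assuming hypothetical records and field names (a real student information system export would have its own schema):

```python
# Sketch: enrich survey responses with student-record data via a shared
# identifier (email). Records, emails, and fields are invented for illustration.
student_records = {
    "ana@school.edu": {"class_year": "fourth-year", "major": "Biology"},
    "ben@school.edu": {"class_year": "first-year", "major": "History"},
}

responses = [
    {"email": "ana@school.edu", "satisfaction": 4},
    {"email": "ben@school.edu", "satisfaction": 5},
]

def enrich(responses: list[dict], records: dict) -> list[dict]:
    """Attach demographics from the student information system to each response."""
    return [{**r, **records.get(r["email"], {})} for r in responses]
```

Because the demographics come from the system of record, you skip those questions entirely and shorten the survey.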
These sorts of questions are quick and easy to fill out – so placing them at the end takes little effort to close out the instrument – but putting them early in the survey delays getting to the “point” of the survey. The only exception is when you need a demographic question for skip-logic purposes (such as capturing class year to determine follow-up questions).
You may think that your questions make sense, but since you’re not the targeted survey taker, be sure to get feedback from your intended audience and adjust accordingly.
It’s always disappointing to check your survey results only to see a question was worded incorrectly, that you missed including a response option, or that the logic and links didn’t work. The good news is that, with testing, you can catch those omissions and errors before sending out your survey.
Realize students may access and respond to your survey from various devices — including laptops, smartphones, and tablets. So, it’s critical to make sure your survey adapts and works well across devices.
Consider the academic calendar, student schedules, and student attention patterns when determining when to send out your surveys and ask for responses back.
To avoid sending an email invitation out of the blue, send a pre-announcement explaining the what, why, and when of a survey to students. Then, send your actual survey invitation and follow up, beyond reminders, to thank students for their attention, consideration, and participation. As a bonus, share some results and implications for the data in the follow-up.
The latter note of going only to non-respondents is key: when possible, don’t send reminders to everyone, as it’ll seem like you didn’t recognize the people who have responded. Also, know that sending 2-3 reminders while the survey is active is plenty.
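Targeting only non-respondents is just a set difference between who was invited and who has responded, which most survey tools compute for you. A sketch, with invented email addresses:

```python
# Sketch: send reminders only to invitees who have not yet responded.
# Email addresses are invented for illustration.
invited = {"ana@school.edu", "ben@school.edu", "cam@school.edu"}
responded = {"ben@school.edu"}

def reminder_list(invited: set[str], responded: set[str]) -> set[str]:
    """Everyone invited who has not responded yet."""
    return invited - responded
```

Recompute the list before each of your 2-3 reminders so early responders drop off immediately.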
If at all possible, address the students by name in the invitation and share how their perspective is relevant and important to your data collection purpose.
Similar to the last point, use language to generate curiosity, empower students’ voices, and increase their desire to engage on the topic. For example, a survey about future campus-wide events could stress the desire to allocate resources to be relevant and aligned with student interests, making their response to the survey important to capture diverse perspectives of the student body.
Beyond just saying why you are collecting the data, talk about how you will use the results.
Be sure to include a note about confidentiality (that you’ll only share responses with relevant decision-makers) and anonymity (that there is no way to connect responses to any student’s name) if those are part of your methodology.
Leverage survey technology estimates or use a stopwatch to time students piloting your survey. Then, include an estimate in the invitation about how long it will take to complete the survey. This should motivate your design to only contain the necessary information in order to be quick, but also give you an opportunity to be transparent with expectations when it is a long or involved survey.
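One simple way to turn pilot timings into the estimate you quote, using the median so one distracted tester doesn’t skew the number (the timings below are invented for illustration):

```python
# Sketch: convert pilot-tester timings (seconds) into the whole-minute
# estimate quoted in the invitation. Timings are invented for illustration.
import math
from statistics import median

def estimated_minutes(pilot_seconds: list[float]) -> int:
    """Round the median pilot completion time up to a whole minute."""
    return math.ceil(median(pilot_seconds) / 60)
```

Rounding up keeps the stated estimate honest; quoting “about 6 minutes” and delivering 5 beats the reverse.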
A great way to generate interest and engage your readers is to offer more than text; share some data! For example, you could write about past student dissatisfaction with event scheduling which led to more convenient, future offerings. This is a great way to demonstrate that past data was reviewed, communicating to students that you will indeed review their data should they participate.
Some survey technologies allow you to embed the first survey question in the email invitation. Doing so is a great way to encourage participation and get response collection underway.
To best inform your strategic thinking about survey priorities, make sure to get student input. As much as you may think you know what’s best or pressing for students, it helps to hear from them directly about priorities and needs.
Students can be great design collaborators. They can help provide question language, give feedback on tone, and comment on anticipated student experiences. A later pilot review of the designed instrument by other students is still valuable, as not all students are the same or think alike, nor does the student voice in the design necessarily shine through clearly in the resulting survey.
Do your homework to present them options, but value their feedback on when to send out the survey, which email address it comes from, and any marketing campaigns you might consider beyond the invitations.
Use your learning management system (LMS), student engagement platform, and website to promote your survey and increase students’ access to it. Think about where students spend their time online and cater to their digital domain.
Just as students have valuable perspectives on strategy and design, they can offer brilliant observations for interpreting results. Involving students reinforces that their voices are being heard, which could encourage future peer-to-peer promotion and motivation to participate in surveys.
Be sure to re-engage students once data is analyzed to share the results and, even better, announce the improvements you plan to make based on the survey findings.
Do not go through this process alone. If you’re collecting data on behalf of one or multiple offices, make sure those employees are aware of the purpose and value of the survey.
Leadership involvement signals importance; include their voices and invite their presence and perspective in your process.
Literature and research can reinforce the credibility and value of surveys. JMU’s instrument design information, covering psychometric properties like validity and reliability, illustrates how surveys can be designed with the utmost integrity. Resources like these can help equip, prepare, and build competence for folks to know more about and engage in the process.
Sharing response rates with staff can keep them updated on data collection status. Plus, this data can be used to generate group and individual conversations on what is or isn’t working with your survey creation and promotion.
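The figure to share is simply responses as a share of invitations; a sketch with invented counts:

```python
# Sketch: a running response-rate figure to share with staff.
# Counts are invented for illustration.
def response_rate(responses: int, invitations: int) -> float:
    """Percentage of invitees who have responded so far, to one decimal."""
    return round(100 * responses / invitations, 1)
```

Tracking this at each reminder also shows which nudge actually moved the number.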
Even if you are only issuing one survey a year, you can discuss data needs and priorities, content considerations, design process and review, updates on administration and data collection, analysis discussion, sharing results, identifying actions to be taken, and plans for future data needs.
Reinforce the relevance and importance of the collected data in answering questions you had or providing essential information about your services and student learning. It can even be a big deal to celebrate a stellar response rate.
Discussing results with students reiterates that they are stakeholders and not just your subjects of study.
Not everyone is going to be part of the survey process, so share broadly (but intentionally!) for awareness and to open the door for collaboration regarding data implications.
When designed well, each question will hopefully yield answers to essential questions and have implications to improve your programs and services.
Check that your questions gathered the anticipated data to satisfy your objectives, then consider what changes you should make for future surveys.
Beyond future surveys, consider if results yielded from the survey necessitate any follow-up surveys or focus groups to learn more about a particular issue, topic, or outcome.
Beyond informing future assessment efforts, results and actions for improvement can be used as part of marketing for your programs and services. When students know that programs are based on interests collected from students or have been improved based on past student feedback, they may be more likely to engage.
Remember: a survey is more than just design and data collection. Factor in time to plan, design, pilot and test, revise, administer, collect data, analyze data, report, and put together action plans based on results. By setting aside appropriate time for the life of your survey, you’ll likely be able to engage and involve students or otherwise best execute a survey experience to garner optimal participation and results.
Don’t forget you can carve out time for students to complete your survey while participating in your event, program, or class experience.
Take time to think of the color, logo, and message you intend to convey to students as they receive and experience your survey. It can be simple; consistency in colors, font, survey naming convention, and tone with email invitations and instructions can go a long way.
You could consider incentives for standalone surveys or a reward program for completing multiple surveys. But be aware that incentives can attract students who only want the prize, so be prepared for inauthentic or inaccurate responses from some students and perhaps only offer incentives for big or critical surveys.
Some institutions have had incredible success coordinating with leadership to arrange for assessment day(s) when there are no classes, but students and employees are asked to engage in various assessment activities all day. This helps encourage participation with the promise of focused surveying at a given time of year instead of spreading it out across the calendar.
While there are, indeed, free tools and software available for surveys, you usually have to pay to have access to features like skip logic, non-respondent reminders, invitation personalization or custom coding tools, and robust reporting.
Feel free to let us know if you have additional tips or tricks! We’d also love to know which tip is your favorite or will be a priority starting point for your surveys. Connect with us on Twitter @themoderncampus and @JoeBooksLevy.