We need customer/employee data in order to make some key decisions based on real opinions - I know, let's put out a survey! OK, survey is launched, now let's wait for all the responses. Huh, we only got 4 responses and half of those are from project team members. Large womp.
Does this scenario sound painfully familiar to you? Think the lack of response is due to so-called "survey fatigue*"? Could be, but it could also be that you're just going through the motions in preparing a survey. Here are some things I've done that have consistently gotten me a response rate of at least 40%** on surveys, often much higher:
Shorter is definitely sweeter
I once worked on a project where the person in charge insisted on a survey with over 100 questions, citing statistical accuracy. Accuracy or not, very few people have the time or patience to answer 100 questions on a survey. Not only that, but if they do make it to question 100, they'll be so tired and annoyed that their answers will probably be tainted. (Ever answer 3 on a scale of 1-5 on multiple questions in a row just to get it over with? Yeah.)
I like to go with an absolute maximum of 10 questions, but I think 4 is the sweet spot. 4 is completely manageable, and when you tell your audience that the survey is only 4 questions or will only take them 2 minutes, they're more likely to go for it. But what if you have way more than 4 questions to ask? Your options are to analyze and prioritize your questions, OR to split your survey into multiple parts. Just keep in mind that with each subsequent survey you send out, you'll get drop-off as respondents lose interest/patience.
Order is important
Ask your harder questions first and leave demographic questions until the end. This way, you're spending the respondent's energy and attention on the harder stuff, and the easy questions (name, location, etc.) make for a quick closer.
Don’t ask for what you already know
That reminds me - don't ask for information that you can easily get yourself. With Google Forms, for example, you have the option of automatically recording the respondent's email address. Don't ask for the email address again if you're already collecting it automatically! Likewise, if you're sending the survey to a customer via a personalized link, you already have all their identification details, so don't ask them to fill out their name again. That's asking them to do unnecessary extra work, and it reveals that you don't know how to make the best use of your tool.
Yes and no doesn't tell you much
Let's say you want to measure the usefulness of a product. You could ask, "Do you find this product useful?" with choices of yes and no for the response. Sure, the data is easy to collect, but is it accurate? Maybe the answer is actually "kinda", but you've trapped the respondent into answering either yes or no. It's too black and white. What would be more useful is asking, "To what extent do you find this product useful?" with a scale for the response (1-5 or 1-10). With this one, it's important to analyze the data carefully too. If the average of the responses comes out somewhere in the middle, but you actually had many responses on the low end and many on the high end, that's worth exploring: a polarized audience is a very different story from a lukewarm one.
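To make that last point concrete, here's a quick sketch in Python (with made-up numbers purely for illustration) of why the average alone can mislead: a perfectly middle-of-the-road mean can hide an audience that's split between loving and hating the product.

```python
# Hypothetical 1-5 scale responses - illustrative data only.
from statistics import mean, stdev

responses = [1, 1, 2, 5, 5, 4, 1, 5, 2, 4]

avg = mean(responses)       # the "headline" number
spread = stdev(responses)   # a high spread hints at polarization
low = sum(1 for r in responses if r <= 2)   # detractors (1-2)
high = sum(1 for r in responses if r >= 4)  # fans (4-5)

print(f"mean={avg:.1f}, stdev={spread:.1f}, low(1-2)={low}, high(4-5)={high}")
# The mean is a bland 3.0, yet almost no one actually answered 3:
# half the respondents are at the low end and half at the high end.
```

Even eyeballing a simple count of low vs. high scores like this, before trusting the mean, can change the conclusion you draw from the exact same data.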
Sweeten the deal
A spoonful of sugar helps the medicine go down, it has been said (sung). Offering a little something to each person who answers your survey, to show them that you value their time, can help increase your response rate. If you can't afford to give an individual incentive to each respondent, you can enter them into a draw for a prize. It doesn't have to be a big prize - some people just like the thrill of a win, e.g. swag, a gift card, treats.
Mandatory vs. Optional
If your survey is short, then all questions should be mandatory by default. Ask yourself: if a question is optional, do you actually need to ask it at all? The only type of question I would ever make optional is an additional comment field. These can be really, really useful for getting more context and detail on why someone answered the way they did, but not everyone has the time or interest to share. Making them optional means that those who WANT to share (i.e. those who feel strongly about the subject) will do so, while those who don't won't get annoyed and quit the survey or simply write "N/A".
Anonymity = better response rate + better insights
Many people are kind-hearted and don't want to hurt anyone's feelings with negative feedback. Unfortunately, hearing the bad stuff is exactly what is needed in order to make improvements. Ensuring anonymity with your survey will get you more responses, more honesty, and more useful information. As an added reinforcement, I like to ask people specifically for their frankness in my survey invitation because that is what will help me the most.
Finally, be honest about your survey intentions. Are you really planning to use the data from your survey to make decisions? Or have you already decided what you're going to do regardless of the answers, and are just doing the survey to check a box? If it's the latter, why waste your time? That might seem harsh, but truly, if you know you're not going to do anything differently, then running a survey is a waste of resources. Instead, take a look at your plan and build in a commitment to make changes based on what comes back from the survey. If the responses end up confirming your assumptions, you've just unlocked some free time - bonus!
Surveys are a tried, tested, and true way of finding out what people think. If you really want to know, you have to ask. Until we have the ability to read minds, if we don't ask, we're only guessing. And some decisions are just too big to base on a guess.
* * *
In her role as People Strategist, Heidi sometimes uses surveys to learn more about what team members are thinking to design programs that work for them.
* It's probably not survey fatigue. How do I know? I did a survey on it.
** According to this, a good response rate for a customer satisfaction survey ranges from 5% to 30%