Guide to designing a survey

A guide to designing a survey, from the user experience to survey outcomes and focus areas

Written by Jared Ellis

What can I learn from this page?

A guide to designing a survey, from the user experience to survey outcomes and focus areas

Who is this guide for?

Account Admins, Survey Admins, Survey Creators

We spend a lot of time at Culture Amp (internally and with other People Geeks) writing, editing, and testing out surveys; that’s why we have so many templates for our customers to use. We always recommend sticking to these templates because of the rigor we put into developing them (just check out our “Science Behind” articles) and the platform features that come with templated items (like benchmarks and inspirations).

But we also understand that there are reasons to customize, or that we may not have the right template for you. This article shares what we’ve learned about the art and science of designing surveys. Before you dive headfirst into question crafting, we want to cover a few broader concepts related to your survey goals and how these might guide you in selecting and writing the right questions. Here are our thoughts on:

  • User experience

  • Survey length

  • Outcomes and focus

  • Coverage

💡 Tip: Prefer to learn about survey design via a video course? Click here (US server link)

1. Survey User Experience (UX)

These days most employees have been exposed to hundreds of online applications designed to minimize cognitive load and be quickly 'usable'. We think this has made the survey user experience (UX) more important than ever. A great survey UX not only improves response rates, but also allows participants to direct more cognitive effort at answering the survey questions rather than at working out where or what to click.

Recommendations:

  • Use rating questions for the majority of your survey: You want to use a response format that allows people to focus on the question’s meaning, not on learning what all the options are. Research has shown that, for rating scales, 5-7 response options are sufficient for good reliability and validity. Most Culture Amp surveys use a 5-point agreement scale to provide a strong anchor for each survey question. This avoids biases that might be present in more ambiguous response formats. Additionally, using fewer response options makes responding easier and faster (especially on a phone). Other things that will slow people down and possibly confuse them are switching response formats, asking them to rank too many options, and anything else that stops the flow of their experience. We use the agreement scale as much as possible for this reason, though we do have alternatives.

  • Write great questions:

    • Avoid grammatical negatives as much as possible (e.g., not, do not, un-): mixing positive and negative statements can confuse survey takers

    • Statements should be fairly simple and believable (e.g., avoid using extreme language such as always, never, and best unless it is really appropriate or there is a reason to create a higher bar)

    • Try to avoid double-barreled questions: statements where people could agree with one part but disagree with another (e.g., I am happy and active at work), since these are difficult for survey takers to answer and for you to act on

    • Try to ask about things that people will know about (e.g., it's easier for someone to know whether they feel something is 'fair' than whether it is 'accurate'). This can reduce the number of neutral responses due to lack of knowledge

  • Place free-text questions at the end of the survey: Switching from one question type to another increases the cognitive load required of the survey taker. Free-text questions throughout your survey can disrupt the flow of responding to rating questions. On Culture Amp surveys, employees can pause and write comments on any rating question if they feel compelled to do so, and additional free-text questions are all located at the end of the template. This can be a great way to capture closing thoughts from employees, as well.

2. Survey length

Employees are taking time out of their (probably busy!) days to share their feedback with you, so you want to make sure you are optimizing your survey length to get the most helpful feedback. This can be a bit of a balancing act: you’re likely excited to learn everything you can about your culture, but your employees may not be keen on answering hundreds of questions.

We’ve found that 10 minutes or less is ideal, which is why all of our templates fall below this timeframe. It is short enough to feel reasonable to employees and also prevents a phenomenon called satisficing: the more questions you ask, the less time people will spend providing thoughtful answers. Although you might want answers to lots of questions to get the detail and coverage you’d like, longer surveys are going to decrease the quality of your data and your participation rates.

Recommendations:

  • Include a maximum of 50-60 rating questions in your survey: Because most people can answer around 50-60 rating-type questions in 10 minutes (if the questions are good and the format is kept simple), we suggest that as a maximum. This will also help you work out how many questions you can ask on each topic you want to cover. For example, 10 topics means about 5-6 questions for each one.

  • Limit free-text questions to just a few: As mentioned, survey takers can comment on any rating question, so you will likely capture targeted comments throughout the survey. Most organizations want to add in a few free-text questions as well: we recommend no more than three (located at the end!) that can function as a catch-all for additional thoughts from employees as they wrap up the survey.

💡 Tip: Give employees an estimate for how long the survey might take. Let folks know that you expect the survey will take about 10 minutes, but that they are of course welcome to spend more time, especially if they’d like to write a number of comments. Across surveys, we’ve found that on average each rating question adds about 10 seconds and each open-ended item adds about 50 seconds. For our templates, we provide an estimated completion time in our template library.
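
If you'd like to sanity-check a draft against the 10-minute target, here is a minimal sketch (in Python, purely for illustration; the function name is hypothetical and the figures are just the rough rules of thumb above, not anything built into the platform) that turns question counts into an estimated completion time:

    # Rough completion-time estimate using the figures above:
    # ~10 seconds per rating question, ~50 seconds per free-text question.
    # Illustrative only; not a Culture Amp feature or API.
    def estimate_survey_minutes(rating_questions, free_text_questions):
        seconds = rating_questions * 10 + free_text_questions * 50
        return seconds / 60

    # Example: 45 rating questions plus 3 free-text questions
    # works out to 45*10 + 3*50 = 600 seconds, i.e. about 10 minutes.
    print(estimate_survey_minutes(45, 3))  # 10.0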


Now that we know how to design a clear survey of reasonable length, let’s talk about the content.

3. Survey outcomes and focus

A survey should be designed with a primary focus or outcome in mind, even if that outcome is hard to pin down in some cases. Ask yourself, “What is the overall end point or outcome I want to tap into and am aiming to improve?” You can then try to identify aspects of this outcome that your respondents will be able to see, feel, or know about.

For example, in an employee engagement survey, our primary outcome is the level of engagement employees feel with the company. To measure this, we ask these five questions. Importantly, these questions often tap into feelings, thoughts, or behaviors that are hard or impossible to take action on directly. For example, if we know that employees are responding unfavorably to the item ‘I am proud to work for [X Company]’, we cannot simply ask employees to be more proud. Similarly, you cannot simply ask employees to be more engaged. Rather, we use other questions and analytic tools to tell us what aspects of the employee experience are most likely to positively impact engagement.

Recommendations:

  • Identify the business outcome or employee behavior you are trying to reach or improve via your survey: Every survey should have a purpose, since you should only survey employees when there is something you want to improve at your organization. If you’re looking for inspiration, take a look at surveys in our Template Library (click “Create survey” on the Surveys tab). Once you have your outcome, try to come up with a range of statements that represent, or would be associated with, that purpose or goal being achieved. You can then assess which of these you think are best and which ones your respondents will likely be able to answer or provide feedback on.

  • Group these questions in a factor and make that your Index Factor: Once you've arrived at your 4-5 outcome questions, group them together into what's known as an index factor; together they will represent the key outcome of your survey (there's a simple illustration after this list). Factors provide greater score variation amongst respondents and tend to have better statistical properties than single questions. For ideas on other index factors beyond engagement, take a look at our Wellbeing or Team Effectiveness surveys!

  • Order your outcome factor to be displayed first in the survey: You want to know how people are feeling about your outcome before they are primed by other questions, so set your outcome factor as your Key Factor as well.
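
To make the idea of an index factor a little more concrete, here is a minimal sketch (in Python, purely for illustration) of one respondent's outcome answers being combined into a single factor score. It assumes each outcome question is answered on a 1-5 agreement scale and simply averages the answers; the platform's actual reporting calculations may differ, and the item wordings below (other than the one quoted earlier) are placeholders:

    # One respondent's answers to the outcome questions, on a 1-5 agreement scale.
    # Item wordings other than the first are placeholders, not template items.
    answers = {
        "I am proud to work for [X Company]": 4,
        "Outcome question 2": 5,
        "Outcome question 3": 3,
        "Outcome question 4": 4,
        "Outcome question 5": 5,
    }

    # A simple average as the index factor score (an assumption for illustration).
    index_score = sum(answers.values()) / len(answers)
    print(index_score)  # 4.2 -- finer-grained than any single 1-5 answer

Because the averaged score can land between whole numbers, it gives one intuition for why a factor spreads respondents out more than any single question can.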


4. Survey coverage

After you've arrived at your outcome questions, it's time to make a list of all the different things you think might possibly impact your outcome. Ask yourself, “What elements of the employee’s experiences or perceptions lead to the outcomes we want to improve?” For example, with employee engagement, we include things like learning & development, leadership, recognition & rewards, managerial practices, career opportunities, teamwork, etc. Once you have a list of topics, you can get a feeling for how many questions you will have room for in each section.

Recommendations:

  • Start with a template, then customize: If you’re starting with a Culture Amp template, some of these topics you’d like to learn more about might already be covered in the template’s questions. Review the included questions and ask yourself:

    • Which questions on the survey aren’t relevant for our company? Feel free to remove them!

    • What elements of our culture are missing from these questions that we want to ask about? Feel free to add questions!

    • Are there any areas where we’d like additional questions to get more detail?

  • Organize the topics from broad to narrow: Once you have questions for all topics in your survey, place them in sections that will make sense to the survey taker. We like to lead with the outcome questions, follow with the broadest topics, and then gradually narrow into more specific questions toward the end. Our Engagement Survey template is a great example of this: it begins with our overall engagement questions, continues on to Company Confidence and Leadership, and then moves into elements of the employee experience more specific to the individual (e.g., role-specific questions).

Remember: surveys are the start of the conversation, not the end

Our overall recommendation: Create a survey that identifies a problem, not a solution. Don't get hung up on writing hundreds of perfect and specifically actionable questions, or on thinking that your results must tell you the precise and perfect thing to do to remedy the situation. Write questions that will cover the major areas of the employee experience, keeping in mind that the results will tell you where to focus, but not exactly how. Your next step will be unpacking that focus to understand specifically what could improve that area. Oftentimes this looks like leading a discussion and ideation session. Sometimes it’s a follow-up survey that dives deeper into the area of focus (e.g., wellness or inclusion).

The reason we survey is ultimately to put Culture First. Culture Amp allows you to collect employee feedback so that you can understand trends and opportunities, and use this data to move toward action. For this reason, surveys are the start of an ongoing conversation you and your employees have about their experience and how to improve it. If you’ve constructed a well-designed survey, you'll find that your survey results will direct you to the most important areas that are related to your outcomes (via driver analysis for example) and guide you toward discussion and action.
