Survey strategy checklist

Plan a successful survey with our survey strategy checklist! Align goals, engage stakeholders, set objectives, and design surveys effectively.

Written by Jessie Walsh
Updated over 2 months ago

Welcome to our Survey Strategy Checklist! This guide is all about helping you nail the strategy and design aspects of planning Attributed (Snapshot) surveys, like Engagement, before you launch your survey.

Just a quick heads-up! Be sure to use this alongside our Technical Survey Review Checklist, which outlines the steps for reviewing your survey's technical setup before launching.


But before you dive into that, it's crucial to give this Survey Strategy Checklist a once-over. Once you've got that sorted, you can tackle the technical setup with confidence.

If you have any questions about any of the guidance covered in this checklist, contact your customer success coach for further help. If you don't have an assigned coach, just reply with "Ask a Person" in a Support Conversation to speak with a Product Support Specialist.

Overview


Survey design is an important part of understanding the employee experience and getting to the heart of what matters most. Ultimately, the reason we invest in survey design is to put Culture First. Culture Amp enables you to collect employee feedback so that you can understand trends and opportunities, use the data to move toward action, and evaluate impact.

However, for employee experience technologies to be adopted and have an influence on your organization, you need a purpose in mind for the data, the ability to convert numbers into an action-inspiring narrative, an action plan, and a way to drive behavior change that lets you show meaningful results from all this effort.

The pre-data stage of survey design is an often-missed opportunity to think through the outcomes you want to achieve, helping you make informed choices when using Culture Amp’s suite of resources.

Here is a checklist of what to consider before you start your design process. This checklist is designed to guide you through the foundational step of defining clear and strategic objectives for your engagement survey. By starting with a deep understanding of your organization's goals, engaging with stakeholders, and setting SMART objectives, you can ensure that your survey is strategically aligned and poised to deliver actionable insights. Remember, the clarity and relevance of your objectives directly influence the effectiveness of your survey and the impact of its outcomes on organizational improvement.

Be sure to take this training and guidance once you’ve completed the activities in this checklist.

Survey design checklist


The Survey Design Checklist covers the action items to complete BEFORE you launch your survey. If you would like to download a copy of this checklist, you can access it here.

1. Understand your strategic goals

  • Review company goals: Start by aligning with your broader organizational objectives. What are the company's current priorities and long-term goals?

  • Identify key areas of interest: From these goals, identify specific areas where employee feedback could inform or improve decision-making.

  • Survey cadence: How often will you survey toward specific goals, and when will you do this?

2. Engage stakeholders

  • Identify stakeholders: Determine who has a stake in the survey's outcomes (e.g., department heads, management, HR).

  • Gather input: Conduct meetings or informal discussions to understand their perspectives and what information they value most.

3. Define the purpose of the survey

  • Clarify the purpose: Based on strategic goals and stakeholder input, clearly define what the survey aims to achieve. Is it to measure employee satisfaction, understand engagement drivers, or identify areas for improvement?

  • Specify survey topics: List the topics (factors) that the survey will cover, ensuring they directly contribute to the defined purpose.

4. Set specific objectives

  • Develop SMART objectives: For each survey topic, set specific, measurable, achievable, relevant, and time-bound (SMART) objectives. What exactly do you want to learn, and how will you measure success?

  • Prioritize objectives: If you have multiple objectives, prioritize them based on strategic importance and feasibility.

5. Validate objectives

  • Review objectives with stakeholders: Share your draft objectives with key stakeholders to get feedback and ensure alignment with broader company goals.

  • Refine based on feedback: Adjust your objectives based on stakeholder feedback to ensure they are clear, achievable, and aligned with company priorities.

6. Document and communicate objectives

  • Document your final objectives: Create a clear document outlining your survey objectives, including the rationale behind each and how they align with strategic goals.

  • Communicate objectives broadly: Share your objectives with all relevant parties involved in the survey process, including survey participants, to ensure transparency and foster engagement.

7. Plan for action

  • Link objectives to potential actions: For each objective, brainstorm potential actions or decisions that could be taken based on the survey outcomes.

  • Establish criteria for success: Define what success looks like for each objective, including how you will measure improvement or change.

Best practice, challenges and implications


For each best practice 🏆 step in the checklist, keep the following challenges 🚧 and implications ⚠️ in mind.

1. Understand your strategic goals

Challenges 🚧: Changing business priorities between planning and execution phases. Translating broad organizational goals into specific survey areas.

Implications ⚠️: Failure to align with strategic goals may result in collecting irrelevant data, wasting resources, and affecting stakeholder buy-in. Disengagement among employees who don't see the survey's value.

2. Engage stakeholders

Challenges 🚧: Integrating all perspectives due to competing interests. Time constraints and availability for discussions.

Implications ⚠️: Insufficient engagement may lead to low investment in survey outcomes and loss of valuable insights. Including stakeholders can improve relevance and buy-in but may complicate the process and extend timelines.

3. Define the purpose of the survey

Challenges 🚧: Balancing broad objectives with specific interests. Ensuring the survey remains manageable and focused.

Implications ⚠️: Poorly defined purpose can result in a survey that is too broad or vague, making it hard to act upon. A well-defined purpose enhances the survey's ability to generate actionable insights but may require excluding some topics.

4. Set specific objectives

Challenges 🚧: Setting objectives that are both ambitious and achievable. Prioritizing objectives may mean some areas are not explored in depth.

Implications ⚠️: Without specific objectives, it may be hard to measure the survey's success or take meaningful action. While setting and prioritizing objectives focuses the survey, it may also narrow the scope of insights gathered.

5. Validate objectives

Challenges 🚧: Incorporating stakeholder feedback may require significant changes. Delays or disagreements on survey focus may occur.

Implications ⚠️: Skipping validation may result in objectives not fully aligned with organizational goals, reducing the survey's impact. Incorporating stakeholder feedback enhances alignment and buy-in but may require compromises on objectives.

6. Document and communicate objectives

Challenges 🚧: Effectively communicating objectives to diverse audiences. Different groups may have varying levels of interest or understanding.

Implications ⚠️: Failure to communicate objectives broadly can lead to misunderstandings and reduce participation or engagement. Clear communication enhances participation and aligns expectations but requires effort and resources.

7. Plan for action

Challenges 🚧: Anticipating actions based on speculative outcomes. Setting realistic criteria for success can be challenging.

Implications ⚠️: Without a clear plan, even well-executed surveys may result in minimal change, diminishing their value. Planning for action encourages accountability and ensures results are used effectively but may require assumptions about survey findings.

Trade-offs - streamlining your actions BEFORE survey design


While it's not ideal to skip any steps when defining survey objectives, smaller businesses may consider streamlining or condensing certain steps to save time. Any abbreviated step should be approached with a clear understanding of the potential impacts and of what is needed to obtain meaningful, actionable insights.

Taking time into consideration, the following steps could be abbreviated or combined rather than skipped entirely:

Engage stakeholders: To gather input from stakeholders quickly, consider conducting a single, well-structured meeting or an online survey instead of multiple meetings. This ensures diverse perspectives are captured while reducing engagement time.

Define the purpose of the survey: Combine the steps of defining the survey's purpose and setting specific objectives into a single phase. This saves time by establishing clear objectives concurrently, based on strategic goals and initial stakeholder input.

Validate objectives: Ensure objectives align with company goals and stakeholder expectations. Limit validation to key stakeholders or conduct an expedited review for quicker feedback gathering, though it may result in less comprehensive feedback.

Document and communicate objectives: Utilize a single communication strategy, employing concise documentation and digital tools for broad distribution and feedback collection. This streamlines informing and engaging all relevant parties involved.

Steps not to skip:

Understand your strategic goals: Skipping or abbreviating this step may result in a survey less aligned with the organization's overall goals.

Plan for action: Omitting this step negates the purpose of the survey, as it informs subsequent actions.

Your action items when designing your survey


As you refine your survey design, here are essential aspects to consider to ensure it aligns with your goals and provides valuable insights. Be sure to take this training and guidance to expand on these points.

1. Outcome index evaluation

  • Engagement outcome index: Decide whether you will use all five aspects of the proposed index, some of them, or none at all. Do you plan to measure outcomes differently? Note any implications for a multi-faceted outcome measure based on the proposed approach. Remember, the platform correlates survey items to this outcome index, so you may need to reiterate its significance to stakeholders for clarity.

Example

You only include the following questions in your index:

  • "%ACCOUNT_NAME% motivates me to go beyond what I would in a similar role elsewhere" - Motivation

  • "I am proud to work for %ACCOUNT_NAME%" - Pride

  • "I see myself still working at %ACCOUNT_NAME% in two years' time" - Future commitment

Implication:

You may choose to prioritize the motivation, pride, and future commitment questions to maintain a comprehensive yet concise Engagement Outcome Index. However, deciding which questions to exclude should align with your specific engagement goals and the strategic importance of each engagement dimension to your organization.

2. Wording amendments

  • Clarity and positivity: Review any wording changes for clarity (avoiding jargon, ensuring readability), positive phrasing (where a higher score indicates positive feedback), and adherence to guidelines (avoiding complex, multi-part questions). Assess if the edits affect the ability to benchmark responses and cross-check for comparability.

Example

You want to reword a question to provide more context.

Original question: "I feel I am part of a team."

Customer edit: "Following the collaborative initiatives introduced, I feel I am part of a team."

Implication

1. Complexity: The question introduces a specific context (collaboration initiatives) before asking about the feeling of being part of a team. This could confuse respondents who might focus more on the initiatives than on the team feeling.

2. Benchmark: If the aim is to track changes over time or compare across departments or organizations, the reference to recent initiatives could limit the ability to benchmark.

3. Assumptions: The question assumes all respondents are aware of and have been affected by the collaboration initiatives. This might not be the case, skewing the responses.

Considerations for revision

  • Separate the questions: Consider asking about the feeling of being part of a team separately from the effectiveness or impact of the collaboration initiatives. This helps in benchmarking the general team sentiment over time and across different contexts.

  • Measure initiative impact directly: If capturing the impact of recent collaboration initiatives is crucial, frame a direct question about it, such as "How effective have the recent collaboration initiatives been in enhancing teamwork?"

  • Clarity and directness: Ensure the question is straightforward and easy for all respondents to understand, regardless of their direct involvement or awareness of the specific initiatives.

3. Bespoke question items

  • Custom questions: Apply similar criteria to entirely new questions, considering clarity, positive framing, and guidelines adherence. It's also beneficial to check if an existing, comparable item might serve for benchmarking purposes.

Example

You want to create bespoke items, but they can't be benchmarked.

Original question: "I feel I am part of a team."

Customer edit: "Following the collaborative initiatives introduced, I feel I am part of a team."

Likely solution

Consider maintaining the original question for its broad applicability and benchmarking potential to capture the essence of what you are interested in. Add a separate question to gauge the impact of recent initiatives if necessary, but design it so that it can be compared over time or omitted in cycles where it is not relevant without losing the thread of overall team cohesion measurement.

Suggested revision

  • Keep the original question: "I feel I am part of a team."

  • And add a general but targeted question regarding initiatives: "Recent initiatives (e.g., virtual team-building) have helped me feel more connected with my team members."

4. Factors and benchmarking

  • Factor adjustments: Understand that altering factors, such as removing questions or adding custom items, can impact benchmarking. Removing questions but keeping the factor generally allows for external factor-level benchmarking, whereas adding bespoke items might eliminate this possibility. When comparing to your historical data (internal comparisons), you'll receive a factor-level comparison if the number of questions remains the same, all questions are matched and the factor name stays consistent. For a deeper understanding of how adjusting factors and questions can impact your ability to compare and benchmark results, check out our support guide on how scores are calculated for reports. You can also learn more about our Factors, in our Culture Amp Training Courses.

5. Survey structure

  • Order and sequencing: Ensure questions progress logically from general to specific (macro to micro) and that similar questions (factors) are grouped appropriately. If including non-rating or free-text responses, position these at the end to maintain flow and encourage completion.

  • Question balance: Consider the number of questions to avoid surveys being too long or too short, which could affect participation rates and the depth of insights gathered.

Example

Rearranged questions without clear factor grouping (factor shown in parentheses):

Q1: What are some things we are doing great here? (Open-ended; no factor)

Q2: I am happy with my current role relative to what was described to me. (Alignment & Involvement)

Q3: I feel I am part of a team. (Teamwork & Ownership)

Q4: We have enough autonomy to perform our jobs effectively. (Alignment & Involvement)

Q5: I have access to the things I need to do my job well. (Enablement)

Implication

Survey fatigue may occur earlier in the process if respondents spend more time and effort on open-ended questions at the beginning of the survey, and subsequent questions can be affected as a result. When factors and question styles are mixed up, there is no logical progression, which can confuse respondents and lead to less thoughtful responses.

6. Relevance of questions

  • Insightful queries: Ensure the questions posed are relevant and capable of providing the insights you seek regarding your specific outcomes.

7. Self-select demographics

  • Demographic questions: Ensure that demographic questions are positioned at the beginning of the survey and adhere to recommended practices, facilitating participant understanding and response accuracy.

Need further help?


  • Discover answers to common questions about designing your survey in our Survey Design FAQs.

  • Explore our Technical Survey Review Checklist for detailed guidance on reviewing the technical elements of your survey prior to survey launch.

  • Alternatively, contact your customer success coach for further help in reviewing your survey. If you don't have a coach, just reply with "Ask a Person" in a Support Conversation to speak with a Product Support Specialist.


