Guide to understanding survey results within reports

Written by Jared Ellis

What can I learn from this page?

Guide to understanding survey results within reports

Who is this guide for?

Account Admins, Survey Admins, Survey Creators, Report Viewers

This article covers how to understand survey results and the different reports available.

Discover your participation

The first step in understanding survey results is exploring participation: how does participation differ across particular demographics? You may be tempted to fixate on the overall number, but remember that what’s most important is getting a representative sample. An ideal participation rate is around 75%-80%. Learn more about defining a good employee response rate from our Chief Scientist.

View your overall participation rate on the left-hand side of the report, or click the Participation tab at the top to dive deeper.

From here, you can see response rates for particular demographics. You may have many demographics here, so focus on the main demographics that will interest your leaders and that you will find useful for reporting, such as location or department. Check whether any parts of your organization that are substantial or important relative to your company size have a response rate below 60%. This will be something to keep in mind when reviewing the quantitative (numerical) results.

💡 Tip: Keep the sample size in mind. If participation is 50%, but there are only 4 individuals in the team, that means only 2 people chose not to respond.
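
To make the small-sample caveat concrete, here is a minimal Python sketch with made-up group names and counts, showing how the same participation rate can mean very different things:

```python
# A minimal sketch (with made-up group names and counts) of why the
# same participation rate can mean very different things.
groups = {
    "Engineering": {"invited": 120, "responded": 96},
    "Design": {"invited": 4, "responded": 2},
}

for name, g in groups.items():
    rate = g["responded"] / g["invited"]
    missing = g["invited"] - g["responded"]
    print(f"{name}: {rate:.0%} participation ({missing} non-respondents)")

# Design shows 50% participation, but that is only 2 people, far less
# alarming than 50% would be on a 120-person team.
```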


Set up and understand your comparisons

The next step is setting up your comparisons to give you a frame of reference for your results. Benchmarks can be external or internal. You can load your benchmarks directly in the Culture Amp platform. Once loaded, you will be able to flip between comparisons in the top right.

When viewing the Summary tab of the report, the historical survey comparison defaults to the first comparison loaded in, and the external benchmark comparison defaults to the last one loaded in.

External benchmarks

External benchmarks are generated by us at Culture Amp using data from our customers. Use external benchmarks to gain context for how your score compares to other organizations. Learn more about how Culture Amp’s benchmarks are generated or check out your options and the insights we’ve collected at Culture First.

Internal benchmarks

If you’ve surveyed in the past, use internal benchmarks to compare current results to previous results and see where you’ve improved or taken a dip overall.


Review your outcome score

Now that you have an idea of how representative your results are and a benchmark for additional context, you’re ready to explore the data, starting with the outcome score. The outcome score is often Engagement, but may also be Manager/Team Effectiveness, Wellbeing, or something else, depending on the type of survey. Your outcome factor is the first one listed on your report. Click into the factor to see the questions included.

[Screenshot: Engagement factor score showing favorable, neutral, and unfavorable percentages]

The factor score displays three numbers: favorable, neutral, and unfavorable.

  • Favorable: the percentage of participants who selected ‘agree’ or ‘strongly agree,’ displayed in green

  • Neutral: the percentage who selected ‘neither agree nor disagree,’ displayed in gray

  • Unfavorable: the percentage who selected ‘disagree’ or ‘strongly disagree,’ displayed in pink

In the example shown, the Engagement favorability is 69%. The comparison, displayed on the right, compares the favorability score to the loaded benchmarks. In this example, the company is at 69% Engagement favorability, which is 12 points below the benchmark of 81% Engagement favorability.
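
If it helps to see the arithmetic, here is a minimal sketch assuming a standard 5-point agreement scale. The response counts are hypothetical, chosen to reproduce the 69% example above:

```python
# A minimal sketch of the favorability arithmetic, assuming a standard
# 5-point agreement scale. The response counts are hypothetical, chosen
# to reproduce the 69% example above.
responses = {
    "strongly agree": 30, "agree": 39,
    "neither agree nor disagree": 17,
    "disagree": 10, "strongly disagree": 4,
}
total = sum(responses.values())

favorable = (responses["strongly agree"] + responses["agree"]) / total
neutral = responses["neither agree nor disagree"] / total
unfavorable = (responses["disagree"] + responses["strongly disagree"]) / total

benchmark = 0.81  # hypothetical external benchmark
print(f"Favorable: {favorable:.0%}")      # 69%
print(f"Neutral: {neutral:.0%}")          # 17%
print(f"Unfavorable: {unfavorable:.0%}")  # 14%
print(f"Comparison: {(favorable - benchmark) * 100:+.0f} points vs benchmark")  # -12
```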

Learn more about how scores are calculated.


Dive into the Questions report

While it’s important to know how your outcome is scoring overall, the nature of the outcome questions means they are difficult to act on directly. Think of outcome questions as indicators of a great workplace culture, such as the likelihood of recommending your company or the desire to stay at your company. Indicators are difficult to act on directly, but the other questions in your report serve as the levers you can pull to achieve your outcome.

Go to the Questions tab on the top of your report to see all of the questions asked in your survey.

Sort questions to review

Sort questions by clicking on the column headers. We recommend:

  • Sort by the Favorable column to see your highest-scoring items; these are things to celebrate.

  • Click the Favorable column again to see your lowest-scoring items; these are where you may have opportunities for improvement.

  • Click the Comparison column to see how you’re faring against your benchmark, which offers important context.

[Animation: sorting the Questions report by the Favorable and Comparison columns]

Consider the context when viewing comparisons. In this example, “I know how my work contributes to the goals of Hooli” is the fourth-highest scoring question, but it is below the benchmark.


Identify your focus with Impact Analysis

Impact Analysis

It can be tempting to focus on your lowest-scoring questions, whether relative to the benchmark or in absolute favorability, but it’s important to take impact into account when deciding where to focus. Our Impact Analysis takes the non-outcome questions in your survey and uses a correlation analysis to identify which of those questions are most strongly and consistently related to the outcome.

[Screenshot: Impact Analysis]

The Impact Analysis is visualized as a set of overlapping circles: the larger the overlap, the stronger the relationship. Improving on these questions is most likely to impact your outcome levels.

Keep in mind that not all questions have the same room for improvement. For example, even though “Pied Piper is a great company for me to make a contribution to my development” is high impact, meaning strongly related to the outcome (Engagement), it is only slightly below the benchmark. The question “I believe there are good career opportunities for me at Pied Piper” would be a better one to focus on and take action on, because it is much further below the benchmark and almost 20 points lower in favorability.
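
For the curious, here is a simplified illustration of what a correlation-based analysis looks like, using the Python standard library and invented per-respondent scores. This is only a sketch of the general technique; Culture Amp’s actual methodology also weighs how consistently a question relates to the outcome:

```python
# A simplified illustration of correlation-based impact, using invented
# per-respondent scores (1-5). Requires Python 3.10+ for
# statistics.correlation (Pearson's r).
from statistics import correlation

engagement    = [5, 4, 2, 5, 3, 4, 1, 5, 4, 2]  # outcome scores
career_growth = [5, 4, 1, 5, 3, 4, 2, 5, 4, 1]  # tracks the outcome
office_snacks = [4, 4, 4, 2, 4, 4, 2, 2, 2, 2]  # unrelated to the outcome

for name, scores in [("career_growth", career_growth),
                     ("office_snacks", office_snacks)]:
    r = correlation(engagement, scores)
    print(f"{name}: r = {r:.2f}")

# career_growth: r = 0.93  (high impact, worth acting on)
# office_snacks: r = 0.07  (low impact, even if its score is low)
```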

Focus Agent

We created the Focus Agent to help you decide where to focus your efforts. It takes three things into account:

  • The impact score

  • The opportunity for improvement (absolute favorability score)

  • The comparison to your loaded benchmark(s)

Using this information, our Focus Agent recommends the top 3 questions that are both strongly related to your outcome and have room for improvement. These recommendations are signified by a green bar in the Focus column.
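
As a rough illustration of how these three inputs might combine, here is a hypothetical ranking sketch. The scoring formula and weights are invented for illustration; they are not the Focus Agent’s actual logic:

```python
# A rough, hypothetical sketch of ranking questions by combining impact,
# room for improvement, and benchmark gap. The formula and weights are
# invented; they are not the Focus Agent's actual logic.
questions = [
    # (question, impact 0-1, favorability 0-1, benchmark 0-1)
    ("Good career opportunities", 0.80, 0.55, 0.75),
    ("Contribution to my development", 0.85, 0.78, 0.80),
    ("Office environment", 0.30, 0.60, 0.65),
]

def focus_score(impact, favorable, benchmark):
    # High impact plus low favorability plus a gap to the benchmark
    # pushes a question up the list.
    room_to_improve = 1 - favorable
    benchmark_gap = max(benchmark - favorable, 0)
    return impact * (room_to_improve + benchmark_gap)

for text, impact, fav, bench in sorted(
        questions, key=lambda q: focus_score(*q[1:]), reverse=True)[:3]:
    print(f"{text}: {focus_score(impact, fav, bench):.2f}")
```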

💡 Tip: Focusing on one thing allows you to put all of your effort into one area. Learn more about why we recommend focusing on fewer things.


Find differences across demographics

So far we’ve seen how things are going at a high level, but it’s also important to see where demographic groups may be having a different experience. You can do this in three ways: spreadcharts, heatmaps, and filters.

Spreadchart

Click into a Factor or Question to see the spread of scores for groups within a particular demographic. The spreadchart compares the favorability of the lowest-scoring group to the favorability of the highest-scoring group within a demographic. The larger the spread, the more the groups’ experiences differ.

In the example shown, Design is 43% favorable on the question and is a clear outlier compared to other departments.

[Screenshot: spread chart showing department scores for a question]

The size of the circle corresponds to the size of the group, so you can keep the sample size in mind. Remember that if a demographic group has only 4 people and 1 person responds ‘neither agree nor disagree’ (while the other 3 respond favorably), the group’s favorability score will be 75%.
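
The spread itself is simple arithmetic; here is a minimal sketch using made-up department scores that mirror the example above:

```python
# A minimal sketch of the spread calculation, using made-up department
# favorability scores for a single question.
dept_favorability = {
    "Design": 0.43, "Sales": 0.78, "Engineering": 0.81, "Finance": 0.74,
}

low = min(dept_favorability, key=dept_favorability.get)
high = max(dept_favorability, key=dept_favorability.get)
spread = dept_favorability[high] - dept_favorability[low]

print(f"Spread: {spread:.0%}, from {low} ({dept_favorability[low]:.0%}) "
      f"to {high} ({dept_favorability[high]:.0%})")
# Spread: 38%, from Design (43%) to Engineering (81%)
```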

Heatmap

To see how demographics compare across questions, use the Heatmap report. A Heatmap is a color-coded table that draws your attention to the highest and lowest scores and patterns in responses.

To use the Heatmap, select a demographic. The heatmap initially shows factor results, presenting each demographic group as a delta (difference) from the aggregated scores (the company overall). You can switch to display absolute percentages instead; both display options are based on favorability. You can also click into a Factor to see the question-level results.

Report viewers can view Heatmaps in either a red/green or blue/yellow color scheme, and can easily toggle between the two depending on their needs or preferences.

Text weight also distinguishes scores: larger differences are shown in heavier text, making them easy to spot.

[Screenshot: Heatmap report]

Depending on the comparisons loaded into your survey, you can also replace the column containing the aggregated results and instead compare demographic results to external benchmarks. To adjust the first column within the heatmap, select an option from the Compare To list in the top right.

[Animation: changing the heatmap’s Compare To option]

Demographic groups with the biggest population ('n' size) are presented first because these are the groups that will have the greatest influence on the overall score. The Heatmap is designed to highlight the biggest differences in results for particular demographics when compared with the aggregated results. The most positive differences are highlighted green and the most negative differences are highlighted red.

As the delta between a group and the aggregated results gets larger, the shading gets darker. This visual cue highlights large differences between groups so that you don't have to read through every single cell. If you want to combine demographics, you can use the custom heatmap.
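
The delta view boils down to subtracting the company-wide favorability from each group’s favorability; here is a small sketch with hypothetical scores:

```python
# A small sketch of the heatmap's delta view: each group's favorability
# minus the company-wide favorability. All scores are hypothetical.
overall = 0.69
group_favorability = {"Operations": 0.82, "Sales": 0.69, "Finance": 0.61}

for group, fav in sorted(group_favorability.items(),
                         key=lambda kv: kv[1] - overall, reverse=True):
    print(f"{group}: {fav:.0%} ({(fav - overall) * 100:+.0f} vs overall)")

# Operations: 82% (+13 vs overall)
# Sales: 69% (+0 vs overall)
# Finance: 61% (-8 vs overall)
```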

If 0% is displayed on the heatmap, it means that no one within that demographic group answered ‘agree’ or ‘strongly agree.’

💡 Tip: The heatmap is a great place to identify groups with complementary opportunities. For example, you could connect the Head of Operations with the Head of Finance to explain what they’ve done to encourage innovation.

Filters

Filters are another way to identify different experiences across demographics. Use filters on the Insight, Participation, Questions, and Heatmap report pages, by selecting the filter on the left side. When using filters, keep in mind:

  • The Comments report can only have one filter for confidentiality reasons.

  • Each different demographic uses an AND operation, so adding more filters reduces your result set (e.g. selecting Dept: Sales and Gender: Male displays results for men in Sales). When you add multiple values for the same demographic (e.g. Country: India, France, Italy), an OR operation is used. See the sketch after this list.

  • You will not be able to apply a filter that includes fewer responses than your reporting group minimum. For example, if there are only two responses from the Palo Alto office and your minimum group size is 5, you will not be able to see filtered results for Palo Alto. The filter will be highlighted in yellow, and a message will appear noting that there are not enough responses. Even if the combined results of several filters would meet the reporting group minimum, you still cannot add any individual filter that is smaller than the minimum. This prevents scenarios in which individual results could be identified by adding and removing small-group filters and watching how the data changes.
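
To make the filter logic concrete, here is a toy model in Python. The respondent records, demographic names, and the minimum group size of 5 are all illustrative:

```python
# A toy model of the filter logic, assuming simple respondent records.
# The demographics and the minimum group size of 5 are illustrative.
respondents = [
    {"dept": "Sales", "country": "India"},
    {"dept": "Sales", "country": "France"},
    {"dept": "Sales", "country": "Italy"},
    {"dept": "Engineering", "country": "India"},
    {"dept": "Sales", "country": "India"},
    {"dept": "Sales", "country": "France"},
]

# OR within a demographic (any listed value matches), AND across
# demographics (every demographic must match).
filters = {"dept": {"Sales"}, "country": {"India", "France", "Italy"}}
matched = [r for r in respondents
           if all(r[demo] in values for demo, values in filters.items())]

MIN_GROUP_SIZE = 5
if len(matched) < MIN_GROUP_SIZE:
    print("Not enough responses to display this filter")
else:
    print(f"{len(matched)} responses match the filters")
```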

[Screenshot: report filters]


Analyze the comments for additional context

Any free text question responses will display under the Comments tab of your report.

Comments can provide interesting context and further insight into specific issues that may be raised by your survey. However, comments should be viewed as secondary to the quantitative results for a number of reasons:

  • Comments are often from a smaller number of respondents, so they may not be representative of everyone. By nature, some individuals are more likely to write comments than others, which can result in the same person writing most of the comments across multiple questions.

  • According to our research, individuals with unfavorable scores are more likely to leave a comment (seven times more likely, to be exact). They also leave longer comments than their favorable counterparts.

To help you understand whether this is true in your organization, we include commenter stats at the question level and for comments overall, showing the percentage of participants who commented and how their responses differ from the company overall.

[Screenshot: commenter stats]

In the example shown, 8% of participants commented and they were 56% favorable on the question, which is 10 points below the company overall.
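
The commenter stats are straightforward arithmetic; here is a back-of-the-envelope sketch using invented totals that reproduce the example above:

```python
# A back-of-the-envelope sketch of the commenter stats, with invented
# totals chosen to reproduce the example above.
participants = 500
commenters = 40
commenter_favorability = 0.56
overall_favorability = 0.66

print(f"{commenters / participants:.0%} of participants commented")
print(f"Commenters were {commenter_favorability:.0%} favorable, "
      f"{(commenter_favorability - overall_favorability) * 100:+.0f} points "
      f"vs the company overall")
```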

The best practice with comments is to leave them until you’ve absorbed the rest of your results, then use comments to consider the following:

  • Look for ideas for action. Once you’ve picked an area of focus, comments can be a great place to start to find ideas for action.

  • Tell the story. As you’re preparing to share results with your leadership team or your broader organization, you might incorporate a comment that illustrates the heart of the challenge your organization is facing. This helps people connect emotionally with the data. Depending on your organization’s culture, it might make sense to paraphrase the comment rather than quote it directly.

Save comments

When you find a comment with a good idea or one that highlights an important point, you can save it by clicking Save on the right side. Each user has their own list of saved comments which they can refer back to from the Saved Comments tab.

Sort comments

We apply machine learning to automatically categorize comments for you, making it easier to explore them with the filters.

[Screenshot: comment filters]

  • Any question: The question the comment was left in response to

  • Any topic: A set list of pre-defined topics (as determined by the content of the comment)

  • Any sentiment: The sentiment (as determined by the content of the comment)

  • Any rating: The rating the participant responded with

  • Any replies: Comments that have been replied to if you had Comment Replies turned on prior to launching

  • Any language: The language the comment was originally written in

Our algorithms are great for spotting overall trends, such as what most people are talking about and how they feel about those topics. While the categories are highly accurate, individual comments may occasionally be misclassified, particularly if a comment includes sarcasm or humor (that’s tough for a computer!). You can edit the topic and sentiment as you see fit.

Learn more about text analytics and how to edit a sentiment or topic tag.


Share and export your results

The final step in understanding results is to share them with others. We’ve made it easy for you to share results within Culture Amp, and we’ve made the summary report really simple and intuitive for leaders and managers alike.

We also recognize you may want to do some more advanced analytics outside of the platform. If you enabled Raw Data Extract before the survey was launched, you can export all of the results from the top right of the Results page.

Alternatively, as you look through the results, you can export views to Print, PDF, Excel, CSV, and PowerPoint. In particular, exporting to Excel from the Insight report or the Questions report can be useful: the Comparison sheet lets you compare your scores to all of the loaded benchmarks side by side, and the Detail sheet provides the breakdown of responses (favorable, neutral, and unfavorable) as well as the correlation coefficient for each driver.

Additionally, you can export to Excel from the Heatmap report. From here you can remove or re-order columns as you like. Either delta or percentage will be reflected in the Excel file, depending on the display view you’ve selected in the Heatmap.
