Comparisons allow you to understand how your organization has changed over time, and how you compare to other similar businesses. At Culture Amp, we have a number of external comparisons (benchmarks) available for customers and the ability to load internal comparisons.
Account admins can add comparisons to surveys via the comparison survey configuration section.
It's important to be mindful of how you use comparisons and where they have the most utility. Keep in mind that comparisons and benchmarks are not targets, they are context, although it's common to treat a benchmark as a target. If you are scoring 66% on a particular item and you see a relevant comparison at 70%, it's natural to feel you're lagging in some way. At a high level, though, a four-point gap isn't that great a difference. You're in the ballpark already.
Tip: Learn how to design your Survey with Benchmark considerations
External Benchmarks
External benchmarks are comparative figures based on other companies. Ideally, these companies are very similar to your own: similar size, industry, and basic composition, and usually competing for the same or similar talent. There are a few key things to consider when using these benchmarks:
How similar are the surveys? The questions need to be consistent in order to be compared. Even small changes in questions could potentially mean a different interpretation.
How current are the benchmarks? Some data sets can be quite old. If you're using a benchmark, ask how current the data in the benchmark is.
What is the composition of the benchmark? It's very common to not identify the exact companies in a benchmark. However, it's good to know the types of companies, their geographies and typical sizes.
Ultimately, is this benchmark relevant to your organization and your culture? A company with a very strong set of values may get less utility from a benchmark made up of companies that do not necessarily reflect those values.
You'll mostly see benchmarks at the question level. A factor-level benchmark only appears when every question in a given survey factor is an exact match, meaning no customized items have been added to that factor.
Note: You may remove items and still have a factor comparison to the industry benchmark.
Benchmarks are best interpreted at the question level anyway. Don't get too focused on the specifics, either: benchmarks move around a little from year to year, so think of them as providing context rather than precision.
Internal Comparisons
What we see most often with Culture Amp is that the variation within your organization is far more significant than any comparison to the benchmark.
Let's say you had an overall score of 63% on Engagement. You may then find that one location, department or other demographic is scoring some distance away from that overall 63%. For example, the Sales team may be at 40% and the Engineering team at 78%. That may seem contrived, but we see spreads like this in almost every survey run at Culture Amp. Finding differences of this size between demographics is common (if not expected).
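As a rough illustration of how much the internal spread can tell you beyond a single overall number, here is a minimal sketch. The team names and scores are hypothetical, not real Culture Amp data:

```python
# Hypothetical engagement scores by demographic (percent favourable).
# Illustrative numbers only, not real Culture Amp output.
scores_by_team = {
    "Sales": 40,
    "Engineering": 78,
    "Marketing": 61,
    "Operations": 66,
}
overall = 63  # hypothetical overall Engagement score

spread = max(scores_by_team.values()) - min(scores_by_team.values())
print(f"Overall Engagement: {overall}%")
for team, score in sorted(scores_by_team.items(), key=lambda kv: kv[1]):
    print(f"  {team}: {score}% ({score - overall:+d} pts vs overall)")
print(f"Spread between highest and lowest team: {spread} percentage points")
```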
Internal comparisons do not suffer from some of the limitations of the external benchmark. They may be very different teams, but Sales and Engineering are operating in the same context. There will naturally be cultural differences, but they are similar in numerous other ways. Plus, they have been asked the same questions at the same point in time. So the comparison is direct. In the example above, there may be some aspect of your Engineering culture that you can bottle and bring to your Sales team - a great opportunity.
When comparing to your historical data, you will receive a factor-level comparison if both of the following hold (see the sketch below):
The number of questions stays the same
All questions are word for word the same or have been manually matched using the comparison matching tool
Further information can be found in our guide here.
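As a minimal sketch of those two rules (the data structures, function name, and manual-match pairs below are assumptions for illustration, not part of the Culture Amp product or API):

```python
def factor_comparison_available(current_questions, previous_questions, manual_matches=None):
    """Return True if a factor-level historical comparison would be available.

    current_questions / previous_questions: ordered lists of question text for
    one factor in the current and historical survey. manual_matches: optional
    set of (current, previous) question pairs matched with the comparison
    matching tool. All names here are illustrative only.
    """
    manual_matches = manual_matches or set()

    # Rule 1: the number of questions in the factor stays the same.
    if len(current_questions) != len(previous_questions):
        return False

    # Rule 2: each question is word for word the same, or manually matched.
    for current, previous in zip(current_questions, previous_questions):
        if current != previous and (current, previous) not in manual_matches:
            return False
    return True
```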
Trend Comparisons
Even more important is the trend. If the Sales team scored 40% six months ago and 55% now, that is a significant change. Something is going right there, and it's worth investigating. Many Culture Amp customers survey every three months. That may seem like a lot, but once you're in the cadence it becomes much easier, and being able to see the change quarter to quarter is a powerful tool.
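A small sketch of that kind of quarter-to-quarter reading, carrying on the Sales example (the scores below are hypothetical):

```python
# Hypothetical quarterly engagement scores for one team (percent favourable).
sales_trend = {"Q1": 40, "Q2": 46, "Q3": 51, "Q4": 55}

quarters = list(sales_trend)
for previous, current in zip(quarters, quarters[1:]):
    delta = sales_trend[current] - sales_trend[previous]
    print(f"{previous} -> {current}: {sales_trend[current]}% ({delta:+d} pts)")

total_change = sales_trend[quarters[-1]] - sales_trend[quarters[0]]
print(f"Change over the year: {total_change:+d} percentage points")
```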
Bringing them together
We would never advocate discarding external benchmarks, as they serve an important role. They let you know, roughly, what's reasonable given the type of company you are. There are some questions that nobody ever scores well on: 40% may not be as terrible a score as you think (and perhaps 88% not as great as you think). This is really valuable context. However, if you want to dig into the nuance of a particular topic, you'll often find it by turning to internal comparisons and the trend.
Automated Comparisons
You can add comparisons manually or take advantage of automated comparisons. Automated comparisons are automatically generated for your survey report on the Comparisons page. The list includes any closed surveys of the same type that have at least one matching question. Select any of the automated comparisons to add to your report. As always, you have the option to add additional manual comparisons and benchmarks.
Further information can be found in our guide here.
Hierarchical Comparisons
These are internal, same-survey comparisons used to set up a demographic-level comparison: data filtered to one demographic value is compared against data filtered to another demographic value.
Further information can be found in our guide here.
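Conceptually, a hierarchical comparison works like the minimal sketch below. The response structure, field names, and numbers are assumptions for illustration, not the Culture Amp data model:

```python
# Hypothetical survey responses, each tagged with a demographic value.
responses = [
    {"department": "Sales", "favourable": True},
    {"department": "Sales", "favourable": False},
    {"department": "Sales", "favourable": False},
    {"department": "Engineering", "favourable": True},
    {"department": "Engineering", "favourable": True},
    {"department": "Engineering", "favourable": False},
]

def favourable_pct(responses, demographic, value):
    """Percent favourable for responses filtered to one demographic value."""
    subset = [r for r in responses if r[demographic] == value]
    return 100 * sum(r["favourable"] for r in subset) / len(subset)

# The comparison contrasts one demographic value against another within the same survey.
sales = favourable_pct(responses, "department", "Sales")
engineering = favourable_pct(responses, "department", "Engineering")
print(f"Sales: {sales:.0f}% favourable vs Engineering: {engineering:.0f}% favourable")
```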
💬 Need help? Just reply with "Ask a Person" in a Support Conversation to speak with a Product Support Specialist.