The Science Behind Our Driver Questions

Written by Jared Ellis
Updated over a week ago

What can I learn from this page?

The science behind our driver/impact questions

Who is this guide for?

Account Admins, Survey Admins, Survey Creators, Report Viewers

Separate from our Employee Engagement Index (EE) questions, we use around 45 questions that we sometimes call our driver or impact questions. The engagement questions themselves are best thought of as outcome questions: they represent the high-level feelings, attitudes, and behaviors we would ultimately like our people to have or experience.

However, because those outcomes exist inside people's hearts and minds, they tell us little about the specifics of our workplace or culture, and are thus very hard to act upon directly. We therefore also aim to ask about a good range of more tangible aspects of our workplaces and cultures, to help us understand where we might focus more directly to improve.

Once again, we began by consulting the academic literature and good practice in I/O Psychology to get a firm grasp of the most important factors to measure. The academic literature is particularly helpful here: it is not uncommon for academic studies to use 100+ questions, examining a very large range of potential factors, which gave us a clear guide as to where to focus our attention.

We started with a very broad range of questions representing the many factors identified, then worked collaboratively with our clients to refine them down to the most critical factors that consistently predicted engagement as well as other outcomes, such as actual retention. We also include factors that rarely show up as drivers of EE but can be detrimental when done poorly (work-life blend factors, for example).

Each year we also pull all of our questions together across hundreds of companies for our benchmark reports. This allows us to validate the relationships between our driver questions, EE questions, and other survey questions on a very large scale. Beyond the annual process, we also run short-term research studies, such as connecting survey data to external metrics (like Glassdoor ratings or stock price) or business outcomes (like regrettable attrition and CSAT).

In addition to this research, we also encourage our clients to balance validated questions with new and unique ones. This is often a source of new questions or entire new survey templates, and we can then validate any promising approaches. At Culture Amp we believe science and research can go hand in hand with innovation.
