We submitted a draft employee survey questionnaire for client approval recently.
We hadn't worked with them before, and this was the first time they had used a specialist survey consultancy; they'd run surveys before, but in-house. This time, they said, they wanted to take a fresh approach and get more value from the output.
We'd had only one meeting with them to discuss the topics they wanted to include in the survey and agreed a list based on their previous questionnaire, which had been circulated to all employees in this public sector organisation.
A few days later they came back to us requesting a few, very minor, changes to our draft. Initially we were pleased that we seemed to have hit the spot and understood what they wanted. And then we noticed that they had asked us to remove three items: those relating to location, department and job role.
We were, not to put too fine a point on it, stunned. Why would they NOT want to understand the views of people at different levels, in different functions and in different parts of the organisation?
This highlighted some basic questions which we often ignore: what's the point of an employee survey? Why do we run them and what do we want to get out of them? More often than not we assume that when a survey is commissioned, the client has carefully thought through why they want to commit considerable time and effort - not to mention money - to such an exercise. As consultants we know that surveys can be enormously powerful tools which can play a central part in facilitating change. We work with many gifted and talented managers who are intimately involved with transforming organisational culture and performance. And we simply take it for granted - as do they - that we will produce organisational data analyses which will be as precise and informative as those produced by forensic accountants. We just assume that every client will want us to work with them in this way.
What we can forget is the importance of checking all these assumptions, and understanding not only what clients want to derive from a survey, but also their concerns about the survey methodology. In the case above we had assumed that there was a consensus about both objectives and process. But in fact our client simply hadn't appreciated what can be done with data analysis (they'd done it all in-house on previous occasions and had treated the organisation as a single group). They'd also been concerned that employees would be deterred from responding if they were asked 'too many' questions which might make it possible to identify them. All perfectly reasonable and understandable. Better informed about the options and the benefits (we should have made this our business earlier on) and reassured about data protection, the client was happy for us to seek more detailed demographic data and thus to be assisted in more fundamental ways than they had thought possible.
In practice, we always go to great lengths not only to produce survey reports which are of optimal value to our clients - whatever their objectives - but also to take absolute care of personal data. But this example showed that we should be more careful to check that we're at the same starting point as the client. Obvious with hindsight, but something we may sometimes fail to check. Fewer assumptions, more progress: that seems to be the key lesson for us.