Feedback Fatigue: How often should you survey your staff?
I recently chipped my tooth (a story for another day)! A one-minute phone call to my health insurance provider confirmed that my stupidity was covered under their policy… Cue recorded message… “Based on your interaction with our customer service professional, how likely is it that you would recommend us to your family and friends? In a few words, explain why you gave this rating.”
These days it’s commonplace to receive a quick customer satisfaction survey whenever you engage in a transaction, whether it’s the phone survey after a call with your telco provider or an online survey asking if you would recommend your car insurance provider to your family and friends. One main benefit of these customer experience (CX) surveys is the ability to link customer feedback to specific transactions, individuals, days of the week and so on, gaining valuable performance data that helps improve service. This is a great example of data being used efficiently to improve the world we live in.
Following on from the success of CX research, organisations are now exploring the viability of implementing shorter surveys with greater regularity to help improve the employee experience (EX). It’s argued that EX surveys have advantages over the annual engagement survey, such as higher response rates due to quicker completion times, ease of completion on a mobile device, the ability to report and act upon issues as they arise, frequent tracking of company performance against KPIs, and staff feeling heard thanks to an ongoing dialogue between staff and management.
A client of ours recently experimented with monthly staff pulse surveys, using 10 opinion questions and one optional open-ended question. After the first survey, management responded quickly to implement one action to address the lowest scoring question “in this organisation it is clear who has responsibility for what”. The action involved adding staff photos to everyone’s intranet bio. They worked tirelessly over a couple of weeks to implement this action prior to the launch of the next month’s survey.
However, despite the swift implementation from management, staff still didn’t feel appropriately heard. It was their responses to the open-ended question that ultimately stopped the pulse program in its tracks after only two rounds.
A pulse survey is not a dialogue
Whilst management was able to act upon the role clarity issue, the action was perceived by staff as being implemented too quickly, without consideration of what was really driving their dissatisfaction. This left staff feeling misunderstood, which was ultimately the opposite of what management was trying to achieve. Management later acknowledged that a monthly frequency was too ambitious and that they should have taken the time to better understand the issue and its underlying causes before jumping into action.
A “management tool” does not engage staff
An attractive element of frequent surveys, whether monthly or the more common quarterly frequency, is the ability for executives to receive periodic data about the health of the organisation. This data is often packaged in a dashboard format, making it simple for executives to track performance over time. However, a concern with this approach, especially when paired with a lack of action, is that it prioritises the needs of management over those of staff. This can impact future survey participation, as staff will not want to participate if “management aren’t going to do anything anyway”. For a survey to be successful it needs to meet the needs of both groups.
Quicker surveys do not automatically improve participation
The client waited another year before delivering a full engagement survey (105 questions). Interestingly, the response rate to the 105-question annual survey was exactly the same as the average response rate across the two monthly pulse surveys: 58.4% in both cases. This suggests that a shorter survey may not necessarily lead to a higher staff participation rate. In fact, our data shows that other factors are more important in driving high participation, including clear articulation of the survey’s purpose, trust in the confidentiality of responses, management sponsorship of the survey, and confidence that staff feedback will lead to appropriate action.
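For readers who track these figures themselves, the comparison above is simple to reproduce. The sketch below is purely illustrative: the headcount and respondent numbers are hypothetical assumptions chosen so that the rates match the 58.4% reported here, since the article does not give the underlying counts.

```python
# Illustrative sketch: comparing response rates across survey formats.
# Headcounts and respondent counts are hypothetical; only the 58.4%
# figure comes from the article itself.

def response_rate(respondents: int, invited: int) -> float:
    """Percentage of invited staff who completed the survey, to 1 d.p."""
    return round(100 * respondents / invited, 1)

# Hypothetical organisation of 500 staff
annual = response_rate(292, 500)    # 105-question annual survey -> 58.4
pulse_1 = response_rate(290, 500)   # first monthly pulse        -> 58.0
pulse_2 = response_rate(294, 500)   # second monthly pulse       -> 58.8
pulse_avg = round((pulse_1 + pulse_2) / 2, 1)

# Same participation despite very different survey lengths
print(annual, pulse_avg)  # 58.4 58.4
```

The point of the comparison is that survey length alone did not move participation; the other drivers listed above mattered more.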
Despite the negative experience of this particular client, there is definitely potential for frequent, short surveys to work in certain circumstances.
When the audience changes for each survey instance
A key reason for the success of CX surveys is that every survey is typically administered to a different person. In an employee context this might work for, say, assessing an onboarding program, or a specific staff training initiative whereby different groups of staff participate in each roll-out.
When you can link the survey results to a specific interaction
In CX research it’s possible to link feedback to a specific interaction (e.g., a phone call), enabling organisations to provide feedback to customer service officers to help improve their performance. In an employee context, similar linking would be possible following an onboarding or training program. This would enable facilitators to experiment with different approaches and test what does and doesn’t work well.
When tracking performance against survey actions
Pulse surveys conducted periodically in-between engagement surveys can be an efficient way to track progress against specific identified priorities and actions. These surveys are usually very short and focussed on one or two issues. Staff are usually willing to participate as they have a vested interest in the specific issues, and it enables them to have a say about whether the issues are being addressed sufficiently.
When you have the organisational capacity to implement positive change before the next survey
This is the most important thing to get right. The old adage of never conducting a repeat survey until you have addressed the main issues in the first survey still applies. Before conducting an initial survey, make sure you have the resources to invest in positive change. Then allow enough time following the initial survey to understand the issues, develop an action plan and implement the change in a lasting and meaningful way. Interestingly, following their most recent survey, our client conducted some innovative design-thinking workshops to unpack the results. This led to more considered actions that addressed the real underlying concerns.
If you are considering a move to more frequent surveys, consider whether the above circumstances apply to your organisation. If you don’t think progress can be made against priorities, then it may be more appropriate to stick with your regular cycle of engagement surveys.