There is little point in collecting lots of data unless you are going to do something with it.
If all we do as professionals is gloat about how much participants liked the program, participants will soon cut back the time and effort they put into giving feedback and completing surveys for us. Improving the participant experience and learning outcomes must be our agenda.
Responsiveness is deeply appreciated by participants and managers alike – it tells them ‘you listened’.
This relates to logistics and operational communication. Are our messages getting through? Are they actionable and effective? We have constantly tweaked our communications over the years based on suggestions from participants and managers alike. We have also axed some elements because feedback told us they added little value to the program.
We ask lots of questions about which parts of the program were most important and helpful, and match this data against what participants studied. It tells us which topics are landing, and which could land better. Over the years we have dropped some subjects and added others.
Coach performance

Maybe this is a characteristic of professional coaches, but far from worrying about being monitored, our coaches embrace feedback enthusiastically – perhaps because they so rarely get structured, comparative data. Our Master Coaches can tell a coach on our roster that their scores are in the lowest quartile. If you are a professional coach, this is data you need to know, no matter how disappointing it is to hear at first. Coaches are the single biggest source of both inspiration and failure in any small group coaching program. Ensuring you have the best people delivering at their best, consistently, is a key success factor.
How engaged are the participants, and why? What, if anything, is disengaging them? Elsewhere in this paper we talked about making sure sessions happen regularly; engagement feedback showed us how negatively the pods were impacted when this wasn’t the case. Feedback also showed us how much participants want to choose their own topics – and that they can be trusted to do so sensibly.
We learned through feedback from managers that they needed to understand the nomination criteria far better upfront. We are also adding briefing videos to complement our documentation, because managers tell us they are more likely to watch a three-minute video than read a document.
How do we promote the investment in these small group coaching pods to senior executive sponsors? The answer is always data: if we can show continuously that the program is being delivered and improved, it becomes much easier for senior executives to fund the initiatives.
In some programs, clients decide to do “before” and “after” 360 surveys to see whether individual participants have progressed.
We make sure we are constantly communicating to our alumni how their feedback has helped. This isn’t just good PR; it’s good practice. It generates goodwill and shows that the program is about them, not us as designers, and it builds budget and commitment from the wider organisation.
In our Fastlead programs we handle all the metrics and reporting, giving clients access to both raw and processed data, and we discuss what the data is telling us in program meetings on an ongoing basis.