Business Events News

Why your post event survey is a sham – Part 3

Conference Focus director Max Turpin shares his insights on a range of topics in a regular column in BEN. Topics include new generation events and making events effective and valuable.


MY THIRD and final instalment explaining why event survey results are largely corrupt, phoney and misleading includes another example of a post event survey.

Most importantly too, I explain why all of this is so damaging and detrimental to our industry.

In early June this year on the Gold Coast, Destination Gold Coast hosted their annual “This is Gold Coast” Business Exchange. Post event, it was reported, and I quote, “This is Gold Coast gets 100% approval rating from buyers.” And, “The event clearly hit the mark with hosted buyers, with a 100% satisfaction rating.” And so ostensibly, the event was a magnificent success. But was it?

Firstly, if you’re only judging success by rating levels of satisfaction – i.e. the typical ‘happy sheet’ rating method of asking, essentially, “Were you happy?” – then sure, the event was a success.

But how does that relate to business results and bottom line impact? Why should suppliers and exhibitors care about satisfaction ratings? Most importantly, how much new business was secured?

And if a Likert scale was used on the survey form where “satisfied” was the centre option between “Brilliant” at one end and “Terrible” at the other, then the reported result of “satisfied” was merely middle of the road. (I’d love to delve more deeply into the meaning of satisfaction and its direct relationship with expectations, but have no room to do it here.)

Let’s look too at how biases influenced the responses of hosted buyers. Those hosted to attend from interstate and overseas I dare say had the following travel expenses paid for by the organisers: their flights, accommodation, airport transfers, entry to the show and networking events. And so put yourself into the shoes of a hosted buyer completing the post event survey.

Three recognised and ingrained demand characteristics – participation bias, survey bias and social desirability bias – would come into play, inclining you to embellish and dress up your responses.

In turn, your responses mislead organisers and turn the entire process into a quasi-fake, self-deluding sham. I’m not having a shot at Destination Gold Coast here. I’m simply making the point that the large majority of event surveys are poorly designed, ask the wrong questions and are corrupted by bias.

Which brings me to the most important point of all, and that is why this is so harmful to the events industry.

False and deceptive event survey results delude event owners and decision makers into thinking their events require no improvement. No innovation necessary. “If it ain’t broke don’t fix it.” (And BTW, this is the symptom of another bias known as default bias, aka status quo bias.) Therefore, event design – programs and agendas – remains rooted in the past.

Also, unless we’re prepared to measure, survey and report on what really matters – the business impact and ROI of events and not just “Did we make you happy?” – events are destined to feel the full brunt of the next global economic downturn, which, IMO, is not far off.

If you’d like to learn more about how to make your events fresh, innovative and effective, please contact Max Turpin at Conference Focus on 02 9700 7740 or email max@conferencefocus.com.au.

