So I started the week with another case of ‘pass the sick bag, Alice’.
For the sixth time in as many weeks, I've heard someone say: "I'd like to use XYZ research firm because they're a good brand for when we publish the results…"
What utter tosh.
This time it came from another agency, which really should know better.
Let me be clear, though. This isn’t a blame thing. Research is part of our bread and butter. Belief in any reflected value from a research company’s brand is simply naivety, nothing more.
Do I mean that a research company’s brand is worthless? No, of course not.
A research brand, built on a reputation for quality panels, rigorous analytics and high-touch customer service, is very valuable indeed. In other words, it can inform your choice of who to use, based on the quality of their product and service. It should NOT, however, feed any thought of leveraging that brand when you publish the results (in my humblest of opinions).
No-one cares who did the fieldwork.
They care about the story the results tell!
My best example to illustrate this point is the range of research fieldwork companies that published their polls in the run-up to the recent UK election.
There were scores of them.
Within the usual bounds of expected volatility, most of their results turned out to be broadly valid.
And did the media favour one over any other? Did they frequently highlight or name-check anyone?
No.
Almost all media outlets were at pains to publish ‘poll-of-polls’ results, featuring no individual pollster.
That’s because the actual quality of the research fieldwork product from each of them is – to my mind – virtually indistinguishable. The statistical techniques are much the same. The maths is much the same. The panel quality is pretty much the same.
The way they compete is on service, support, analytics and so on.
Now this is just my view, and it may induce howls of protest.
But it’s my experience, and I’m sticking with it.
And I have never, in all the feedback we've had over twenty years, heard anyone say that their opinion of survey results was influenced by who the pollster was. They've only ever commented on the content and the results themselves.