
Gartner fails miserably in product test, immediate action required



UPDATE 1: About six hours after this post went up, Andrew Frank commented (see below) and also clarified on his blog in a post titled This is Not a Product Test. Thank you, Andrew.

UPDATE 2: The day after this post, Andrew Frank posted The Watchdogs List, which aggregates the responses from the two Gartner blogs in the order they were received. This is a welcome and appropriate follow-up to my blog post below. Once again, thank you, Andrew.

++++++++++++

ORIGINAL POST:
Andrew Frank, a Research VP at Gartner, "the world's leading information technology research and advisory company," conducted an interesting experiment in an October 2 blog post called Which Social Media Monitors Eat Their Own Dog Food?

Unfortunately, the experiment went terribly awry, and Gartner did nothing about it. I find this behavior by a research firm appalling and misleading, and completely counter to the advice they give their own clients. In fact, given the post in question, I think Gartner should be required to issue a formal apology.

Frank issued a challenge to firms that monitor social media by naming a number of companies in the space in a blog post. The test was to see who noticed; you told Frank you had by leaving a comment.

The companies included 1st2c, Biz360, BrandIntel, BuzzLogic, Nielsen Buzzmetrics, CIC, Clarabridge, Collective Intellect, Converseon, CoreX Technologies, Crawdad Technologies, CSC NameProtect, CustomScoop, TNS Cymfony, Echo Research, Envisional, Factiva, Kaava, Market Sentinel, MotiveQuest, Networked Insights, New Media Strategies, Onalytica, Opinmind, Popularmedia, Radian6 Technologies, RelevantNoise, ScoutLabs, SentiMetrix, Techrigy, Trackur, Umbria, Unbound Technologies, Visible Technologies, Waggener Edstrom Narrative Network

So far, so good. It's sort of clever, actually.

However, the comment system on the Gartner blog made this a terribly flawed experiment. Several people's comments did not appear, and they had to resend them. One person commented within 24 hours, and that comment never appeared. It is quite likely that some people commented and their comments never appeared at all.

Here are some of the comments that led me to write this post:

Marcel LeBrun, CEO of Radian6 said: "Hi Andrew, We are listening. Didn't see my earlier comment appear on your post so trying again."

Andy Beal of Trackur.com said: "Unfortunately, I see all and hear all. Nice test! Either you need a better confirmation system, or my last comment vanished. Feel free to delete this one if needed."

Alecia O'Brien of dna13 said: "We are in fact listening (and have been..). We posted a comment several days ago but I guess comments are being moderated?"

At the New Marketing Summit #NMS08 yesterday, I had an opportunity to speak with several of the companies involved in this experiment. Many were, excuse my language, ROYALLY PISSED OFF.

One person said that they reached out to Gartner through the blog about the issue and got an email from Tom Kobak, Communications Director of Gartner's Product Platforms Group, who said: "Thanks for your email to Site Feedback about your comments on the Social Media Monitoring on one of the Gartner blogs. I have passed on to the appropriate person here to take care of, so should be fixed shortly. Sorry for any problems, and please let me know if you have any questions." Interestingly, Kobak's email signature line included this: "Gartner delivers the technology-related insight necessary for our clients to make the right decisions, every day."

Okay, so here is why this is an appalling situation that Gartner must make right:

1. Gartner is an analyst firm. Clients trust it to guide their technology decisions. In this case, clients may make actual purchase decisions based on this flawed post.
2. In this post, Gartner was testing media monitoring companies' speed of response. They were TESTING PRODUCTS on the blog, but the test was ridiculously flawed. (The test implies that a faster response means a better product, yet some companies were unable to get a response through at all.)
3. The flaw was pointed out yet Gartner did nothing about it.
4. The vendors in question would probably be too timid to call Gartner out on this for fear of retribution in the form of bad product reviews.

Here's another test. How long will it take Gartner to respond to this?