“According to the widely respected OhGosh Group, OUR Product Performs Better Than THEIR Product!”
or
“New Report from World-Renowned Acme Consulting Proves XYZ Technology’s New Stamflatz 10000 delivers More Gigaflux Capacity than ABC Technology’s Burgpultz 500!”
Or how about my personal favorite?
“Industry Leading Analyst Firm Ranks XYZ Technology’s New Stamflatz 10000 the Clear Leader in Customer Satisfaction!”
Oh, and the tweets are even worse:
“Great News! We absolutely blow away Burgpultz! But don’t take our word for it... check out HTTP.BS.B.LY”
You know the type – usually sounds all hyperventilated and squealy. Ugh.
My good friend and colleague, Lori MacVittie, just wrote a blog on this subject, For Thirty Pieces of Silver My Product Can Beat Your Product, that got my ‘marketing professional’ dander up.
Digging under the covers, it becomes pretty clear that ‘widely respected’ OhGosh Group, or ‘world-renowned’ Acme Consulting, received a (big) check from the marketing department at XYZ Technology, Inc. to run (or worse, to simply observe and report while XYZ actually ran) that test that XYZ Marketing is now so breathlessly touting.
Hey ho…lazy marketing managers at XYZ…here’s a wake-up call…the market – despite current leftist-leaning political opinion – is not stoopid. You are. The market is in fact brilliant. You are not. The market cannot be fooled…fool.
Two things really stick in my craw about this lousy, lazy, and professionally embarrassing marketing practice.
First – it hurts customers. These one-sided tests – bought and paid for by technology vendors – do not help customers make thoughtful buying decisions.
Second – pay-for-play performance testing makes a mockery of the marketing profession, and it makes the jobs of legitimate, knowledgeable, and hardworking marketers that much harder. The customer you snookered with your lop-sided test results soon learns to discount all marketing information as more of your hogswallow.
A friend of mine, who also happens to be a big-time CIO, has one of those red Staples ‘Easy Button’ things on his desk. When a vendor starts spewing this nonsense, he hits it, and instead of a bubbly “That was easy,” a stern voice loudly proclaims, “Warning: BULLSHIT level approaching DEFCON-5!”
I love that. But I hate that he thinks all technology vendors spew BS, and that’s the problem.
So, if you call yourself a marketing professional, the next time you find yourself tempted to pay somebody to run a trumped-up test or skew a survey to make your feeble product look formidable, slap yourself upside the head and get back to work. There are no shortcuts.
If you’re a customer, the next time you see a headline or read a report that claims “XYZ Technology is Faster, Better, Cheaper, Shinier, and More Fun at Parties” than ABC, stop and ask yourself which company you would want as your trusted advisor – the one that pays to play, or the one that always, no matter what, shoots you straight?
4 comments:
BTW - I am not going to slam the testers here - they are just doing what they are paid to do, and I don't begrudge anyone a living.
It's the marketers who hire them and skew the results that infuriate me.
As the owner of one of those "OhGosh Group" labs, I have a mixed reaction.
Frankly I started taking vendor assignments because there was no other way to get test results out there. The publishing industry that used to keep me busy in the lab dried up with ad revenues.
We ALWAYS test, almost always in our own lab, and we resist the test engineering that creeps in when vendors run the tests while we watch.
So if you have a too-good-to-be-true technology, do you test it in house, hire us, or just not make performance claims?
- Howard
Howard - I appreciate the necessity of going where the money is...but I suggest you take the high road - see Lori's blog for suggestions. Force clients to accept full transparency, apples-to-apples testing, fair notification of competitors, etc. You could actually find it is a competitive differentiator for your company to be the one test lab known for doing only true COMPARISON testing vs. competitive testing...then you really would become world-renowned...
*Disclosure: I work for NetApp*
Overall I agree with you, especially where you argue for transparency and fair notification, which seem similar to the requirements of the Storage Performance Council.
You could argue that the various SPC benchmarks aren't relevant (as EMC has done in the past), but the same could be said of any synthetic benchmark created by a reviewer.
Outside of performance, the results for soft metrics such as "ease of use" or "customer satisfaction" can be manipulated by the unscrupulous, but most of the "independent" reviewers seem to be reasonably honest brokers.
I think it is unlikely that a vendor would ask a reviewer to test a configuration that they believed would come out "the wrong way". Even if one did, chances are the entire report would be quietly shelved and the marketing team responsible would have an embarrassing conversation with their management.
Having said that, I've seen some pretty bogus "comparisons" done by vendors in the past, mostly caused by configuring a competitor's storage according to one's own, or outdated, best practices. However, the independents don't seem to be quite as bad.
I'd be interested in seeing some specific examples, and what you believe is incorrect or misleading, as I believe that has a better chance of improving the situation than tarring the entire industry with the same brush.