IT analysts produce crap - what to look for in analyst "research"

This post has been podcast
Crap Factoids are pure B.S. that almost sound like a fact, and are presented so often that everyone comes to think them true. Let us look closer at a classic Crap Factoid where the results were deliberately skewed, then hyped up by marketing people, and the resulting Crap Factoid thrown to the winds like GM seed. It is time people took analysts to task for this stuff, because we all suffer the consequences when decision makers fall for it.

As I said in a recent comment on this blog, my concerns with much of the published research are that the 'research' is

  • commissioned to prove a point, like cancer research paid for by the tobacco industry but with fewer observers ready to scream "foul"
  • created as a revenue generating exercise, therefore the results need to be useful, attention getting and self-serving (grow the market)
  • often anecdotal and opinion-based
  • often asked of the wrong person: "How brilliant were you..." "Did you make the right decision to..." "What ROI have you had from your spending..."
  • lacking transparency (and hence impossible to reproduce): what was the methodology? what questions were actually asked? how was the sample derived? what controls were there (generally none)? what were the raw results?
  • no peer review ('cept blogs like this). Where are the academic and professional journals and conferences with real review boards?

I think IT should be renamed Information Engineering, and should be held up in comparison with the traditional engineering disciplines, where it compares very badly. If a bunch of post-grad engineers set themselves up as self-appointed experts and wrote a paper on how 86% of Chief Engineers surveyed agreed that bamboo is the material of choice for bridges in 2008 (and sold it for $3000 a copy), they'd be torn to ribbons by peer review.

As an example, let us look at a case study: "66% of organisations surveyed around the globe have engaged with the Information Technology Infrastructure Library (ITIL)", according to a survey released in February 2008 by Dimension Data. This survey was the subject of a previous Crap Factoid Alert. It is neither the best nor the worst around, but it is nicely typical to use as an example of what to look for when detecting Crap Factoids.

  • Dimension Data commissioned the survey from Datamonitor. These are not professional scientific researchers, they are professional market researchers, which is not the same thing. But at least they know how to construct questions and analyse results.
  • "The research surveyed over 370 CIOs from 14 countries across five continents." But how did it select them? As DD customers? How did it bias the sample? Not stated, but see below.
  • Look at the spin put on the press release: "Two-thirds of Enterprises Engage with ITIL – Is Your Company an IT Service Management Laggard?" The intent of the exercise is evident, which puts the credibility of the research into question. Scientists set up a hypothesis but they try to be impartial about its veracity.
  • Look at what the survey measured: people's opinions. "What do you believe to be ...?" "In your opinion, what is the potential impact..."

    At least the ITPI research that we have been debating on this website actually measures some hard metrics from the sites. This opinion-based 'research' is not worth the self-aggrandising wind of the respondents who produced it. They might as well ask "How clever were you..."

  • Then we get graphs like "What do you believe to be the primary inhibitors to adoption of ITIL / ITSM best practices?". Well, what were the options they chose from? We only get told the "Six strongest". My local Residents' Association recently surveyed the village and asked something like:
    Which would you least like to see in The Bay?
    • cycleways
    • walking paths
    • playgrounds
    • beach improvements
    • big ugly housing developments
    • gambolling unicorns
    • fairies in the dell

    "99% of residents agree the last thing they want to see..." (1% = me)

  • Note all the way through the Datamonitor paper they are arguing strongly from a pre-assumed position. The intent is clear: to talk up ITSM frameworks in general and ITIL in particular. Remember that analysts are parasites on an industry: they sell information and opinion on it. If the industry grows they grow. If the industry withers they have to go start again somewhere else - expensive. Analysts have a clearly defined motivation to pump up an industry that they have invested in.
  • And now the doozy from the results document itself from Datamonitor: "Admittedly, this Datamonitor study deals with a somewhat self-selecting sample, as the screener question probed for those that have evaluated, although not necessarily adopted, ITSM frameworks. Methodological nuances notwithstanding, the survey results indicate that over two-thirds of the enterprises interviewed claim that they have engaged with ITIL".

    "Screener question"? To select the 370, or to select the responses that made it into the results? The graphs show "n=372", so I'd say the 372 were deliberately selected to be already predisposed to ITIL. Either way, Datamonitor are freely admitting the results were deliberately skewed. Then they cavalierly brush this aside as "methodological nuances". Deliberate distortion of data, I'd call it.

  • This jaunty approach to statistical science is repeated elsewhere, such as this one on p11: "Granted, the statistical significance of a 10 percentage point differential could be the subject of further scrutiny. Nevertheless, the swing testifies to the positive experience of those that have implemented ITIL and corroborates qualitative evidence in favour of ITSM approaches in general and the ITIL best practice framework in particular... Those that have engaged with ITIL are more optimistic regarding its actual impact"
  • Since the survey questions, the methodology, and the raw data are not published we cannot draw proper conclusions. This is a classic attribute of pop-knowledge Crap-Factoid fluff like this that strongly distinguishes it from scientific research: you can't check it out for yourself.
  • Remember, if you want your Crap Factoid to propagate, exact numbers give it credibility.

    [The other major factor in establishing credibility is the name of the source organisation. On the IT Skeptic's CF Name Drop Scale, Gartner scores a factor of 2. Datamonitor scores about 0.5]

    Datamonitor said "over two thirds"
    Dimension Data's press office said "More than 65% of respondents" [turn it into a number]
    but the Dimension Data press release itself said "66% of organisations" and headed the whole thing "TWO-THIRDS OF ORGANISATIONS..."

    The real number seems to be in Figure 4 (p11), but we'll never know as they don't publish the data. If I'm right, Datamonitor's grasp of basic maths is a bit shaky: when I went to school, 66% was less than two thirds, just.
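As a side note on the numbers themselves, here is a quick sanity check of both the two-thirds arithmetic and that "10 percentage point differential". Since Datamonitor never published the raw data, the 70%/60% rates and the even 186/186 split of the n=372 sample are purely my assumptions, for illustration; the point is only how fragile such a gap can be.

```python
import math

# The "two thirds" claim: 66% is in fact (just) below 2/3.
print(0.66 < 2/3)

def two_prop_z(p1, n1, p2, n2):
    """Pooled two-proportion z-test: returns z statistic and two-tailed p-value."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Hypothetical check of the "10 percentage point differential" (p11):
# assumed rates of 70% vs 60% across an assumed even split of the 372 respondents.
z, p = two_prop_z(0.70, 186, 0.60, 186)
print(round(z, 2), round(p, 3))
```

Under these assumed numbers the gap scrapes in just under p = 0.05; with any smaller subgroup it would not. Which is exactly why "could be the subject of further scrutiny" deserved more than a wave of the hand, and why unpublished sample sizes matter.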

The IT Skeptic said some time ago that the IT analyst industry badly needs a code of practice to reduce this kind of pop-knowledge crap. Please spread this article around, get the word out, put some pressure on them.

The analysts survive on their credibility. Based on the bilge they produce, they don't deserve it. If we undermine it, they'll have to do something to improve, to deliver real scientific research. If we don't, they'll keep shovelling this stuff into our managers and you'll live with the results.

Remember: Chokey the Chimp hates Crap Factoids!

Also see:


Analysts' damn lies

Justin Pirie is a candidate for the IT Skeptic's annual Sagan Candle award for online skepticism with Gartner's "Magic" Quadrant: Lies, Damn Lies and Statistics. So is the article he quotes from Mark Suster.

It’s strange to me to think that customers with years of experience would ever listen to twenty-something smarties from great MBA’s who have never worked in your industry before ...
“well, my report was due and I didn’t have much time. My boss told me to look at the growth rate average over the past 3 years and increase it by 2% because mobile penetration is increasing.”
“For real?”
“Well, yeah, we know it’s going to grow faster but nobody can be sure by how much.”
Me, “And I suppose you don’t have a degree in econometrics or statistics?”
Her, “No.”

I've never found evidence of rigorous research by an analyst. Period. See Mark's post and my original post above for all the crap they get up to. They pull numbers out of their analyst.

Or this, quoted from Ed Sim's post, The Gartner Magic Quadrant: a necessary evil in IT.

Given my experience, I must say that developing a relationship with the analyst is key to helping you improve your standing in the quadrant. This means buying a subscription to Gartner and then hiring the analyst for some consulting.

So maybe that upper-right quadrant should be labeled "analyst-lickers".

Great comment on Gartner and hype cycles: banana oriented architecture

I found this great quote from Alexander Jerusalem:

It'd be quite interesting to look at [Gartner's] predictions from one, two, three or seven years ago and see what has become of them. It seems that their projections are based largely on opinions, their own and those of the CIOs they talk to. But these opinions change with the wind. If IBM or Microsoft come up with the banana oriented architecture tomorrow, you will see 50% of CIOs say: Yeah, we're actively testing this thing, it looks very promising. And a few months later, after Gartner has predicted the rise of bananas as _the_ predominant platform of the century, 50% will say: Sure we're using it and we will be using much more of it next year. And it will be true, because the tag "banana oriented" will figure prominently on each and every OS or app server or library update they could possibly buy. And certainly it won't be long until Eric Raymond calls on everybody to peel their bananas before selling them in the bazaar instead of the supermarket.

But the amazing thing is that no matter how weird and irrational this whole technology "innovation" circle is, it sometimes creates quite interesting ideas that are worth learning and we probably shouldn't become too cynical about it.

The Old Magician

At the time I was trying to find the origin of the quote from my old boss Charles Wang when asked about Gartner: "I want to choose my words carefully here, so I'm not misunderstood. They're a bunch of fucking idiots." It may be apocryphal but it rings true and was known when I worked at CA. I hate to admit it but sometimes I miss the Old Magician.

Sceptical = sleep deprivation

Sounds like someone woke up on the wrong side of the ITIL pillow.

I'm always this grumpy

No I'm always this grumpy.

I get wound up by intellectual laziness, fuzzy thinking or outright dishonesty wherever I find them.

The IP Skeptic

By questioning the value of analyst research, I guess you could say I'm being the IP Skeptic eh?

We try to avoid crap in our factoids

As the chief analyst of one of those analyst firms, I must comment. First of all, I agree with the basic premise: market research interviews with IT executives tend to produce poor results, or at least results that are very easily skewed. After all, the research is for marketing, and marketing is, at least in part, intended to focus customer attention on the good stuff a company or product does and away from the bad stuff.

I believe that market surveys on the efficacy of a technology solution should be viewed in the same light as a Spinal Tap amplifier volume knob. Useful to know about and worth a chuckle.

My firm focuses on hands-on evaluations supported by formal and informal interviews with users. We believe that validation of a company's product positioning and the concomitant validation of competitive positioning is a more useful approach for our customers and their customers too. While our published papers are all intended to help our customers better sell their products, we try to avoid gross factual error and only use a little spin.

When we do a project we tend to start from the market research and surveys and go from there. If "everyone is saying" something about a product, and we get hired to do so, we will either prove it to be true or false depending upon who's paying. We're honest about the bias, and the results too. It's not unusual for our research to support competitor positioning or strengths nor is it unusual for those research projects to remain confidential.

OTOH, if you believe all the crap and are an IT executive, you're probably in the wrong job. ITSkeptic is not just a blog title, it should be considered a job requirement.

the classic caveat emptor

BEC, thanks for joining in, but I'm going to take a swipe, sorry.

Your comment reads like the classic caveat emptor: "I just make the snake oil. Not my fault if folks are dumb enough to buy it, let alone drink it". Fact is, many IT folk including management ARE dumb enough to build a bamboo highway bridge if they think 86% of their peers say it is a good idea. It is up to people and sites like us to get the caveat to the emptor. And it is up to people like you to consider if what you are doing ("spin") is ethical (spoken like the 20-year vendor veteran that I am). Not saying it isn't, just asking the question.

I loved Brad's comment: by all means build expertise, develop good opinions, and sell them. Just sell it as opinion not properly researched fact.

Analysts are good fun, reports are less so

The one thing I do like is talking to the analysts, something that has only been easily available to me in the last 8 years working for a vendor. Most of them are intelligent people and they do get to talk to a heap of companies, including competitors. Being semi-independent, they often get to hear information which would not easily be transferred otherwise, and get to form interesting opinions because of it. I generally find them more skeptical than the media commentators that follow the same markets (most recently taking blog form), because they are generally not about hits and sensational headlines.

Over a cup of coffee the information and opinion are good value; portrayed as research science, they are overselling themselves. Having worked closely with a number of analysts over time, I know they don't have the time, methodology or datasets to apply the checks and balances normally associated with academic research. Nor is there the repeatability of analysis that is required for academic research to become credible.
(NOTE: My experience here is limited to the commissioning of 5 analyst reports for my employer, and spending 8 years living with a Research Psychologist and providing the obligatory SPSS and SAS coding/support.)

But we should think of them more as finance market analysts and less as academic researchers. I will follow a few analysts who specialize in certain markets, and read their papers. I respect their opinion, experience and social networking skills, and over time I get to understand their biases. I am less likely to just read something and absorb it because it's from a major analyst company with an unknown analyst.

So get to know your major analyst in the fields that interest you, that way you get some value.

Brad Vaughan

Analysts' opinions are valuable; their facts aren't

EXACTLY right Brad. Bang on the money.

Analysts get around. They learn a lot. Their opinions are valuable. Their facts aren't. If they'd just sell opinions I'd leave them alone ... perhaps :)

Some of my best friends are analysts...

Actually no.

I consider them on a par with real estate agents. (In the UK, the only upside people can see in the collapse of the property market is the number of estate agents who will be unemployed, possibly even having their own homes repossessed. Oh, such sweet irony.)

They want to be your friend to either sell you advice, or to steal your opinions. Yes, I know that's what we consultants are also accused of. The difference, I suspect, is that we try to actually teach you to tell the time using the watch you already have, whereas an analyst simply asks the closest passer-by what the time is and then sells it on to someone else.

Mind you, I think Brad is right: they are only meeting the market need. The market doesn't want scientifically verifiable evidence; it wants something that simplifies decision making, and the prime factor that does that is telling them "this is what everyone else is doing". How many strategies have been launched on the simple premise "Gartner says everybody else is doing this...", removing all need for rational analysis or creative thinking?

telling people what I think and what to think

This blog is ample evidence that I like telling people what I think and what to think. I have to confess to having gazed enviously at the analyst profession. Nice work if you can get it.

You are quite right about people wanting their thinking pre-digested for them.

Consultants are different. Consultants do people's communicating for them, not their thinking. They go talk to the folk that their employers ought to be talking to: their peers, their staff, their suppliers... Then they do their thinking for them.

Spinal Tap

When I do a customer survey I like to turn the number of options available to a respondent up to 11.

"ITSkeptic is not just a

"ITSkeptic is not just a blog title, it should be considered a job requirement."


Analysts Slamdancing

I've ranted about the Blood in the Water before. Personally, I do like the analyst and vendor papers, even with the spin. They are often a result of their clients' input and can be an excellent source of information (or data).

However, separating the spin from the facts and applying them in a context that is appropriate for your situation is what gives the information its value for you. Throwing the White Papers into the pit and letting the Punks tear it up a bit may help weed out the 'crap factoids', and I'd be in favor of a Mosh Pit expressly for this purpose.

By all means listen to the drumbeats you hear coming out of the vendor/analyst community. But as the hype increases don’t forget what you need to do, which may be much more down to earth. With blood already in the water, staying on top of the basics can keep you from being bitten.

So as our Savage Journey Continues, let's get all the fact craptoids in the pit and let's slamdance our way to enlightenment.


It is unwise to be too sure of one's own wisdom. It is healthy to be reminded that the strongest might weaken and the wisest might err.

John M. Worthington
MyServiceMonitor, LLC

Lies, Damn Lies and Statistics

If you rely on getting your news from the television, then you are probably fairly comfortable believing "Analyst" reports. The popular analyst firms are commercial entities who have to serve the needs of their paying customers. They are primarily a marketing tool for major investors.

Craptoids are just the most distilled version of this marketing tool. You need a catchy headline to get people to read the content.

Thanks to the support of these companies, very little of the information that shapes our views is truly unbiased or substantial.

These analyst reports should just be taken as one source of information, to be validated and compared with other sources to create a richer view of reality. Be sure, however, that "reality" only comes from experiencing it. Everything else is just opinion.

If we could find a way to transfer learning between people of different generations without bias or corruption, we would have solved one of the endemic problems of the human condition. Until then, we are doomed to repeat our failures.


Brad Vaughan
