Friday, August 28, 2009

A Value Proposition for Industry Analysis 2

@mcgoverntheory (James McGovern) asks "Why do you allow us to remain blissfully ignorant not knowing what quality analyst research is?" But as James Governor replies, "find a way to pay the independents without us merging, and who knows what we could do for you".

I've been rather critical of the large analyst firms recently, but at least they've found a way to generate revenue from what they do. They have also evolved a reasonably efficient process for producing and distributing large quantities of knowledge, in a form that people are apparently still willing to pay for.

The challenge for "quality analyst research" is to find a way of funding empirically grounded and practically relevant work to a satisfactory level of rigour.

Empirically grounded means gathering concrete evidence from the technology-in-use. Not sitting around pontificating on what a given bit of jargon ought to mean, or inventing new frameworks, but measuring actual outcomes.

Practically relevant means producing something reasonably quickly that is of practical value to producers and/or consumers. I'm not interested in measurements of opinion or awareness; I'm interested in practical results. Long-term studies are important as well, but they are a job for university researchers. (Unfortunately, a lot of academic research is disappointingly superficial as well, if not methodologically flawed, but that's a different problem.)

Satisfactory level of rigour means doing genuine analysis rather than casual sorting or mechanical feature comparison, taking nothing at face value, and examining the evidence through the appropriate lenses.

It is important to remember that the producers of software products and platforms may not themselves understand what they have produced. Over the years I have talked to any number of bright inventors and innovators, and I cannot fault their ingenuity and enthusiasm, but they are not always aware of things going on elsewhere in the IT world that might interact (positively or negatively) with their work. (I think that's an excellent reason for them to talk to me, but then I would say that, wouldn't I?)

And the users don't understand everything either. They may stand up at conferences and talk about "success stories" with a given technology, but it is very common for people to identify a single cause of success or failure, without acknowledging that success and failure usually result from many interacting factors.

In any case, technology strategy isn't about selecting specific products, or even specific classes of product. That's pretty tactical, because product features change from one version to the next. Even vendor comparison and selection look pretty short-term once acquisition rumours start to circulate. (Perhaps last month IBM had a better story on technology xyz than SAP. But this month SAP has acquired a niche xyz specialist, and last month's opinion is now out-of-date. Until IBM retaliates by acquiring another xyz specialist. And so on.)


Technology strategy should be about deciding what factors to pay attention to. So if industry analysts are going to support technology strategy, we need to be able to show which factors have the greatest impact on outcomes. And this knowledge cannot possibly be derived solely from vendor briefings or anecdotes from selected users.
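To make that concrete, here is a minimal, hypothetical sketch of the kind of empirical analysis implied above: given measured outcomes for a set of adoption projects and some candidate factors, rank the factors by how strongly they are associated with the outcome. All the factor names and numbers below are invented for illustration; this is not a claim about which factors actually matter, only about the shape of the analysis.

```python
# Hypothetical sketch: rank candidate factors by their association with
# project outcomes. Factor names and data are invented for illustration.

import numpy as np

# Hypothetical factor measurements per project (rows = projects).
factor_names = ["vendor_maturity", "in_house_skills", "integration_effort"]
X = np.array([
    [0.8, 0.6, 0.3],
    [0.4, 0.9, 0.7],
    [0.7, 0.2, 0.9],
    [0.9, 0.8, 0.2],
    [0.3, 0.5, 0.6],
])

# Hypothetical measured outcome per project (e.g. benefit realised vs. plan).
y = np.array([0.75, 0.55, 0.30, 0.90, 0.40])

# Standardise so the fitted coefficients are comparable across factors.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()

# Ordinary least squares: coefficients indicate relative association
# (not causation) between each factor and the outcome.
coeffs, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

# Print factors from strongest to weakest association.
for name, coef in sorted(zip(factor_names, coeffs), key=lambda p: -abs(p[1])):
    print(f"{name}: {coef:+.2f}")
```

Even a toy example like this makes the point: the inputs are measurements of technology-in-use, not vendor briefings or conference anecdotes, and the output is a ranking of factors rather than a ranking of products.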

1 comment:

  1. Richard -

    Good post, and your three points are generally in agreement with what I refer to as "decision-grade information." However, perception is reality - I think the consternation comes from another angle, and is generally misdirected anger toward all analysts.

    People who are continually focused on delivery and execution want decision-grade information, and they want it yesterday. Their frustration bubbles over when they turn to analysts for answers, and all they get is more questions. Never mind that this is precisely what the analyst should do - perception is reality.
