Tuesday, February 05, 2013

@mcgoverntheory (James McGovern) continues to complain about the completeness, balance and objectivity of industry analyst coverage. He has just added some further comments to my earlier post on Industry Analyst Coverage (June 2009).
1. Should analysts be more transparent in declaring that they don't have time to actually perform proper research in their reports as a disclaimer and need to be spoonfed by a vendor briefing mechanism?
I don't think there is a consensus on what would count as proper research, but more transparency on research methodology and declaration of interests would be good.
2. What would an end buyer of technology learn if they were to understand how much/little time goes into producing a report vs the other activities analysts spend their time on?
If a decision-maker is making a major decision on the basis of a single report, then it would be sensible to check the quality of the report, instead of being awed by the reputation of the firm that produced it, or being too embarrassed to admit that it wasn't worth what they paid for it.
3. Can any analyst guarantee that if OWASP spends time on briefing analysts that this will generate a positive ROI? As you are aware, OWASP is a volunteer organization. If analysts want to waste the time of analyst relations professionals, that is one thing. It is another to waste the time of people who are attempting goodness.
No, of course not. A good analyst tries to evaluate everything objectively, and it would be completely out of order to guarantee a good review in advance. Obviously if I think there are flaws in what you are presenting to me, then it is my duty to communicate that to you clearly and directly. However, it is not my duty as an analyst to help you fix the flaws. If you want me to help you fix the flaws and/or help you with your marketing, that would require a switch in role and a different kind of funding/engagement; such a switch would need to be managed carefully and declared openly to avoid possible conflicts of interest.
4. Maybe you could identify an analyst or two in your network that would be willing to contribute time to a few open source projects. It may be beneficial to the analysts to understand what it is like to sit on the other side of the table with a compelling value proposition but zero money.
If the industry wants small independent analysts to have the financial freedom to participate in such exercises, then the industry must make sure there is a viable economic niche for small independent analysts.
5. I will take it one step further. If you know of any Gartner, Altimeter, Constellation, Ovum, Celent, Novarica or IDC analyst that wants a free conference pass to the upcoming OWASP conference in NYC, I will get them one.
I have no idea whether any of the large analyst firms will wish to attend your conference, and I hope it's not just the large firms you want to attract. For my part, I should be delighted to attend if anyone is willing to cover my travel and other costs.
I believe there are some fundamental misunderstandings about the role of the industry analyst in the software industry. I certainly believe that analysts could and should deliver greater levels of intelligence and value to the software industry as a whole. But this isn't going to happen if people just complain about analysts while failing to take any action.
See also James McGovern, Five Mistakes CIOs make in asking analyst firms to create vendor shortlists... (February 2013), plus discussion on Twitter: OWASP and Industry Analysts (Storify, February 2013).
Thursday, January 07, 2010
OWASP Top Ten 2010
@johnccr asks me to take a look at the new OWASP Top Ten 2010 RC1 (pdf), saying "it would be interesting to know if it changed your perception". So here are a few quick comments.
I'm certainly happy to acknowledge that this version makes the limitations of the Top Ten approach much clearer than previous versions, and explicitly encourages organizations to "think beyond the ten risks here". The document is careful not to claim the Top Ten as a full application security program, and warns readers not to stop at ten, because "there are hundreds of issues that could affect the overall security of a web application". But then surely this implies we shouldn't be wasting time reading this document at all; we should be reading the OWASP Developer’s Guide, "which is essential reading for anyone developing web applications today".
The status of the top ten items as risks (rather than, say, weaknesses or vulnerabilities or threats) is also a bit clearer, and the ranking of risks is based on the scale of the risk, not just the frequency of the attack. However, the document also refers to "relatively simple security problems like those in the OWASP Top 10" - which makes it seem that they may be the most obvious rather than the most problematic. Making people aware of simple problems doesn't necessarily promote awareness of more complex problems.
To my mind, the trouble with this kind of list is that it encourages bad thinking. Not only are some risks regarded as more attention-worthy than others (based on a generalized model of risk that may not be relevant to your organization or application portfolio), but each risk is considered in isolation. A holistic understanding of security and risk needs to look at the composition of risk: how several apparently small risks can sometimes combine into a very large one.
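To make the composition point concrete, here is a minimal toy sketch (my own illustration, not anything drawn from the OWASP material; the probabilities are invented): even if each weakness looks like a small risk on its own, the chance that at least one of them is exploited grows quickly as they accumulate, and chained exploits can be worse still.

```python
# Toy model only: each weakness is treated as an independent probability
# of compromise over some period. The numbers are invented for illustration;
# real risks are rarely independent, and chaining can amplify them further.

def combined_exposure(probabilities):
    """Probability that at least one weakness is exploited."""
    survival = 1.0
    for p in probabilities:
        survival *= (1.0 - p)
    return 1.0 - survival

minor_risks = [0.05] * 5  # five individually "small" risks
print(round(combined_exposure(minor_risks), 2))  # 0.23, not 0.05
```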
I'm also concerned about limiting the analysis of risks to application security itself. Presumably a full security risk analysis would need to look at social attacks as well as technical attacks, but the Top Ten are all drawn from the technical side. I looked for this technical focus to be stated and explained somewhere, perhaps in a statement of scope, but couldn't find anything to this effect.
By the way, when I have raised issues about OWASP in the past, I have been challenged to fix them myself. But I'm not a normal member of OWASP; I'm an independent industry analyst who has been asked by a few OWASP members to provide coverage of OWASP. I am happy to enter into further discussions with OWASP members, but if you want me to build stuff then I am going to have to find a way of funding my time.
Should we take OWASP seriously?
Another stimulating discussion with @mcgoverntheory (James McGovern) about the ongoing OWASP project to identify the Top Ten Security Risks. I see no reason to change my previous opinion, which is that such lists are fundamentally misconceived.
As I've explained before (in this blog and elsewhere), I think the objectives of the list are muddled; I regard the methodology for producing the list as insufficiently rigorous; and I think it highly likely that the list will be widely used not as a precursor to a serious threat analysis but as a lazy substitute for it; so I just can't see that a Top Ten list is a good idea for anyone.
@mcgoverntheory replies "Many contributors to the top ten agreed that top ten lists as a concept are flawed. Its all about helping others move needle." Yes, but does it actually achieve any positive outcome? Show me.
@mcgoverntheory adds "Flawed concepts are propagated all the time. It's called marketing". But is it really the role of OWASP to be a marketing organization?
@mcgoverntheory continues "Everyone knows that Top X lists aren't meant to be complete nor necessarily measurable. Its about simple understanding". Well maybe everyone knows, but what matters is whether and how they act upon that knowledge.
@mcgoverntheory admits that "Sadly, most enterprises start and stop with awareness". Maybe so, but why should OWASP pander to this tendency?
And if OWASP is focusing its efforts on publicizing material that many contributors agree to be flawed, why on earth should industry analysts take OWASP seriously? Does OWASP want to be taken seriously?
Maybe it doesn't. @mcgoverntheory asks "What lift would analysts provide to OWASP? No products to sell and therefore we won't show up in quadrants or hype docs."
Of course, that depends on what kind of industry analysis we are talking about. Some so-called industry analysis firms seem to do little more than reprocess and amplify the efforts of the software industry's marketing departments, putting favoured products and vendors into a Magic Sorting Hat. Or they write like a theatre critic who gets invited to the previews and always finds something positive to say about the latest production, which can then be quoted on the play's website.
But I hope OWASP isn't the kind of organization that only wants analysis on its own terms, and that it understands that the value of industry analysis comes from the different perspective an analyst should be able to offer. In which case, I am happy to talk.
Labels: OWASP, risk-trust-security, security, softwareindustryanalysis
Monday, June 22, 2009
Industry Analyst Coverage
@mcgoverntheory (James McGovern) complains about the completeness, balance and objectivity of industry analyst coverage. He believes that certain areas are neglected (security, open source), and attributes this to a commercial bias. He asks:
- How important is it for industry analysts to include security analysis in their SaaS research?
- Does non-commercial open source have a fighting chance to be mentioned by industry analysts to their customers?
- How can customers understand analyst transparency when it comes to coverage of non-commercial open source?
James has always been particularly exercised about the fact that OWASP lacks coverage. When he raised this issue with me last year, I responded by posting some questions on the OWASP wiki and the OWASP Linked-In group, as well as several posts on this blog. I'm still waiting for answers.
If there is something in the product offering from any of the large vendors that I don't understand, I can contact one of my analyst relations "minders" and get a reasonably quick answer. If it's a small vendor, I can usually get an answer straight from the CTO. In contrast, my questions to OWASP go into a black hole. One person even suggested that if I wanted to know something about OWASP I needed to start a project. No thanks. (And, to answer Jim's comment below, I don't want to join a mailing list either.)
Industry analysts simply cannot invest that amount of time in chasing non-existent information. If OWASP wishes to be taken seriously by industry analysts, then it needs to put some energy into briefing industry analysts properly, instead of expecting us to root around the OWASP website and then complaining when we don't cover it.
Large vendors may sometimes try to influence industry analysts by commissioning work, and many analysts declare this when they deem it relevant. (I think that's what James means by transparency.) But a much more subtle influence can be achieved simply by providing better quality information and making our lives easier.
Update February 2013. James has now returned to the subject with Five Mistakes CIOs make in asking analyst firms to create vendor shortlists... (February 2013). See further comments below this post, plus discussion on Twitter: OWASP and Industry Analysts (Storify, February 2013).
Labels: open source, OWASP, softwareindustryanalysis
Friday, January 09, 2009
OWASP Top Ten - Update
OWASP is the Open Web Application Security Project. It is perhaps best-known for publishing Lists of the Top Ten (or more recently Top Twenty-Five) Security Bugs (or Vulnerabilities or Threats or Risks).
Following my earlier post on the OWASP Top Ten, as well as an exchange of emails with someone in the OWASP community, I posted the following question to the OWASP discussion group on Linked-In.
Do Top-Ten Lists distract from a holistic approach to security?
If you ask people to pay attention to the top ten items in a list of threats or vulnerabilities, they will almost inevitably pay less attention to other things. (Intelligent people are aware of the limitations of lists, but even they are not immune to such effects.)
If a security vendor has a particular interest in one item - for example selling protection or detection for a particular threat - then there may be some commercial significance in whether that item makes the top ten or not. So a commercially minded security vendor will look for ways of influencing (aka distorting) the top ten list in his favour.
Meanwhile, intelligent attackers may calculate that a significant portion of security dollars will be consumed by the top ten, leaving other vulnerabilities under-funded.
The OWASP website does contain a page (Where To Go From Here) explaining that the top ten list is only the starting point of a proper security analysis, but this page is very poorly signposted and I suspect that many people never reach this page.
The official purpose of the OWASP list is to educate people about the consequences of security vulnerabilities. But I think there is a broader education purpose, and I fear that top ten lists distract from this purpose.
This prompted a couple of interesting responses, expressing different views on the real purpose of the OWASP Top Ten. Michael Vance said that the items in the top ten list are those most likely to occur or those that are most likely to have the greatest impact. Christian Frichot said that lists are good at removing the low hanging fruit: I interpret this as meaning the most obvious and easiest to fix, which is not necessarily the same as frequency or impact.
In any case, the methodology for creating the OWASP top ten list does not seem to be designed to produce a list with the characteristics required by either Michael or Christian. It is partly based on historical data (frequency but not impact or low-hangingness, as far as I can see), with some adjustment to allow for projected increases in risk. For example, one issue (CSRF) was promoted to the list because the team believed it to be important, but no evidence was produced to support this belief. So is the OWASP Top Ten List really based on a systematic assessment of (generic) likelihood and impact?
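For contrast, here is a hedged sketch of what an explicit likelihood-and-impact ranking might look like in its simplest form (my own toy example, not the process the OWASP team actually follows; the issue names and scores are invented placeholders).

```python
# Hypothetical scores on a 0-1 scale; not data from any OWASP survey.
issues = {
    "injection":      {"likelihood": 0.8, "impact": 0.9},
    "csrf":           {"likelihood": 0.5, "impact": 0.6},
    "verbose_errors": {"likelihood": 0.7, "impact": 0.2},
}

# Rank by a simple risk score: likelihood multiplied by impact.
ranked = sorted(
    issues.items(),
    key=lambda item: item[1]["likelihood"] * item[1]["impact"],
    reverse=True,
)

for name, scores in ranked:
    print(name, round(scores["likelihood"] * scores["impact"], 2))
```

Even a crude scheme like this makes the judgment explicit: promoting an issue such as CSRF would mean raising its likelihood or impact score, which immediately invites the question of what evidence supports that adjustment.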
In any case, it would be strange if the same list were equally relevant to all applications in all organizations. Do we expect a retail bank to have the same security risks as a nuclear power plant? Do we expect an airline to have the same security risks as an online bookstore?
Clearly it would be stupid to rely completely on the Top Ten List - although I suspect that some people do just that. But my question is more fundamental - what are the grounds for thinking that a top ten list improves the overall process, rather than just adding a redundant step into the process? Christian's argument is interesting - by dealing quickly with the easy and obvious generic vulnerabilities, we can spend more time on the specific ones. But is that what people actually do?
Michael acknowledges that there is a significant disconnect between the way that Top Ten (and Top 20 and Top 25 and even Threat Classification) lists should be used and the way that they are used. He mentions a specific concern that this list will be misused by being improperly inserted into procurement language.
If OWASP were merely an academic organization, it could deny responsibility for how other people use its lists. "We produce the perfect lists; it's not our fault if people abuse them." But if OWASP is trying to make a real practical difference to security, then the actual effects and effectiveness of these lists are important.
Meanwhile, I am happy to see that other security experts agree with my concerns. Gary McGraw (CTO of Cigital) has just published an excellent article called Software [In]security: Top 11 Reasons Why Top 10 (or Top 25) Lists Don’t Work (via Bruce Schneier).
Update (March 2009)
Tom Brennan has just posed a question on the Linked-In discussion: "So what OWASP project are you going to start that will change this?" So the way to influence existing projects within OWASP is to start a rival project, is it? What a strange organization!
Related posts: OWASP Top Ten (October 2008), OWASP Top Ten 2010 (January 2010), Low-Hanging Fruit (August 2019)
Thursday, October 23, 2008
OWASP Top Ten
Back in August, James McGovern asked me to provide some OWASP coverage. Someone called Jennifer (Bayuk perhaps?) added a comment:
OWASP is not dominated by commercial interests, and so the message is different than from product vendors (and service vendors too, to a lesser extent). When an automated tool vendor claims to "address" the OWASP Top Ten, they should be ashamed of themselves. And you should be ashamed if you're buying that hype and promoting automated tools as anything much more than an interesting distraction. Covering OWASP would allow people to get a far less biased opinion of what's going on in application security.
Okay, let me start from that point. The OWASP Top Ten Project periodically publishes a "Top Ten" list of the most common web application security vulnerabilities. The official purpose of this list is to educate people about the consequences of these vulnerabilities.
But of course the inevitable effect of publishing a Top Ten list is pretty obvious - it causes people to pay particular attention to the items in the top ten, and considerably less attention to the items that don't quite make it. If I were a niche security vendor, I'd be lobbying extremely hard to make sure that the particular vulnerability addressed by my product got into the top ten. Conversely, if I were running a criminal scam, I'd know exactly which vulnerabilities to target.
This kind of thing clearly distracts people from a proper holistic view of application security. In my view it is the Top Ten List itself that is the "interesting distraction" Jennifer talks about, and I think OWASP should quietly drop this kind of cheap journalism and concentrate on educating people to do security properly. There is a lot of more intelligent stuff on the OWASP website explaining where to go from here, but I wonder how many people get that far?
Never let it be said that I am just a passive critic, however. Back in August, I registered on the OWASP wiki and posted a couple of helpful questions about the OWASP principles. I haven't had a response yet, but I live in hope.
See also
OWASP Top Ten Update (January 2009)
OWASP Top Ten 2010 (January 2010)
Tuesday, August 12, 2008
OWASP Coverage?
In a comment to an unrelated post, James McGovern asks:
"What would it take for an industry analyst to provide comprehensive coverage via blog entries on the work that OWASP is doing?"
I can't speak for anyone else, but here's my answer. I might provide occasional comments about OWASP without any special motivation, but before I go to the trouble to provide comprehensive coverage about something, I need to see some strong interest from my readers. I also need to feel that this is a subject I can add some value to, rather than merely repeating what everyone else is saying.
So if anyone wants me to take a thorough look at OWASP (or anything else for that matter), please add a comment to this blog, indicating the nature of your interest and what specific questions you'd like me to address. Thanks.
Labels: OWASP, risk-trust-security, security, softwareindustryanalysis