Message-ID: <Pine.LNX.4.44.0404080906440.8062-100000@pingu.awe.com>
From: mark at awe.com (Mark J Cox)
Subject: Vulnerability response times -- MS and others
> Now... what about the following? I cannot read the Forrester report --
> I am not a client, and I do not wish to spend $899 on it... so I
> cannot discuss the metrics used, nor how Forrester determined what was
> a "vulnerability disclosure".
For the Linux vulnerabilities that formed part of the survey, the various
security teams from the named Linux distributions worked with Forrester to
make their data accurate.
For a one-year period ending mid-2003 they basically took every
vulnerability (normalised by CVE name) that affected any package in any
product shipped by any of the four vendors. They then determined the
"first public date": the date the issue was first discussed in any public
forum they could find (Bugtraq, various bug-tracking databases, messages
to obscure lists). They then found the date the issue was fixed upstream
and, where applicable, by each of the vendors.
They then took a simple mean of the difference between these dates and
came up with 57 days for Red Hat and Debian, and slightly longer for
Mandrake and SUSE. They also repeated this study for Microsoft, although
I don't track Microsoft vulnerabilities so I have no way of knowing how
accurate that data is.
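The day-counting arithmetic described above is simple enough to sketch in a
few lines. Note the CVE names and dates below are invented placeholders for
illustration, not figures taken from the report:

```python
from datetime import date

# Hypothetical records: (CVE name, first public date, vendor fix date).
# These dates are made up; the per-issue data was not public.
issues = [
    ("CVE-XXXX-0001", date(2002, 7, 30), date(2002, 7, 30)),
    ("CVE-XXXX-0002", date(2003, 1, 2), date(2003, 2, 10)),
    ("CVE-XXXX-0003", date(2003, 3, 5), date(2003, 3, 20)),
]

# "Days of risk" for one issue: fix date minus first public date.
days_of_risk = [(fixed - public).days for _, public, fixed in issues]

# Simple mean across all issues, as in the Forrester methodology.
mean_days = sum(days_of_risk) / len(days_of_risk)
print(round(mean_days, 1))  # → 18.0 for this sample data
```

The same calculation is repeated per vendor, using each vendor's own fix
date for each CVE.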
The problem with the report is not the raw data on the Linux
vulnerabilities but the poor analysis of the data. 57 days sounds awful.
How many Linux users were put at risk by an obscure cross-site scripting
flaw in Squirrelmail, or even an Apache vulnerability that only affected
people using a particular configuration to support wildcard DNS? The
vulnerabilities that really matter are the ones which put Linux users at
risk - the OpenSSL issue exploited by the Slapper worm, the ones for which
exploits exist on this list and in private. You can take a subset of
the Forrester data and look at how fast the Linux distributions fixed
those issues; it's something Red Hat does internally. So, for example, for
issues that would be classified as "critical" on the Microsoft scale, over
21 months we get a mean of 1.1 days, with 77% of the issues fixed within a
day of first public disclosure. I had been under the impression that the
raw dataset from the report was going to be made public; then people could
come up with their own statistics based on the issues and products that
were important to them and their own unique configuration and set of
deployed packages.
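That kind of subsetting is straightforward once the raw dataset is in hand.
A minimal sketch, with made-up severity labels and dates standing in for the
internal Red Hat data:

```python
from datetime import date

# Hypothetical records: (severity, first public date, fix date).
issues = [
    ("critical", date(2003, 1, 2), date(2003, 1, 2)),
    ("critical", date(2003, 3, 5), date(2003, 3, 6)),
    ("critical", date(2003, 6, 1), date(2003, 6, 4)),
    ("moderate", date(2003, 2, 1), date(2003, 4, 1)),
]

# Restrict the mean to the issues that matter for a given deployment --
# here, those rated "critical".
critical = [(fix - pub).days for sev, pub, fix in issues if sev == "critical"]
mean_days = sum(critical) / len(critical)
within_a_day = sum(1 for d in critical if d <= 1) / len(critical)
print(round(mean_days, 2), round(within_a_day * 100))
```

Filtering on a different severity, package set, or configuration changes
only the predicate in the list comprehension, which is exactly why a public
raw dataset would let readers compute statistics for their own deployments.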
Even then, this average is only a small part of the picture - most vendors
had OpenSSL fixes out on the day of public disclosure of the vulnerability
that was several months later exploited by the Slapper worm, yet 20k+
hosts were still affected. The study also could not tell from public data
how long the vendors had known about an issue themselves in advance of the
first public date.
Anyway, that's why we all joined together and wrote
http://www.redhat.com/advice/speaks_daysofrisk.html
Mark
--
Mark J Cox ........................................... www.awe.com/mark
Apache Software Foundation ..... OpenSSL Group ..... Apache Week editor