Message-ID: <007601c3dfba$ce8af470$0100000a@MOTHER>
From: yossarian at planet.nl (yossarian)
Subject: Anti-MS drivel
> Yup, security research focuses on home computing, but this does not mean
> the quality of enterprise software is any better; quite the opposite. I
> had a chance to audit a bunch of big enterprise applications at several
> places I've worked, and it is very uncommon to find a solution that
> will not fall apart if you mess with its proprietary protocols and
> interfaces - often exposing gross trust model design problems.
Never said corporate computing was any better, quite the opposite. But our
dwelling on irrelevant software in the security community makes us, uh, look
silly.
>
> These applications usually undergo much more rigorous QA, and this
> eliminates most of the basic reliability issues that occur under reasonably
> "normal" working conditions - but the most common type of QA does almost
> nothing to find problems that will surface only when the application is
> poked with a stick by a sufficiently skilled attacker.
Well, QA has probably suffered a lot. I work with a dirty mind, but testing
by the TMap book rules that out.
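To make the stick-poking point concrete, here is a rough sketch (the names
and the parse_message() wire format are made up, not any real product's
interface) of the kind of negative test a by-the-book QA pass tends to skip -
throw random bytes at the parser and see whether it rejects them cleanly or
keels over:

# Purely hypothetical sketch - parse_message() stands in for whatever
# handles some proprietary wire format; it is not a real API.
import random

def parse_message(data: bytes) -> dict:
    # Toy parser: expects b"LEN:<n>;<payload>" and trusts the length field.
    header, _, payload = data.partition(b";")
    length = int(header.split(b":", 1)[1])   # chokes on anything unexpected
    return {"length": length, "payload": payload[:length]}

def poke_with_a_stick(rounds: int = 1000) -> None:
    random.seed(1)
    for _ in range(rounds):
        blob = bytes(random.randrange(256) for _ in range(random.randrange(64)))
        try:
            parse_message(blob)
        except (ValueError, IndexError):
            pass   # clean rejection: the behaviour QA would accept
        except Exception as exc:
            # anything else (crash, corrupted state, and so on) is exactly
            # what the attacker with the stick is looking for
            print("unexpected failure on", repr(blob), "->", repr(exc))

if __name__ == "__main__":
    poke_with_a_stick()

Scripted test plans rarely include this sort of junk input, which is why the
basic reliability looks fine right up until somebody hostile shows up.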
> Old school development
> and quality assurance practices, and developers with mindsets locked on
> network security as it used to be in the late '80s or so, are far more
> prevalent in these environments. And it really really shows.
Maybe where you work. The last three years in auditing gave me a lot of
Smartie experiences - hard on the outside, gooey on the inside.
> The relatively low number of vulnerabilities found in those products can
> be attributed to a couple of basic factors:
>
> 1) Average Joe Hacker does not have access to prohibitively expensive
> or highly specialized systems used in high-profile corporations.
> He does have his Windows and Linux partition, though, maybe even
> a Solaris box somewhere, and can sometimes get ahold of Oracle.
> Enterprise applications for VMS or OS/400, hardly so. This holds true
> both for amateur researchers, and for many "vulnerability research"
> shops, too - they simply do not have the budget (or incentive) to
> do it.
Budget or incentive? Well, if the shops don't have the incentive, they are
probably still fumbling to find the real customers.
> 2) Joseph Hacker who happens to be working in a corporation that has such
> a platform is usually limited in how far he can experiment with it
> while playing it safe, especially if it has been a production system
> since forever, and creating a dedicated testbed with appropriate data feeds
> would be overly complex or time-consuming.
Yep, same here.
>
> 3) Even if Joseph finds a flaw, he is expected to work with the vendor
> to protect his company's assets, instead of disclosing a problem
> (otherwise, a swift retaliation from both the vendor and his
> now ex-employer would ensue). He does not have the freedom
> Joe enjoys.
Grumble - spot on again.
> Moreover, sometimes vendors are extremely non-cooperative, and there
> is simply no alternative platform that could be adopted as a replacement
> without major transition expenses and problems.
Usually they are the same vendors you see in the big shops. Let's start some
IBM bashing here. Uh nooooo, they went Open Source, AND they are opposing
Bill, so they must be good...
>
> 4) The public interest in this type of vulnerability is marginal.
> Although some solutions may be popular in corporations, the systems
> usually do not face the Internet, and are seldom mentioned in the
> media. As such, there is very little incentive to disclose this
> type of stuff, as only a couple of folks are going to realize
> what you are talking about to start with.
Well, with BEA and the like, they are facing the Internet. This has yet to
sink in.
But what is the public interest in stuff like Perl or PHP scripts? Who is
our audience? Are we geeks disclosing to other geeks?