Open Source and information security mailing list archives
 
From: dufresne at winternet.com (Ron DuFresne)
Subject: RE: It takes two to tango

On Wed, 31 Jul 2002, Scott, Richard wrote:

	[SNIP]

> [RS] Let's assume that contracts and licensing are not devoid of liability.
> Provided that the security vulnerability is reported to the vendor, the
> vendor should immediately verify the claims and inform all its licensed
> clients.  In most cases, vulnerabilities could be mitigated by certain
> other measures which, while less efficient or reducing business
> functionality, may reduce the risk until a patch is available.  The business would decide if
> the risk is acceptable to continue business or would defer risk by either
> reducing functionality (stopping services etc) or completely stop until a
> patch (in the event the IDS picked something up).  Just because a
> vulnerability is detected in a service one is using does not necessarily
> mean my server has to be placed off line.  However, I would expect a patch
> if I intend to use that feature in the future.
>
> In such cases, businesses are fully aware of the risks of doing business, can
> apply some vague quantitative measure of risk, and understand the risk model.
> If the client was not notified, after the vulnerability was published (not
> the exploit), businesses affected by the security hole, could sue the
> vendor.  The vendor may have chosen not to inform its clients of the
> potential security problem, and thus did not do its due diligence.
>
> I believe this would be a better model of controlling and enabling full
> disclosure.  Thus, the vulnerability owner would notify a vendor, and
> following the guidelines, give 30 days for client notification (assume 30,
> could be anything noted..).  The Vendor must notify clients to take
> precautionary action.
> If vendor refuses to notify clients, and clients discover additional risk,
> and/or potential damage litigation can be a consequence.  [Seems very
> similar to other product warranties et al ?? ...]
>
> <snip>
> IMHO, vendors SHOULD be responsible for security holes.  However,
> before that can be done there needs to be some kind of law put in
> place to protect the researchers who find the holes.  Doesn't need to
> be much, just a blanket law that if the researcher has taken
> reasonable steps to alert the vendor, they cannot be held liable for
> the consequences of releasing the advisory. If that doesn't happen,
> things are going to get messy.
> </snip>
>
> [RS] I must admit that the legal system in this country is not proactive,
> very reactive, and very heavily fraught with strange laws.  The introduction
> of laws and regulations to prevent reverse engineering is just a step toward
> removing full disclosure.  The onus should be placed back onto liability and
> insurance.  Preventing discovery is not the answer.  If Full Disclosure were
> covered by some government classification requiring adequate and
> official steps, liability would be placed on both sides of the vulnerability.  The
> author would be required to follow the steps, informing the vendor and then
> releasing an advisory and then potentially the exploit.  Whilst the vendor
> must be required to notify licensees / clients prior to the advisory and
> then follow up with a patch.
>
> Secondly, just because one person has discovered the flaw doesn't mean
> others do not know about it.  Hence, it is vital that vendors treat
> advisories as high priority issues and must assume that potential criminals
> could use those vulnerabilities.
>
> It doesn't seem much of a stretch for the Office of Homeland Security to
> regard commerce systems as "Infrastructure" and hence bind researchers and
> vendors to an agreement.  The only sticky part is if a vendor fails to take
> note and the advisory and exploits are released.  In such cases the
> department of HLS could be involved at a high level, i.e. where there is
> large-scale potential.
>
> This is just a sketch and there are numerous possible obstacles, but it
> certainly beats the current rogue view of many members who regard FD as a
> terrible thing.
>

Of course, the same should apply to companies that expose their customers
to potential information leakage, as with the recent BestBuy wireless cash
transaction exposures, yes?  After all, it's not just the products vendors
release that are at issue in the security of information, but also how
those products are enabled and used.  Granted, the vendors are extremely
lacking in providing adequate documentation about how to properly secure
their wireless toys and trinkets, and should be taken to task for their
failings there, but a company like BestBuy with INFORMATION SECURITY
staff should know more than the average home user about how to secure
delicate and potentially exploitable information like credit card
information.  Exposure of private information is not all HIPAA related,
for sure.  Does not the exposure of some client information <e.g.
FORD's recent client information fiasco> make the companies obtaining,
using, and improperly storing it also contributors to the extreme fraud
rates that are such a major issue in the credit industry?


Thanks,

Ron DuFresne
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
"Cutting the space budget really restores my faith in humanity.  It
eliminates dreams, goals, and ideals and lets us get straight to the
business of hate, debauchery, and self-annihilation." -- Johnny Hart
	***testing, only testing, and damn good at it too!***

OK, so you're a Ph.D.  Just don't touch anything.



