Message-Id: <1209609075.1271.19.camel@nigel-laptop>
Date:	Thu, 01 May 2008 12:31:15 +1000
From:	Nigel Cunningham <ncunningham@...a.org.au>
To:	Linus Torvalds <torvalds@...ux-foundation.org>
Cc:	"Rafael J. Wysocki" <rjw@...k.pl>, Willy Tarreau <w@....eu>,
	David Miller <davem@...emloft.net>,
	linux-kernel@...r.kernel.org,
	Andrew Morton <akpm@...ux-foundation.org>,
	Jiri Slaby <jirislaby@...il.com>
Subject: Re: Slow DOWN, please!!!

Hi.

On Wed, 2008-04-30 at 18:40 -0700, Linus Torvalds wrote:
> The thing is, the quality of individual patches isn't what matters! What 
> matters is the quality of the end result. And people are going to be a lot 
> more involved in looking at, testing, and working with code that is 
> merged, rather than code that isn't.

No. People generally expect that code that has been merged works, so
they don't look at it unless they're forced to (by a bug or the desire
to make further modifications in that code), and they don't explicitly
seek to test it. They just seek to use it.

When it doesn't work, some of us will go looking for the cause; others
(most?) will simply roll back to whatever they last found to be
reliable.

Out-of-tree code has the same issues.

The only time code really gets looked at and tested is when there's a
problem, or when people explicitly choose to inspect it (in pre-merge
reviews, for example).

So my answer to the "how do we raise quality" question would be that,
when writing the code, we put time and effort into properly analysing
the problem and developing a solution, we put time and effort into
carefully testing that solution, and we put in code that will help the
end user help us debug issues later (without them necessarily needing
to git-bisect). After all, good software isn't the result of random (or
semi-random), unconsidered modifications, but of planning, thought and
attention to detail.
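
To illustrate the kind of debugging aid I mean, here is a rough sketch.
The device, its states and its fields (foo_dev, foo_submit, last_cmd)
are made up for the example, not taken from any real driver; the point
is just that a WARN_ONCE() on a violated precondition puts the relevant
state straight into the user's dmesg, so a bug report can be useful
without a bisect:

#include <linux/bug.h>
#include <linux/kernel.h>

/* Hypothetical device state, purely for illustration. */
enum foo_state { FOO_READY, FOO_BUSY, FOO_ERROR };

struct foo_dev {
	enum foo_state state;
	unsigned int last_cmd;
};

static int foo_submit(struct foo_dev *dev, unsigned int cmd)
{
	/*
	 * If the precondition fails, WARN_ONCE() prints the message and
	 * a stack trace once, so dmesg already records the bad state and
	 * the last command instead of leaving the reporter to bisect.
	 */
	if (WARN_ONCE(dev->state != FOO_READY,
		      "foo: cmd %u submitted in state %d (last cmd %u)\n",
		      cmd, dev->state, dev->last_cmd))
		return -EINVAL;

	dev->last_cmd = cmd;
	return 0;
}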

In other words, I'm arguing that the speed of merging should be
irrelevant. What's relevant is the quality of the work done in the first
place.

If you want better-quality code, penalise the people who get buggy code
merged. Give them a reason to get it into a better state before they
try to merge. Of course, Linus alone can't do that.

Nigel
