Date: Wed, 03 Jan 2007 18:04:19 +0100
From: Anders B Jansson <hdw@...listi.se>
To: full-disclosure@...ts.grok.org.uk
Subject: Re: Perforce client: security hole by design

Before I begin to trash: I don't reject any of the findings; mostly I'll argue that it's a matter of perspective.

Ben Bucksch wrote:
> = Abstract =
> 
> The Perforce client has a huge gaping security hole by design. It 
> totally trusts the Perforce server and does whatever the server tells 
> it, writing arbitrary files.

I'd say that it's a design decision; I'm not sure that it's a design flaw.
It's all down to what you try to protect.
From a corporate security standpoint the important part is the source, not someone's workstation.
So if someone has write access to the server the damage is done; the aftermath is just collateral damage.
> 
> = Disclaimer =
> 
> This is so terribly obvious that I'd be surprised that this is news, but 
> I couldn't find anything. Or I'm missing something.

You haven't missed anything, your observation is totally correct.

> 
> = Problem =
> 
> The Perforce server stores a "client config", which contains the local 
> pathnames on the *client* machine (the machine fetching source). Of 
> course, that information on the server can change any time. The problem 
> is: the Perforce client adheres to it without a second thought. That 
> means the p4 server can tell the p4 client to overwrite my ~/.bashrc, 
> and *it will just do it*.
> 
> In fact, the client cannot even do "p4 help" on its own, even that comes 
> from the server. Apparently, there is a very fundamental design problem 
> of overly relying on the server and not checking its input; there are 
> probably more bugs of this kind. I am completely new to Perforce, so the 
> "p4 sync" problem described above may well not be the only one.

The whole point is that whoever controls the source archive also controls the setup.
And having worked with it I actually find the approach useful.
I don't have to bother about which workstation I'm using, since all files and configs are at the server.
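For illustration, the attack Ben describes needs nothing beyond the client spec itself. The following is a hypothetical sketch (the client name and depot paths are made up): a server-stored Root and View mapping that would direct a routine "p4 sync" to write a depot file over the user's ~/.bashrc:

```
Client: evil-ws
Owner:  victim
Root:   /home/victim
View:
    //depot/payload/bashrc //evil-ws/.bashrc
```

Since the spec lives on the server and the client follows it verbatim, the user never sees anything unusual before the file is written.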

> 
> Critical. The server has full access to *all* files that *any* of its 
> users has.
> 
> "We can trust the server" is not an appropriate answer:
Actually, it is.

> 
>     * I am a contractor and have access to many companies' sources, and
>       I do *not* allow any company I work for to have full access to all
>       files on my computer, including the source of the all other
>       companies I work for and even personal files.
If anyone at the company I work for allowed a contractor to access our source from a computer that we don't have absolute control over, that person would be fired instantly and quite likely sued.

Actually, connecting any device not 100% controlled by the company to a company network is strictly forbidden; doing so would be regarded as intentional sabotage.

(Unless it's a marked and configured guest network.)

Read your own statement once more and ponder upon what it means.
Do you take the full legal responsibility for anything that would happen to the files on your computer, for all the companies?
  
>     * Also, there are many ways to fool DNS, so that your client goes to
>       another, hostile server.
Normally, access to a source repository is local to a company's network.
Access from other networks should never happen unless additional security is added, such as SSH tunneling or VPNs.

>     * And, lastly, a server is not 100% bulletproof either.
Nope, but if the server is broken into the cat is already out of the bag.

> = Proposed fix =
> 
> The problem at hand could be easily fixed by letting the client check 
> out only in the current directory (or one specified by the user on the 
> commandline or GUI, preferences stored locally), no matter what the 
> server says. It may put files anywhere underneath that directory, but 
> never higher or otherwise outside. It must never adhere to absolute 
> paths from the server. This does require some changes to how client 
> specs work, though.
This would require the client to hold its own config, and frankly, the client isn't trusted to do so.
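The containment check Ben proposes is simple to sketch, whatever one thinks of it. Below is a minimal, hypothetical Python illustration (not Perforce code, and `safe_join` is a made-up name): the client would resolve every server-supplied path against a locally configured workspace root and refuse anything absolute or anything that climbs out of that root:

```python
import os

def safe_join(root, server_path):
    """Resolve a server-supplied path strictly inside the workspace root.

    Rejects absolute paths outright, and any relative path that
    escapes the root via '..' components.
    """
    root = os.path.abspath(root)
    if os.path.isabs(server_path):
        raise ValueError("absolute path from server rejected: " + server_path)
    candidate = os.path.normpath(os.path.join(root, server_path))
    # normpath collapses '..' components, so an escape attempt shows up
    # as a result that no longer sits under the root
    if candidate != root and not candidate.startswith(root + os.sep):
        raise ValueError("path escapes workspace root: " + server_path)
    return candidate
```

With this, a mapping like "../.bashrc" or "/home/victim/.bashrc" from the server would be refused instead of written.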
> 
> But to believe that this would fix the client would be naive. The nature 
> of the bug, that this is a design problem, and a terribly obvious one at 
> that, points to a very serious attitude problem, that there's no 
> consideration for security at all (when it comes to client vs. server). 
Actually, it isn't; it's all down to a matter of perspective.

With my background in more or less fascistic corporate security, I'd say that your statement that you keep source belonging to, or even just related to, several clients on your own computer sends shivers down the spine of any corporate security auditor.

> This usually reflects in many places in the design and code and is often 
> very hard or impossible to remove, because this often results in 
> hundreds or thousands of security holes. I've seen code with critical 
> security holes on every third line, for similar reasons. Thus, the only 
> way that Perforce could reassure the security of the client vs. server 
> would be to make the client source open for review (preferably as Open 
> Source) and make the protocol available for everybody to implement their 
> own clients.
Now, here, for once, we agree 100%.

Anyone with actual responsibility for the security of any implementation would much rather have the source open for audit by as many people as possible.

The problem comes with accountability.
Take software X, open source, hacked, audited and used by many.
And software Y, source hidden in some secret location, hacked and possibly audited by a few, used by many.

The risk that there would be a grave security hole in software X is slim, it would be discovered and fixed very fast.
The risk that there would be a grave security hole in software Y is high, and it can remain there for ages.

The problem for anyone with the power to select which software to install (let's say the IT director) is that there's most likely some form of problem with either of them.
An IT director choosing X would have to say "I was hoping that the community would have caught this"; an IT director choosing Y would say "I chose a high-profile professional company with a good security track record".

And you can bet your ass (which most IT directors try to avoid) that the X dude would be looking for a new job, while the Y dude would get away with it, since "he did the best that could be expected".

It's the same mechanism that allowed IBM to sell PCs twice as expensive (and quite sucky) as any clones, because "no one has ever been fired for choosing IBM".

In a small organisation where the IT director very well might be the owner, this is no problem, but in larger companies it sure is.


> Ben Bucksch
-- 
// hdw
In another life, IT security peasant at some large corporation.
Oh, and no, I've never worked for or received anything from Perforce (not even a lousy t-shirt, cheapskates...)
But I've used the software in question for many years, both in its intended professional setting and in smaller private ones.

_______________________________________________
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/
