Date: Thu, 23 Dec 2004 13:31:09 +0100
From: Anders Henke <anders@...lund.de>
To: ycw1bh302@...akemail.com
Cc: bugtraq@...urityfocus.com
Subject: Re: phpBB Worm


On Dec 22nd 2004, ycw1bh302@...akemail.com wrote:
> 1.  Why has the worm been as effective on Windows servers as on *nix servers?  At the very least, shouldn't the difference in file and directory naming cause a problem?  I looked at the decoded Perl script, but I'm not a Perl expert, so I couldn't understand all of it.  And what about the difference in file permissions?

Perl provides cross-platform functions for things such as file access,
so there is usually little difference between running a well-written
Perl script on Unix and running it on Windows, apart from the first
line (usually '#!c:\perl\perl.exe -w' on Windows and '#! /usr/bin/perl -w'
on Unix).

However, most web servers on Windows other than Apache run any
.pl script with the locally installed Perl interpreter, chosen by
file extension, and ignore the bang-line entirely.

The documentation in 'perldoc perlport' takes a closer look at the
few differences that matter when writing cross-platform Perl scripts.
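
As a small sketch of that portability (hypothetical file name, not
the worm's actual code), the following Perl runs unchanged on Unix
and Windows, because File::Spec hides the path-separator details:

  use strict;
  use warnings;
  use File::Spec;

  # Build the path portably instead of hard-coding '/' or '\'.
  my $target = File::Spec->catfile('htdocs', 'index.html');

  # open(), print() and close() behave the same on both platforms.
  open(my $fh, '>', $target) or die "cannot open $target: $!";
  print $fh "new content\n";
  close($fh);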

> 2.  More importantly, why wasn't the worm's destructive ability limited by file permissions, especially on *nix servers?  If, for example, an HTML file on the server was uploaded by user bob, and has permissions of 755, how can the Perl script delete that file?  Shouldn't the Perl script be created with the Perl process's permissions, which was invoked by PHP, which should have the Web server's permissions, which should be, at least on most *nix servers, the nobody user?

On shared servers at ISPs that care about security, user CGIs run
through the suexec mechanism, so that each customer's scripts execute
under that customer's own user ID.
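
A minimal sketch of such a virtual host (hypothetical user, group
and paths; assumes Apache 2 built with mod_suexec):

  <VirtualHost *:80>
      ServerName www.example.com
      DocumentRoot /home/bob/htdocs
      # Run this vhost's CGIs as user 'bob', group 'customers'.
      SuexecUserGroup bob customers
      ScriptAlias /cgi-bin/ /home/bob/cgi-bin/
  </VirtualHost>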

The downside of using suexec is that PHP as a CGI lacks a small
number of special features some people believe to be essential, and
some people write their code in a way that makes running it under
CGI PHP close to impossible. The PHP module also allows one to change
PHP configuration settings via .htaccess; CGI PHP ignores such
settings, which can severely affect the way a PHP-written application
works (or doesn't work).
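
For example, a typical .htaccess snippet of that kind looks like the
following (hypothetical values); mod_php honours it, while CGI PHP
never sees it:

  # Honoured by mod_php only; CGI PHP ignores these directives.
  php_flag  register_globals     on
  php_value upload_max_filesize  8M
  php_value include_path         ".:/home/bob/php-lib"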

> This is a big issue on shared servers, or virtual hosts, whatever you want to call them.  Our site is on a shared server, and our site does not even run phpBB, but most of our HTML files were replaced with the worm's content.  Obviously, then, another site on the server must have an old version of phpBB.  But why could the worm, coming in through another site, modify files created by other users?  Even if the worm's script ran as the owner of the vulnerable viewtopic.php file, how could it then modify non-world-writable files created by other users?

Right - if everyone were using e.g. suexec, this would be the case.
Without suexec, every PHP script on the server runs as the same
web-server user (e.g. 'nobody'), so a worm coming in through one
customer's vulnerable phpBB can touch anything that user may write
to, no matter which customer owns the site.
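
There is also a detail of Unix semantics at work here (a small sketch
with hypothetical paths): unlink() checks the permissions of the
directory, not of the file, so a mode-0755 file in a directory that
is writable for the web-server user can still be removed and
replaced:

  use strict;
  use warnings;

  # Mode 0755, owned by bob - running as 'nobody', we cannot
  # write into the file itself ...
  my $file = '/home/bob/htdocs/index.html';

  # ... but if the directory is writable for us (e.g. mode 0777,
  # as often required for PHP uploads), we may unlink the file
  # and create a new one in its place.
  unlink($file) or die "unlink failed: $!";
  open(my $fh, '>', $file) or die "open failed: $!";
  print $fh "replaced content\n";
  close($fh);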

As a web host, you've got to choose between running PHP as a CGI and
running PHP as a module.

Your 'power' users are calling for the module, and the admin
maintaining an already overloaded server also calls for the module
(it relieves the web server from forking a separate process for every
PHP script); only the security-minded reject both mod_perl and
mod_php and favour 'true' CGIs via suexec.

If your scripts support the FastCGI extension, one might use
mod_fastcgi with suexec support; however, this means setting up three
pieces of software (FastCGI, suexec, PHP) and making them work
together, instead of the often-recommended 'add mod_php' one-liner.
As a result, you spend a lot of work on a secure system, your users
still call for mod_php, and if any part of the setup breaks, the
whole system is unusable.
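
A rough sketch of the Apache side of such a setup (hypothetical
paths and wrapper script; directive details vary between mod_fastcgi
versions, so treat this as an outline rather than a recipe):

  LoadModule fastcgi_module modules/mod_fastcgi.so

  # Let mod_fastcgi start its dynamic servers through suexec.
  FastCgiWrapper On

  AddHandler fastcgi-script .fcgi

  # /cgi-bin/php.fcgi is a per-user wrapper that exec's the PHP
  # CGI binary; suexec runs it under the customer's user ID.
  Action php-script /cgi-bin/php.fcgi
  AddHandler php-script .php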

> I have long been concerned with the security of PHP scripts, especially on shared servers.  Since PHP almost always runs as an Apache module, and Apache usually runs as nobody, one must make files and directories world-writable for PHP scripts to be able to write to them.  But that means that any process on the server, including anyone's PHP script, can modify the files.


Yes, you've got the point.

Apache 2 has the ability to run each VirtualHost within a different
user context (the perchild MPM).
- According to the Apache documentation, this module is
  non-functional, not yet finished, and development is not currently
  active.
- PHP is certainly one of the most interesting modules for this
  feature; however, the last time I looked, PHP of all things didn't
  support it, and Apache required at least one running process per
  virtual host (which in turn would render servers hosting thousands
  of sites unusable).
- Even today, the PHP documentation warns against using Apache 2.0
  with PHP in production environments.

From a security standpoint ('secure' from the administrator's point
of view), running PHP as a CGI is currently the only way to run it
securely.

Regards,

Anders
-- 
Schlund + Partner AG              Security and System Administration
Brauerstrasse 48                  v://49.721.91374.50
D-76135 Karlsruhe                 f://49.721.91374.225

