Message-ID: <20040120130814.GN4663@schlund.de>
Date: Tue, 20 Jan 2004 14:08:14 +0100
From: Anders Henke <anders@...lund.de>
To: Gadi Evron <ge@...uxbox.org>
Cc: bugtraq@...urityfocus.com, full-disclosure@...ts.netsys.com
Subject: Re: More info on blocking the Bagle worm
On January 18th 2004, Gadi Evron wrote:
> From MooSoft (Daniel):
>
> Here is the URL list, all 404 last I checked:
Important note: while the script isn't found on any of the listed sites, the
affected web servers still log the requests; by analyzing the server logs,
any site owner can still obtain a list of infected hosts. If one of those
site owners is the Beagle author, he or she still gains enough information to
contact infected machines.
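For the admin of one of the listed sites, the same log data can also be
collected deliberately, e.g. to notify infected users. A minimal sketch using
Apache's conditional logging (1.3.5 or later; the environment variable name
and log file path are my own choices):

  # Tag requests coming from the worm by its UserAgent (mod_setenvif)
  SetEnvIf User-Agent "beagle_beagle" beagle_req
  # Write only the client IP of tagged requests to a separate file
  CustomLog logs/beagle_hosts.log "%h" env=beagle_req

A 'sort -u' over that file then yields the list of infected IPs.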
A few notes on the impact of Beagle from an ISP's point of view: our company
hosts 10 of the 35 sites listed at
http://vil.nai.com/vil/content/v_100965.htm (we host 3.5M domains, and our
largest competitor hosts 9 of the listed sites, so don't wonder about or
misinterpret the seemingly "high" percentage).
From Monday on, every site hosted here and listed at NAI has received about
35 requests per second for the non-existing scripts, resulting in about 3M
additional requests per site and day from more than 108k different IPs. At
our site, a 404 is also somewhat "expensive" (it is usually handled via CGI),
so we're currently answering requests for the affected sites' non-existing
1.php with a 302 to "Location: http://localhost/". www.sttngdata.de seems to
be so flooded that they changed their DNS to point to 127.0.0.1. A few other
hosts (especially those listed by IP or under .ru) seem to be completely
unreachable by now.
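The 302 redirect mentioned above needs only a single mod_alias directive; a
minimal sketch ("temp" makes Apache answer with status 302):

  # Answer the worm's requests with a cheap 302 instead of the
  # CGI-generated 404 error document
  Redirect temp /1.php http://localhost/

This avoids spawning the error-document CGI for every single hit.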
So from an ISP's point of view, Beagle is also some kind of DDoS.
Following are a few log lines (requested site and source IP removed):
x.x.x.x - - [20/Jan/2004:13:09:10 +0100] "GET /1.php?p=6777&id=47432653 HTTP/1.1" 302 231 "-" "beagle_beagle"
x.x.x.x - - [20/Jan/2004:13:09:10 +0100] "GET /1.php?p=6777&id=65275748 HTTP/1.1" 302 231 "-" "beagle_beagle"
If you wish to detect those requests at the proxy level or to block them
at some other listed site, there are a few options as well as things to
take care of:
-All requests so far use the unique UserAgent "beagle_beagle" (note: not
 "bagle"). At least from my point of view, denying server usage or proxy
 access to this UserAgent shouldn't do any harm at all (see the sketch
 after this list).
-All requests call "GET /1.php?p=6777&id=${some_number}", so =don't=
 simply use something like 'RewriteRule ^/1.php$ - [F]' to reject them.
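If you'd rather reject than redirect, keying on the UserAgent instead of the
bare path avoids hitting any legitimate 1.php a customer might run. A minimal
sketch with the standard mod_setenvif/mod_access directives (the variable
name is my own choice):

  # Tag the worm by its unique UserAgent
  SetEnvIf User-Agent "beagle_beagle" beagle_req
  # Refuse tagged requests everywhere
  <Location />
      Order Allow,Deny
      Allow from all
      Deny from env=beagle_req
  </Location>

Keep in mind that the resulting 403 may still trigger a CGI-generated error
document at sites configured like ours, so the 302 redirect shown above
remains the cheaper option.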
Regards,
Anders
--
Schlund + Partner AG Security
Brauerstrasse 48 v://49.721.91374.50
D-76135 Karlsruhe f://49.721.91374.225
_______________________________________________
Full-Disclosure - We believe in it.
Charter: http://lists.netsys.com/full-disclosure-charter.html