Message-ID: <000001c5627a$f89343a0$0201a8c0@Furion>
Date: Fri May 27 12:54:22 2005
From: der at cirt.dk (Dennis Panduro Rand)
Subject: WebRoot version 1.6

                        CIRT.DK WebRoot Security Scanner
                          (c)2005 Dennis Rand - CIRT.DK
                               http://www.cirt.dk/


DESCRiPTiON:
    Have you ever audited a system where files are stored on a web server
    and accessed without authentication, directly by an application that
    knows each file URL?

    Have you tried a number of spider tools, only to find that, because they
    follow links, they don't turn up anything?

    CIRT.DK WebRoot is a webserver auditing tool that tries every
    combination of characters (incremental mode) or a list of words from a
    file against the webserver.

    In short:
    A brute forcing tool to discover directories, files or parameters in the
    URL of a webserver.
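
    To picture the basic idea, here is a minimal Perl sketch of that kind of
    brute force loop (an illustration using LWP, not WebRoot's own code; the
    host, URL template, match string and candidate list are made-up example
    values):

    use strict;
    use warnings;
    use LWP::UserAgent;

    my $ua       = LWP::UserAgent->new(timeout => 30);
    my $template = 'http://127.0.0.1:80/admin/<BRUTE>';
    my $match    = '200 OK';

    # Substitute each candidate into the <BRUTE> placeholder and report
    # any response whose status line matches, similar to the -match option.
    for my $candidate (qw(logs backup test)) {
        (my $url = $template) =~ s/<BRUTE>/$candidate/;
        my $response = $ua->get($url);
        print "Hit: $url\n" if $response->status_line =~ /\Q$match\E/;
    }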


iNSTALLATiON:
    perl -MCPAN -e shell
    cpan > install Bundle::LWP
    cpan > install IO::Socket
    cpan > install Getopt::Long
    cpan > install Algorithm::GenerateSequence
    cpan > install Net::SSLeay::Handle
    cpan > install Time::HiRes
    cpan > quit
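
    As a quick sanity check (not part of the original instructions), you can
    verify that the modules load with a one-liner such as:
    perl -MLWP -MIO::Socket -MGetopt::Long -MAlgorithm::GenerateSequence -MNet::SSLeay::Handle -MTime::HiRes -e 'print "Modules OK\n"'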


    How to clean a wordlist before use, removing duplicate entries:
    cat list.txt | sort | uniq > Temp.txt
    mv -f Temp.txt list.txt
 
PARAMETER DESCRiPTiON:
   Basic settings
    -host                 Set the host IP or hostname to scan.
    -port                 Set the port where the webserver is located.
    -timeout              Set a maximum timeout for each try.
    -delay                Set a delay between each attempt (Microseconds -
                          1 second = 1000000 Microseconds).

   Scanning options
    -incremental          Set if the scan has to bruteforce incrementally
                          (see the sketch after this parameter list)
                          use with "lowercase" (a-z)
                                   "uppercase" (A-Z)
                                   "integer"   (0-9)
                                   "special"   (!,#,$,?,/,\,=)
                                   "all"       (All of the above)

    -minimum              Set the min chars for the incremental scan
    -maximum              Set the max chars for the incremental scan
    -wordlist             Set if a wordlist is supplied
    -url                  Set the URL to bruteforce.
                          Use <BRUTE> where you want the bruteforcing

   Advanced scanning options
    -diff                 If the result has to differ from the response
                          (default). Use with "404 File not found" and it
                          will find anything NOT matching in the response.
    -match                If the result has to match the response.
                          Use with "200 OK" and it will find anything
                          matching.
    -command              Set the HTTP command if not GET.
                          Remember you can also use <BRUTE> in this field.
    -useragent            Enter your own useragent
    -cookie               Enter a cookie value
    -http_version         If you want to use anything other than HTTP/1.1
    -recursive            Make WebRoot scan recursively when scanning for
                          directories
    -referer              If you want to set a Referer in the header of the
                          HTTP request
    -override             Override the False Positive Check - NOT A GOOD
                          IDEA
    -resume               Resume a previous scan, usage:
                          "-resume WebRoot-xxx-xxx.resume"
    
   Report options
    -saveas               Save the report under the given name
    -txtlog               Save report in pure text format
    -rawlog               Save report in pure text, including only the
                          specific hit
    -reportlines          Number of lines of webserver output to put into
                          the report (HTML only)
    
   Visual options
    -verbose              Show findings on the screen
    -debug                Show some of the output on screen, so we can
                          search for specific elements
    -debugline            Decide how many lines of output to show when
                          debugging - Default: 15
    -debugdelay           Delay between each request made in debug mode -
                          Default: 3 seconds
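
    To illustrate what incremental mode produces, here is a small plain-Perl
    sketch of the kind of candidate generation that -incremental lowercase
    with -minimum/-maximum implies (an illustration only, not WebRoot's
    internals - the installation list includes Algorithm::GenerateSequence,
    presumably for this purpose; the character set and lengths below are
    example values):

    use strict;
    use warnings;

    my @set     = ('a' .. 'z');
    my $minimum = 1;
    my $maximum = 2;

    # Generate every lowercase candidate of length $minimum..$maximum,
    # as "-incremental lowercase -minimum 1 -maximum 2" would try.
    for my $length ($minimum .. $maximum) {
        my @indices = (0) x $length;          # odometer over the character set
        while (1) {
            print join('', @set[@indices]), "\n";
            my $pos = $length - 1;
            while ($pos >= 0 && ++$indices[$pos] == @set) {
                $indices[$pos--] = 0;         # carry into the next position
            }
            last if $pos < 0;                 # odometer wrapped around: done
        }
    }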


EXAMPLES OF HOW TO USE WEBROOT:
    Scan localhost port 80, searching for a "200 OK" response at the URL
    http://127.0.0.1:80/admin/<BRUTE>, incremental lowercase, 1 to 3 characters:
    WebRoot.pl -host 127.0.0.1 -port 80 -match "200 OK" -url "/admin/<BRUTE>"
        -incremental lowercase -minimum 1 -maximum 3
    
Version descriptions
    Version 1.0
       I'm back from scratch; this time I'm going to make it a bit better,
       but have patience.
       For now results are only written to the screen.

    Version 1.1
       We now have support for saving the scan into an HTML file.
       Decide how many lines of output from the server go into the report.

    Version 1.2
       More information added to the start of the report.
       WebRoot now also supports scanning over an HTTPS connection.
       The response in the report now shows the HTML.

    Version 1.3
       Fixed a bug in the -diff and -match options.

    Version 1.4
       Added the possibility to use -txt if you want the report in pure text.
       Added recursive scanning: if you use -recursive, it will bruteforce
       deeper to search for more.
       Added more information to the update function on what the new version
       includes.

    Version 1.5
       Added the possibility to set a Referer in the request header, e.g.
       -referer http://127.0.0.1/whatever/qwe.asp
       Added raw logging: pure text and only the word that got the hit, use
       -rawlog.
       Changed the name of the text log: -txt replaced with -txtlog.
       Added a "GUI" to the scanning.
       Added a False Positive Check to the scan to ensure the right result;
       it can be disabled with -override.
       Added -debuglines for deciding how many lines of output to have in
       debug mode.
       Added -debug for scanning in debug mode to also see what is being
       sent and received.
       Added -debugdelay for making a delay between each debug request.
       Added -verbose scanning to see findings on screen as they are
       spotted.

    Version 1.6
       Fixed the issue where neither -diff nor -match is chosen: -diff is
       now used by default.
       Instead of only being able to delay for seconds, it is now possible
       to delay for microseconds.
           1 second = 1000000 microseconds (Time::HiRes)
       Fixed an error in the recursive scan where spaces are removed and
       malformed URLs such as "/", "/ /", " /" or "/ " are handled.
       Added the possibility to resume previous scans: "-resume
       WebRoot-xxx-xxx.resume"
