Message-ID: <000c01cc6404$b4bb1cb0$1e315610$@gmail.com>
Date: Fri, 26 Aug 2011 18:27:43 +0300
From: SuRGeoNiX <srgn.ml@...glemail.com>
To: <full-disclosure@...ts.grok.org.uk>, <pen-test@...urityfocus.com>,
	<webappsec@...urityfocus.com>
Subject: WebSurgery v0.6 released - Web application testing suite

WebSurgery is a suite of tools for security testing of web applications. It
was designed to help security auditors plan and carry out web application
assessments and exploitation. It currently includes an efficient, fast and
stable WEB Crawler, a File/Dir Bruteforcer, a Fuzzer for advanced
exploitation of known and unusual vulnerabilities (such as SQL injection,
cross-site scripting (XSS), brute force against login forms, identification
of firewall filtering rules and DoS attacks), and a WEB Proxy to analyze,
intercept and manipulate the traffic between your browser and the target
web application.

WEB Crawler

The WEB Crawler was designed to be fast, accurate, stable and fully
configurable, and it uses advanced techniques to extract links from
JavaScript and HTML tags. It works with configurable timing settings
(Timeout, Threading, Max Data Size, Retries) and a number of rule
parameters to prevent infinite loops and pointless scanning (Case
Sensitive, Dir Depth, Process Above/Below, Submit Forms, Fetch
Indexes/Sitemaps, Max Requests per File/Script Parameters). It is also
possible to apply custom headers (user agent, cookies etc.) and
Include/Exclude Filters. The WEB Crawler comes with an embedded File/Dir
Bruteforcer which helps to brute force directly for files/dirs within the
directories found by crawling.
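
To give a feel for how such a crawler works, here is a minimal Python
sketch (not WebSurgery's actual code; the regular expressions, limits and
start URL are illustrative assumptions):

import re
import urllib.request
from collections import deque
from urllib.parse import urljoin, urlparse

TIMEOUT = 10     # per-request timeout in seconds
RETRIES = 2      # retries per URL before giving up
MAX_DEPTH = 3    # "Dir Depth"-style limit to avoid pointless scanning

# Extract links from both HTML attributes and JavaScript string literals.
HTML_LINK = re.compile(r'(?:href|src|action)\s*=\s*["\']([^"\']+)["\']', re.I)
JS_LINK = re.compile(r'["\'](/[^"\'\s<>]+)["\']')

def fetch(url):
    for _ in range(RETRIES + 1):
        try:
            with urllib.request.urlopen(url, timeout=TIMEOUT) as resp:
                return resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue
    return ""

def crawl(start):
    base = urlparse(start).netloc
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        url, depth = queue.popleft()
        body = fetch(url)
        for link in HTML_LINK.findall(body) + JS_LINK.findall(body):
            absolute = urljoin(url, link)
            # Stay on the target host and respect the depth limit.
            if (urlparse(absolute).netloc == base and depth < MAX_DEPTH
                    and absolute not in seen):
                seen.add(absolute)
                queue.append((absolute, depth + 1))
    return seen

for found in sorted(crawl("http://example.com/")):
    print(found)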

WEB Bruteforcer

WEB Bruteforcer is a brute forcer for files and directories within the web
application, which helps to identify its hidden structure. It is also
multi-threaded and fully configurable for timing settings (Timeout,
Threading, Max Data Size, Retries) and rules (Headers, Base Dir, Brute
force Dirs/Files, Recursive, File's Extension, Send GET/HEAD, Follow
Redirects, Process Cookies and List generator configuration).

By default, it brute forces recursively from the root / base dir for both
files and directories. It sends both HEAD and GET requests when needed
(HEAD to identify whether the file/dir exists, then GET to retrieve the
full response).
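
As a rough illustration of the HEAD-then-GET strategy, consider this Python
sketch (again not the tool's code; the target, wordlist and extensions are
made up for the example):

import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET = "http://example.com"          # assumed target
WORDS = ["admin", "backup", "login"]   # stand-in for a real wordlist
EXTS = ["", "/", ".php", ".bak"]       # "File's Extension"-style rules

def probe(path):
    url = f"{TARGET}/{path}"
    # Cheap HEAD request first, just to test whether the path exists ...
    try:
        urllib.request.urlopen(
            urllib.request.Request(url, method="HEAD"), timeout=10)
    except OSError:
        return None
    # ... then a full GET only for hits, to retrieve the whole response.
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return url, len(resp.read())
    except OSError:
        return None

with ThreadPoolExecutor(max_workers=20) as pool:
    for hit in pool.map(probe, (w + e for w in WORDS for e in EXTS)):
        if hit:
            print(*hit)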

WEB Fuzzer

WEB Fuzzer is a more advanced tool that creates a number of requests based
on one initial request. The fuzzer places no restrictions on the requests
it can generate and can be used to exploit known vulnerabilities such as
(blind) SQL injection, as well as in more unusual ways such as identifying
improper input handling, firewall/filtering rules and DoS conditions.
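
The "one initial request, many generated requests" idea can be sketched in
a few lines of Python (the FUZZ marker, payload list and target URL are
assumptions for the example, not WebSurgery's syntax):

import urllib.error
import urllib.request
from urllib.parse import quote

TEMPLATE = "http://example.com/item.php?id=FUZZ"   # assumed initial request
PAYLOADS = ["1", "1'", "1 OR 1=1", "<script>alert(1)</script>"]

for payload in PAYLOADS:
    url = TEMPLATE.replace("FUZZ", quote(payload))
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            status, size = resp.status, len(resp.read())
    except urllib.error.HTTPError as err:
        status, size = err.code, 0
    # Differences in status code or response size across payloads often
    # reveal improper input handling or filtering rules.
    print(f"{status} {size:6d}  {payload!r}")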

WEB Editor

A simple WEB Editor for sending individual requests. It also contains a
HEX Editor for crafting more advanced requests at the byte level.
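
The value of byte-level control can be shown with a small Python sketch
that sends a hand-crafted request over a raw socket (host and request are
illustrative):

import socket

raw = (b"GET /index.php HTTP/1.1\r\n"
       b"Host: example.com\r\n"
       b"Connection: close\r\n\r\n")

with socket.create_connection(("example.com", 80), timeout=10) as sock:
    sock.sendall(raw)   # any byte sequence can be sent verbatim
    reply = b""
    while chunk := sock.recv(4096):
        reply += chunk

print(reply.decode("latin-1")[:500])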

WEB Proxy

WEB Proxy is a proxy server that runs locally and allows you to analyze,
intercept and manipulate HTTP/HTTPS requests coming from your browser or
any other application that supports proxies.
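
A bare-bones sketch of such a local proxy in Python (plain HTTP only, no
CONNECT/HTTPS handling; the listen address and port-80 assumption are
illustrative, and this is not WebSurgery's implementation):

import socket
import threading

LISTEN = ("127.0.0.1", 8080)   # assumed local listen address

def handle(client):
    request = client.recv(65536)
    if request:
        # The request can be inspected or modified here before forwarding.
        print(request.split(b"\r\n", 1)[0].decode("latin-1"))
        # Read the Host header to learn where to forward (port 80 assumed).
        host = "localhost"
        for line in request.split(b"\r\n"):
            if line.lower().startswith(b"host:"):
                host = line.split(b":", 1)[1].strip().split(b":")[0].decode()
                break
        try:
            with socket.create_connection((host, 80), timeout=10) as upstream:
                upstream.sendall(request)
                while chunk := upstream.recv(65536):
                    client.sendall(chunk)
        except OSError:
            pass
    client.close()

server = socket.socket()
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(LISTEN)
server.listen()
while True:
    conn, _ = server.accept()
    threading.Thread(target=handle, args=(conn,), daemon=True).start()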

Download / Documentation / Changes

http://www.surgeonix.com/blog/index.php/archives/117
