Message-ID: <1a3126150707060920m680682dele060de56116b7762@mail.gmail.com>
Date: Fri, 6 Jul 2007 12:20:36 -0400
From: "Rob McCauley" <robm.fd@...il.com>
To: full-disclosure@...ts.grok.org.uk
Subject: Re: Does this exist ?
Ya know, I don't think he does get that part yet.
This scheme is essentially how data compression already works. Not in
gigantic swaths of bits, as being proposed here, but on a smallish scale,
where a few bits stand in for a larger set of bits. Huffman coding is a
basic example.
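To make the Huffman point concrete, here's a minimal sketch (not from the original mail, just an illustration) of building a Huffman code in Python: frequent symbols end up with short bit strings, rare ones with long bit strings.

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code table for `text`.

    Frequent symbols get shorter bit strings. Each heap entry is
    (frequency, tiebreaker, partial code table).
    """
    freq = Counter(text)
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        # Merge the two least-frequent subtrees: prefix one side's
        # codes with '0' and the other side's with '1'.
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, i, merged))
        i += 1
    return heap[0][2]

code = huffman_code("aaaabbc")
# 'a' occurs most often, so it gets the shortest code (1 bit here);
# 'b' and 'c' each get 2 bits.
```

The savings come entirely from skewed symbol frequencies; that is exactly why the trick cannot be scaled up to replace arbitrary large blocks with tiny ids.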
The infeasibility of this idea is all about the data size. As someone
already pointed out, 2^4000 is not 16,000,000 (that's 4000^2). 2^4000 is
large enough to just call it infinite and be done with it.
For comparison, there are roughly 10^80 (about 2^266) atoms in the
observable universe. The hardware you'd need to implement a database of
2^4000 entries would require more matter than exists. Period.
This idea is only interesting if it works at the scale proposed. It
doesn't. On a smaller scale, this is how data compression is already done.
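The quoted objection below ("the id would be equivalent to the contents") is just the pigeonhole principle, and it can be counted out in a few lines. A small sketch of mine, using 12-bit payloads for illustration:

```python
# Pigeonhole check: can every 12-bit payload get a strictly shorter id?
n = 12
payloads = 2 ** n                             # 4096 distinct 12-bit strings
shorter_ids = sum(2 ** k for k in range(n))   # all strings of length 0..11
print(payloads, shorter_ids)                  # 4096 vs 4095
```

There is always one payload too many, so any scheme that shortens some inputs must lengthen others; at 4000 bits the shortfall is astronomical, not off by one.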
Rob
>
> On Fri, Jul 06, 2007 at 01:52:55 -0500, Dan Becker wrote:
> > So we generate a packet using the idpacket field of a database to
> > describe which packets should be assembled in which order then send
> > it. 1 packet to send 500.
>
> Do you realize the id of the packet(s) would be equivalent to the contents
> of the package(s)?
>
> See also
> http://en.wikipedia.org/wiki/Information_entropy#Entropy_as_information_content
>
>
> _______________________________________________
> Full-Disclosure - We believe in it.
> Charter: http://lists.grok.org.uk/full-disclosure-charter.html
> Hosted and sponsored by Secunia - http://secunia.com/
>