1996-10-13 - Re: exporting signatures only/CAPI (was Re: Why not PGP?)

Header Data

From: Adam Back <aba@dcs.ex.ac.uk>
To: azur@netcom.com
Message Hash: eb0124d2328a01345017c28cd4ee0019c587bbaf8fb50ab904513ffbde6889e3
Message ID: <199610122109.WAA01149@server.test.net>
Reply To: <v02130502ae850fb9a80c@[10.0.2.15]>
UTC Datetime: 1996-10-13 10:13:32 UTC
Raw Date: Sun, 13 Oct 1996 03:13:32 -0700 (PDT)

Raw message

From: Adam Back <aba@dcs.ex.ac.uk>
Date: Sun, 13 Oct 1996 03:13:32 -0700 (PDT)
To: azur@netcom.com
Subject: Re: exporting signatures only/CAPI (was Re: Why not PGP?)
In-Reply-To: <v02130502ae850fb9a80c@[10.0.2.15]>
Message-ID: <199610122109.WAA01149@server.test.net>
MIME-Version: 1.0
Content-Type: text/plain



Steve Schear <azur@netcom.com> writes:
> >The problem however, is finding a non-US site to hold the hot potato
> >once it has been exported.  For example 128 bit Netscape beta was
> >exported a while ago.  I don't see it on any non-US sites.  This is
> >due to Netscape's licensing requirements, you need a license to be a
> >netscape distribution site, the license doesn't include the right to
> >mirror non-exportable versions on non-US sites.
> 
> That's one good application for remailers, and alt.warez newsgroups.

I don't know of any advertised files-by-email services using nym
servers, where both the file request and the files themselves are
sent via remailers.

The problem with this currently is that the nym servers couldn't
stand up to the scrutiny if the SPA or whoever got interested.  A
message flood attack on the nym would reveal the service's host.
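
To make the flood attack concrete, here is a minimal sketch (perl) of
the attacker's correlation step.  It assumes the attacker can watch
per-user traffic volumes coming out of the remailer net; the "user
bytes" per-line log format on stdin is made up for illustration:

  # flood the nym with messages, then tally how much each user gets
  # delivered out of the remailer net; the service's host should
  # float to the top of the list
  my %total;
  while (<STDIN>) {
      my ($user, $bytes) = split;
      $total{$user} += $bytes;
  }
  for my $user (sort { $total{$b} <=> $total{$a} } keys %total) {
      print "$user $total{$user}\n";
  }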

The BlackNet architecture solves this problem by posting requests
encrypted with the service's key to a newsgroup, but USENET newsgroup
distribution time is slow (*), and people are spoilt these days by the
WWW and expect results now, not days later.
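
A minimal sketch of the client side (perl), using pgp 2.6's filter
mode; the "file-service" user id, the request format and the file
name are all made up for illustration:

  # encrypt a request to the service's public key; once posted to the
  # newsgroup the service watches, only the service can read it
  my $request = "GET netscape-128bit.tar.gz\n"
              . "REPLY-BLOCK: <your remailer reply block here>\n";
  open(PGP, "| pgp -feat file-service > request.asc")
      or die "can't run pgp: $!";
  print PGP $request;
  close(PGP);
  # request.asc now holds the armoured request; post it to the
  # newsgroup (eg via a mail2news gateway)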

The requested file can be posted via Mixmaster.  You would want to
use a different, random chain of remailers each time.  A reverse
message flood could also reveal the host, as anyone can request lots
of copies and the service will blindly serve the files.  (If someone
wants to discover the service's host, they send thousands of
requests, then sit back and watch which user sends the most data into
the remailer net.)
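
A minimal sketch (perl) of picking a fresh random chain for each
reply; the remailer names are placeholders, you would use whichever
remailers are currently reliable:

  # pick 3 distinct remailers at random to form this reply's chain
  my @pool  = qw(remailer1 remailer2 remailer3 remailer4 remailer5);
  my @chain;
  for (1 .. 3) {
      push @chain, splice(@pool, int(rand(@pool)), 1);
  }
  print join(",", @chain), "\n";   # use this as the chain for the reply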

To combat this the service could impose a limit on the number of
copies it would serve per day.  This allows a denial of service
attack: if someone wants to stop anyone else getting a copy, they
just saturate the service.  Still, it's an improvement over no limit.
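
A minimal sketch (perl) of the service-side limit; the quota, the
state file name and the reset-per-calendar-day policy are arbitrary
choices:

  my $limit = 20;                       # copies served per day
  my ($day, $count) = ("", 0);
  if (open(STATE, "<copies.count")) {   # load today's state if any
      my $line = <STATE>;
      ($day, $count) = split(' ', $line) if defined $line;
      close(STATE);
  }
  my @t = localtime;
  my $today = sprintf("%04d%02d%02d", $t[5] + 1900, $t[4] + 1, $t[3]);
  ($day, $count) = ($today, 0) if $day ne $today;
  exit 0 if $count >= $limit;           # over quota, silently drop request
  $count++;
  open(STATE, ">copies.count") or die "can't save state: $!";
  print STATE "$day $count\n";
  close(STATE);
  # ... go on and mail the file out through the remailer chain ...

A per-requester limit would be nicer, but the requesters are
anonymous by design, so a global cap is about all the service can do.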

Of course Ross Anderson's `eternity service' provides the
general-case solution for distributing such data.  It is complex to
implement well, though.

Adam
--
#!/bin/perl -sp0777i<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<j]dsj
$/=unpack('H*',$_);$_=`echo 16dio\U$k"SK$/SM$n\EsN0p[lN*1
lK[d2%Sa2/d0$^Ixp"|dc`;s/\W//g;$_=pack('H*',/((..)*)$/)




