From: Adam Back <aba@dcs.ex.ac.uk>
Date: Sun, 19 Oct 1997 05:29:21 +0800
To: gbroiles@netbox.com
Subject: Re: Security flaws introduced by "other readers" in CMR
In-Reply-To: <3.0.2.32.19971018025454.03e23124@pop.sirius.com>
Message-ID: <199710181953.UAA00908@server.test.net>
MIME-Version: 1.0
Content-Type: text/plain
Greg Broiles <gbroiles@netbox.com> writes:
> Adam Back writes:
> >You have a dual concern: you are trying to protect against big brother
> >and against little brother.
>
> At the technical level, is there a meaningful difference between the
> brothers?
I think so, yes. I think I can demonstrate situations where there is a
trade-off: you can trade government resistance against little brother
resistance.
PGP Inc's CMR is, I believe, a form of this: they have sensible
recommendations for companies, statements of intent about plaintext
handling, and transparent indications of when an email address is a
company-recoverable one.
But the way they have implemented this functionality could, I believe,
be perverted by governments for uses other than those PGP Inc intends.
I think this means that the resulting system has traded away government
resistance to achieve marginally stronger little brother resistance.
(I might even say no stronger little brother resistance; I think CDR
can achieve the same functionality.)
The best example I can think of is France. Of other governments (US/UK)
people say: it'll never happen here. Tin-pot-dictator scenarios worry
people less because they feel the dictator will do unreasonable things
in any case.
The situation in France is that currently (or until recently) you could
not use encryption at all without a license. The enforcement rate is
low to zero. (Jerome Thorel interviewed the head of SCSSI (the French
NSA equivalent), who said that if you asked for a license for personal
use they would say "no", but that if you didn't bother asking and used
PGP2.x anyway, they typically wouldn't do anything about it.)
Now I understand the French have switched position: you can use
encryption without a license *provided* that it has master key access
for the government.
Let's say, then, that our aim is to design the OpenPGP standard and the
pgp5.5 implementation to be as resistant to use by the French
government as possible.
With the PGP standard as it is, the French government could insist that
people use pgp5.x. pgp5.x provides a reasonably useful framework for
the French government to adapt into a master access system. Because
encryption will then be explicitly allowed, more people are likely to
use it. (The people currently using pgp2.x illegally are, one suspects,
the French-cypherpunk subset of the population.) This means the French
government can start to ramp up enforcement, since there are fewer
objections (you _can_ protect confidentiality for business and privacy
uses: use pgp5.x).
If, on the other hand, pgp5.x were to use only single recipients for
confidentiality, and to base company recovery of encrypted mail folders
on key recovery information stored locally alongside the mailbox, the
system would be less useful to the French government.
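To make the contrast concrete, here is a minimal sketch in Python of
what I mean by local storage recovery as opposed to an extra crypto
recipient on the wire. The toy XOR keystream stands in for real
public-key and symmetric encryption, and names like company_storage_key
are my own illustration, not PGP Inc's packet format:

import os, hashlib

def keystream_xor(key, data):
    # Toy SHA-256 counter-mode keystream; a stand-in for a real cipher,
    # not something to actually use.
    out = bytearray()
    for offset in range(0, len(data), 32):
        pad = hashlib.sha256(key + offset.to_bytes(8, "big")).digest()
        out.extend(b ^ p for b, p in zip(data[offset:offset + 32], pad))
    return bytes(out)

message = b"design notes"
session_key = os.urandom(16)
recipient_key = os.urandom(16)        # stands in for Bob's public key

# On the wire: ciphertext plus a key packet for Bob only -- no second
# "corporate message recovery" recipient for a government to repoint.
wire_ciphertext = keystream_xor(session_key, message)
wire_key_packet = keystream_xor(recipient_key, session_key)

# On disk: the recovery info sits locally, alongside the stored mail
# folder, under a key the company holds.
company_storage_key = os.urandom(16)
local_recovery_blob = keystream_xor(company_storage_key, session_key)

# Company recovery works without anything extra ever crossing the wire.
assert keystream_xor(company_storage_key, local_recovery_blob) == session_key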
> Aren't we really talking about third-party access to communications,
> and second-party access to stored data ... with the "brother"
> distinction being one made at a social/political level, as a
> judgement about the legitimacy of the access or the size of the
> actor, rather than the character of the access?
It is more than that, I think, because many people acknowledge the
legitimacy of companies (a) encrypting data, and (b) recovering data in
case an employee forgets keys or dies unexpectedly.
Second-party access to stored data is much less scary. Little brother
can ultimately read _everything_ you do at work. If he gets suspicious
he can install a keyboard logger, a keyboard password sniffer, or a
concealed videocam whilst you are out of the office. The best we can do
is discourage little brother from abusing systems designed for data
recovery as mass communications snooping. The best suggestion I have
seen for this so far was Bill Stewart's: only store recovery info for
some of the bits, making the recovery process artificially slow by
leaving, say, 40 bits to be brute forced. That is worth it for
recovering the main developer's design notes made in email when he dies
unexpectedly, but is some hindrance to little brother unless he is
determined. As long as this hindrance is on a similar scale to other
things little brother could do to check up on a suspicious user, you
have achieved your goal of hindering little brother.
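A rough sketch, in Python, of how I read Bill Stewart's idea: escrow
all but N bits of the session key, so a legitimate one-off recovery
costs a deliberate brute-force search. The parameter names and the hash
check are my own illustration (16 missing bits here so the demo
finishes quickly; 40 as suggested above would be proportionately
slower):

import os, hashlib, itertools

MISSING_BITS = 16             # 40 in the text above; 16 so the demo runs in a moment
MISSING_BYTES = MISSING_BITS // 8

session_key = os.urandom(16)
check = hashlib.sha256(session_key).digest()[:8]   # lets recovery know when it has the key

# Escrow everything except the last MISSING_BYTES bytes of the key.
escrowed_part = session_key[:-MISSING_BYTES]

def recover(escrowed_part, check):
    # Brute force the withheld bits: cheap enough for a one-off
    # "developer died" recovery, a hindrance for routine snooping.
    for candidate in itertools.product(range(256), repeat=MISSING_BYTES):
        guess = escrowed_part + bytes(candidate)
        if hashlib.sha256(guess).digest()[:8] == check:
            return guess
    return None

assert recover(escrowed_part, check) == session_key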
Big brother is hindered very significantly if you do recovery locally
rather than on the communications link, as PGP Inc's CMR does. This is
because big brother does not have access to the ciphertext on disks; he
must come and take them. For communications, by contrast, he can
practically achieve access to your email ciphertext at push-button
convenience.
Your email communications are already secret-split with the NSA: the
data is split into two unequally sized halves, the _key_ and the
_ciphertext_. For this reason you really don't want the NSA to get that
key.
For data storage recovery, your data is again in two halves: you have
one, the _key_; you or your employer have the other, the _ciphertext_
on disk. Your employer can recover that info anyway. The NSA can't
easily: it is much more logistically expensive to collect or randomly
sample disk contents.
Adam
--
Now officially an EAR violation...
Have *you* exported RSA today? --> http://www.dcs.ex.ac.uk/~aba/rsa/
print pack"C*",split/\D+/,`echo "16iII*o\U@{$/=$z;[(pop,pop,unpack"H*",<>
)]}\EsMsKsN0[lN*1lK[d2%Sa2/d0<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<J]dsJxp"|dc`