1997-10-13 - Re: IETF policy on refusing to allow politics to weaken protocols (Re: Why Adam Back keeps politicizing technical issues)

Header Data

From: "Tony Mione" <mione@boeing.rutgers.edu>
To: Adam Back <frantz@netcom.com>
Message Hash: 39540e451004a5268bdd73de4c24bc9e124cd2c84934b4bc06906e7698f957cc
Message ID: <971013100300.ZM23083@boeing.rutgers.edu>
Reply To: <199710131140.MAA01044@server.test.net>
UTC Datetime: 1997-10-13 15:02:33 UTC
Raw Date: Mon, 13 Oct 1997 23:02:33 +0800

Raw message

From: "Tony Mione" <mione@boeing.rutgers.edu>
Date: Mon, 13 Oct 1997 23:02:33 +0800
To: Adam Back <frantz@netcom.com>
Subject: Re: IETF policy on refusing to allow politics to weaken protocols (Re: Why Adam Back keeps politicizing technical issues)
In-Reply-To: <199710131140.MAA01044@server.test.net>
Message-ID: <971013100300.ZM23083@boeing.rutgers.edu>
MIME-Version: 1.0
Content-Type: text/plain



On Oct 13, 12:40pm, Adam Back wrote:
> Subject: Re: IETF policy on refusing to allow politics to weaken protocols
>
> Bill Frantz <frantz@netcom.com> writes:
> > At 2:41 AM -0700 10/11/97, Adam Back wrote:
> > >
> > >5. The IETF process should be accepting proposed designs and deciding
> > >on the best ones, which PGP Inc, and the other suppliers would then go
> > >and implement.  As it is now, as William Allen Simpson just pointed
> > >out, PGP Inc is cruising ahead implementing, and deploying things
> > >without bothering with the OpenPGP process.
> >
> > Just a quick reality check here.  Frequently implementations have proceeded
> > IETF standards.  That is one of the strengths of the IETF process (as
> > compared with e.g the CCITT.)
>

	The reality is that standards generally lag well behind the need for
new technology. In any standards org worth its salt, the group will look at
existing implementations (not trial implementations, but ones that have been
in use for some time). This is called 'prior art'. Without prior art, it is
not easy to determine which features or schemes work and which do not.

	The problem we are running into here is that PGP Inc. is marketing
about the only prior art available for this technology. In addition, business
needs will not stand still waiting for a standard to come out in 1-2 years.
This creates a somewhat dangerous situation, as Adam has pointed out. (Note: I
am NOT PGP-Inc.-bashing; this is merely what happens when a vendor tries to
take its technology and 'open' it up.)

	I would be concerned if this relatively new addition were being added
to the Internet-Drafts currently being produced by the group. I know they have
tried to wrestle with all of the issues; I just think we need some more
'experiential' evidence that it is the correct solution. In general, a standard
only describes the bare minimum an implementation must do to be 'compliant'.
There is nothing wrong with leaving certain bells and whistles out of the first
pass at a standard. Vendors are always free to add proprietary features to
their products; that is where market competition comes in. PGP Inc. implements
its CMR feature on top of the OpenPGP standard. Another company attacks the
problem a different way and adds a different 'extension'. In about 5 years, a
new standards group updates the 'OpenPGP' standard, argues over the different
schemes, and adopts the one with the best technical merit.
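	As a rough sketch of that layering (hypothetical Python, invented
field names; this is not the actual OpenPGP packet format), the idea is that a
baseline reader uses only the fields the standard requires, while a vendor
build acts on its own optional extension without breaking anyone else:

# Illustrative sketch only: a base message carries the required fields,
# and vendor extensions ride along as optional entries that a minimally
# compliant implementation simply ignores.

from dataclasses import dataclass, field
from typing import Dict


@dataclass
class Message:
    recipient_key_id: str                      # required by the base standard
    body: bytes                                # required by the base standard
    extensions: Dict[str, bytes] = field(default_factory=dict)  # vendor-defined, optional


def baseline_read(msg: Message) -> bytes:
    """A minimally 'compliant' reader: uses only the required fields and
    silently ignores any extension it does not understand."""
    return msg.body


def vendor_read(msg: Message) -> bytes:
    """A vendor build layers its own behaviour on top (e.g. a CMR-style
    extra-recipient hint) while still interoperating on the base fields."""
    hint = msg.extensions.get("x-extra-recipient")
    if hint is not None:
        print(f"vendor extension present: extra recipient {hint!r}")
    return msg.body


msg = Message("0xDEADBEEF", b"ciphertext...", {"x-extra-recipient": b"0xC0FFEE"})
assert baseline_read(msg) == vendor_read(msg)   # both agree on the base content

	Either way, the base standard stays small, and the extensions compete
in the market rather than in the working group.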

> I must admit some ignorance of the IETF processes due to my only
> understandings being gained from intermittent following of the IPSEC
> standardisation process, and in that respect I thank you for the data
> point...
>
> However, I would defend that my understanding is, and my point was
> more that proposals with trial implementations should be used as
> discussion points, and as competing draft proposals rather than being
> used as arguments for it being necessary to build compatibility with
> just because someone has rushed off and sold copies of them, prior to
> ratification of those proposals by the standardisation process.  And
> that independently of the technical merits of those proposals, or
> arguably even in the face of evidence that there are more secure
> alternatives to these proposals.
>
> The IETF standardisation process is supposed to make decisions based
> on technical merit.  Not based on marketing clout.  Nor even based on
> reputation capital, something which PGP surely has in abundance, as
> transferred to it by Phil Zimmermann with his virtual net.god status.
>

	To Adam and others, I would say: work like mad to develop PGP-like
implementations (based on the available documents for the 2.6 or 5.0 formats)
and add in alternative mechanisms to do the same function (as per Adam's
discussion of the 3-key system: signature, transit-encryption, and storage
keys). To others in this group, I would say keep it simple... especially on a
first pass at a standard.
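	For concreteness, here is a tiny sketch of that 3-key split in Python
(hypothetical names, no real cryptography); the point is only that each role
gets its own key, so lifetimes and recovery policy can differ per role instead
of being forced onto a single key:

# Minimal sketch of a three-key identity: signature, transit encryption,
# and storage encryption are handled by separate keys.

from dataclasses import dataclass
import os


@dataclass
class KeyPair:
    public: bytes
    private: bytes


def new_keypair() -> KeyPair:
    # Stand-in for real key generation.
    secret = os.urandom(16)
    return KeyPair(public=secret[:8], private=secret)


@dataclass
class Identity:
    signature_key: KeyPair    # long-lived: proves who wrote a message
    transit_key: KeyPair      # short-lived: protects mail in flight, rotates often
    storage_key: KeyPair      # recoverable: protects data at rest only


def make_identity() -> Identity:
    return Identity(new_keypair(), new_keypair(), new_keypair())


alice, bob = make_identity(), make_identity()
# A message would be signed with alice.signature_key and encrypted to
# bob.transit_key; bob.storage_key never needs to touch traffic in transit.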

	I have noticed an interesting phenomenon in a number of IETF groups.
One group develops a standard that is huge and nearly unimplementable. Another
group comes along and says, "We can do it simpler than that." They start
designing and writing documents, run into the same issues the first group did,
and eventually start solving them the same way, leading to another large,
unimplementable standard. Witness: X.509 -> PKIX -> SPKI -> OpenPGP. OpenPGP
was supposed to be different because it had a large established user base.
Then, all of a sudden, everyone is arguing over the same issues as the first
three groups... global naming problems, certificate flexibility, key recovery,
policy enforcement...


>...
>
> Adam
>-- End of excerpt from Adam Back

--
Tony Mione, RUCS/NS, Rutgers University, Hill 055, Piscataway,NJ - 732-445-0650
mione@nbcs-ns.rutgers.edu                 W3: http://www-ns.rutgers.edu/~mione/
PGP Fingerprint : E2 25 2C CD 28 73 3C 5B  0B 91 8A 4E 22 BA FA 9F
Editorial Advisor for Digital Systems Report   ***** Important: John 17:3 *****

Thread