1997-11-03 - Larry Lessig: a new CDA better than censorware?

Header Data

From: Declan McCullagh <declan@well.com>
To: cypherpunks@toad.com
Message Hash: 4967921f9f2fca0b2170f7afa8db4cf02be0bd903a4a74a7989da39a6974ab14
Message ID: <v03007805b083bdb0607e@[168.161.105.141]>
Reply To: N/A
UTC Datetime: 1997-11-03 18:24:45 UTC
Raw Date: Tue, 4 Nov 1997 02:24:45 +0800

Raw message

From: Declan McCullagh <declan@well.com>
Date: Tue, 4 Nov 1997 02:24:45 +0800
To: cypherpunks@toad.com
Subject: Larry Lessig: a new CDA better than censorware?
Message-ID: <v03007805b083bdb0607e@[168.161.105.141]>
MIME-Version: 1.0
Content-Type: text/plain



***********

Date:         Mon, 3 Nov 1997 11:11:56 -0500
From: Mike Godwin <mnemonic@WELL.COM>
Subject:      Law Professor calls for new CDA
To: NETLY-L@pathfinder.com

(I have been saying for some time that Professor Lessig, despite how he
characterizes himself, is no friend of freedom of speech. But here's where
the other shoe has dropped.)

 From the online edition of the New York Times:

 October 30, 1997

 By CARL S. KAPLAN

 Is a Better CDA Preferable To Opaque Censorship?

 The Communications Decency Act is dead, and most free speech advocates say,
 "good riddance." If there must be a solution to the problem of kids and
 cyber-pornography, let a thousand software-blocking packages bloom in
 homes, libraries and schools.

 Professor Lawrence Lessig of Harvard Law School is having none of this,
 however. In a recent controversial draft essay on the regulation of
 cyberspace, Lessig, a respected cyberlaw scholar, argues that if government
 must embrace a solution to indecent speech, a revamped CDA-like plan would
 be far more protective of traditional free speech values than the dangerous
 filtering products that many civil libertarians seem to love, or at least
 to prefer.

 "My sense is that this first major victory [in Reno v. ACLU]  has set us
 in a direction that we will later regret," Lessig writes, referring to the
 Supreme Court opinion striking down the CDA on First Amendment grounds.
 "It has pushed the problem of kids and porn towards a solution that will
 (from the perspective of the interest in free speech) be much worse. The
 (filtering products) touted by free speech activists in Reno are, in my
 view, far more restrictive of free speech interests than a properly
 crafted CDA."

 Lessig is not the first free speech advocate to damn filtering software.
 But he goes further than most in his nostalgia for a revised CDA. He also
 knows that his conclusions may invite some fury.

 "Promoting a CDA-like solution to the problemb of indecency is very much
 to step out of line," he writes. "I am not advocating a CDA-like solution
 because I believe there is any real problem. In my view, it would be best
 just to let things alone. But if Congress is not likely to let things
 alone, or at least if the President is more likely to bully a private
 solution then we need to think through the consequences of these different
 solutions. . . . We may well prefer that nothing be done. But if something
 is to be done, then whether through public or private regulation, we need
 to think about its consequences for free speech."

 Lessig's article, titled "What Things Regulate Speech," is a trove of
 ideas and legal scholarship on the permissible scope of government
 regulation of indecency, the evils of filtering and the nature of law in
 cyberspace, where restrictions on speech, for example, are apt to be
 enacted not by federal or state statutes, but by minimally debated software
 code. Happily, the article is written in plain English, not law school
 professor-ese. Many of the author's ideas have been expressed in earlier
 articles, law review essays and speeches.

 Boiled down and simplified, the main points of Lessig's CDA argument run
 like this:

 First, he argues that government has the power to place or "zone" hard-core
 pornography out of the reach of kids, so long as the means chosen is the
 least restrictive form of discrimination that existing technology permits.

 For example, Lessig notes that a California law making it a crime to sell
 porn in unattended vending machines, unless the machines are equipped with
 an adult identification system, was upheld by a federal appeals court. The
 Supreme Court earlier this year declined to review the case and thereby
 left the California law standing. In a historical footnote, the denial was
 issued in the same week the Supreme Court heard oral arguments in the CDA
 case - another matter involving the distribution of porn to kids.

 Next, Lessig points out that the success in the CDA case came in persuading
 the Court that other, less restrictive means for protecting children from
 porn were still available. The evils associated with the less restrictive
 means - traditional blocking software - are legion, however.

 For one thing, blocking software is crude because it tends to filter out
 too much - sites opened to discuss AIDS or gay rights, for example -
 because of mistaken associations with indecency. Also, blocking software is
 opaque, because the lists of banned sites are not published. Finally, the
 filtering companies, prompted by the demands of the market, tend to offer
 generalized censorship - restrictions on access to a variety of potentially
 objectionable sites, from those dealing with violence to gambling - not
 just censorship of so-called indecent sites.

 The upshot is that to the extent that government embraces filtering
 software, or mandates its use in libraries or schools, for example, such
 state action may be unconstitutional, because the government is exceeding
 its narrow justification in separating kids from hard-core pornography.

 As bad as private blocking is, PICS is worse, Lessig argues. PICS, an
 acronym for "Platform for Internet Content Selection," is a proposed
 labeling standard that makes it possible to rate and block material on the
 Net.

 "It was motivated as an alternative to the CDA," Lessig, 36, said in a
 recent telephone interview. "The MIT geniuses who thought it up realized
 it had broader potential than just blocking indecent speech."
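
 Schematically, a PICS label was a small s-expression naming a rating
 service and scoring a page in that service's categories. As a rough
 sketch from memory of the PICS-1.1 spec and the RSACi vocabulary (the
 example URL and scores below are illustrative, not from the article), a
 site might have embedded a label like:

    <META http-equiv="PICS-Label" content='
      (PICS-1.1 "http://www.rsac.org/ratingsv01.html"
       l gen true for "http://www.example.com/"
       r (n 0 s 0 v 0 l 0))'>

 Here n, s, v and l are RSACi's nudity, sex, violence and language
 scales; a PICS-aware browser or proxy reads the label and decides
 whether to show the page.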

 Like blocking software, PICS will probably be used as a general filtering
 tool - far exceeding the narrow interests of government, Lessig says.
 Another problem is the invisible nature of PICS: "If I use PICS on a
 search engine, and PICS returns two hits and blocks 8, it doesn't report
 back to me that 8 sites have fallen off the Earth," Lessig says.
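
 The opacity point is easy to picture in code. A toy sketch (Python,
 purely illustrative; the category names echo RSACi, but nothing here
 comes from an actual PICS implementation): a filter that drops
 over-limit hits and says nothing about what it removed.

    # Hypothetical user limits for the four RSACi categories.
    LIMITS = {"n": 0, "s": 0, "v": 0, "l": 0}

    def allowed(rating):
        """True if every rated category is within the user's limits."""
        return all(rating.get(c, 0) <= lim for c, lim in LIMITS.items())

    def filter_hits(hits):
        """Return only permitted hits; blocked ones simply vanish."""
        return [h for h in hits if allowed(h["rating"])]

    hits = [
        {"url": "http://example.com/aids-info", "rating": {"l": 1}},
        {"url": "http://example.com/recipes", "rating": {}},
    ]
    print(filter_hits(hits))  # one hit back; no sign a second was dropped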

 Most ominously, he argues, PICS can be imposed by anybody in the
 distribution chain. Thus a filter can be placed on a person's computer, or
 at the level of a company, an ISP or even a nation without the end user
 ever knowing it, Lessig says, making it easier for centralized censors to
 place filters on the Net.

 Taken together, filtering software and PICS lead to a hard-wired
 architecture of blocking that is antagonistic to the original free-wheeling
 and speech-enhancing values of the Internet, Lessig argues.

 By contrast, the scheme proposed by the old CDA wasn't that bad, he
 suggests. Of course, the original CDA was flawed because it went after a
 category of speech that was too vague to pass constitutional muster, Lessig
 says - a problem that CDA II could fix by taking sharper aim at hard-core
 pornography.

 More important, the scheme envisioned by the old law was somewhat
 protective of free speech values. Under the CDA, the "means" offered to
 separate kids from pornography was to put porn behind a wall that screened
 out kids with reasonable effectiveness. The technique was not filtering.  It
 was to set up identity checks on the doors through which people wanted to
 pass.
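
 In code terms, the CDA's approach was a gate at the server rather than
 a filter at the reader's end. Another toy Python sketch (the statute
 contemplated defenses such as verified credit cards or adult access
 codes; the function below is purely illustrative, not any real scheme):

    def serve(page, visitor):
        """Pages flagged as indecent require proof of age; others pass."""
        if page["indecent"] and not visitor.get("verified_adult"):
            return "403: adult verification required"
        return page["body"]

    page = {"indecent": True, "body": "explicit material"}
    print(serve(page, {"verified_adult": False}))  # stopped at the door
    print(serve(page, {"verified_adult": True}))   # ID checked, admitted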

 This type of system has two things going for it, says Lessig. First, its
 restrictions extend only as far as the legitimate governmental interest -
 screening kids from porn. Second, it is unlikely to morph into a more
 comprehensive system for general censorship.

 Lessig adds that this type of identification system - contrary to the
 court's factual findings - is workable.

 Reaction to Lessig's ideas from the free-speech cohort is understandably
 mixed. James Boyle, a law professor at American University, for example,
 agrees with Lessig's point that people should be very suspicious of
 technological solutions to indecent speech on the Internet, like blocking
 software and PICS.

 "There's a kind of belief that technological solutions are pure and
 neutral. They have an allure - like Jetson's Jurisprudence," he says.
 "But I agree with Larry; people need to understand that technology isn't
 necessarily benign."

 Even so, Boyle is disinclined to reconsider the merits of the CDA adult
 identification scheme. "I do diverge there," he says, adding that it is
 impractical to be totally against filtering systems. "The question is how
 to design filtering systems so they have the maximum vibrancy."

 Jonathan Wallace, a New York lawyer and writer on cyberspace issues, also
 shares Lessig's skepticism on blocking software and PICS. But he thinks a
 dusting off of the CDA is "wrongheaded."

 Even assuming that an adult identification scheme were viable - which he
 doubts - Wallace asserts that any attempt to redefine indecent speech more
 narrowly would invite lawsuits from right-wing groups intent on proving
 that under their community standards, objectionable speech should be
 banned.


 Carl S. Kaplan at kaplanc@nytimes.com welcomes your comments and
 suggestions.

 Copyright 1997 The New York Times Company

----------------------------------------------------------------------------
We shot a law in _Reno_, just to watch it die.

Mike Godwin, EFF Staff Counsel, is currently on leave from EFF,
participating as a Research Fellow at the Freedom Forum Media Studies
Center in New York City. He can be contacted at 212-317-6552.
----------------------------------------------------------------------------

Thread