1996-09-09 - Re: Conservation Laws, Money, Engines, and Ontology

Header Data

From: "Greg Kucharo" <sophi@best.com>
To: <tcmay@got.net>
Message Hash: c3752fab6047111ecc24053954235603bc66896516d20cbbd679e464c80463da
Message ID: <199609090401.VAA26172@dns2.noc.best.net>
Reply To: N/A
UTC Datetime: 1996-09-09 07:49:12 UTC
Raw Date: Mon, 9 Sep 1996 15:49:12 +0800

Raw message

From: "Greg Kucharo" <sophi@best.com>
Date: Mon, 9 Sep 1996 15:49:12 +0800
To: <tcmay@got.net>
Subject: Re: Conservation Laws, Money, Engines, and Ontology
Message-ID: <199609090401.VAA26172@dns2.noc.best.net>
MIME-Version: 1.0
Content-Type: text/plain


  One thing that occurs here.  I imagine a scenario where you have a
"share" of resources on a system (an ISP, for example).  You're metered as
to how much you can post or store.  Actually, as it is now, posting is
regulated through extra payments per meg above the limit.  Spam is being
somewhat regulated by Terms of Service type things, but my point is: what is
to prevent pooling resources among several systems to achieve the same spam
pursuits some have?  Say, for example, that an individual gets several
accounts to balance the load at their point.  The Usenet, for example, has no
"choke point".  How could ISPs apply conservation here?  If you limit the
amount of traffic, you still aren't holding back the flow of "spam".
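The per-meg overage billing described above can be sketched as a simple meter. The class, quota, and rate below are illustrative assumptions, not any real ISP's scheme:

```python
# Hypothetical sketch of per-account posting quotas with overage
# charges: posting within the quota is free, and only megabytes
# above the quota are billed at a flat per-meg rate.

class PostingMeter:
    def __init__(self, quota_mb, overage_rate_per_mb):
        self.quota_mb = quota_mb
        self.rate = overage_rate_per_mb
        self.used_mb = 0.0

    def post(self, size_mb):
        """Record a post; return the overage charge it incurs."""
        before = self.used_mb
        self.used_mb += size_mb
        # Bill only the megabytes that fall above the quota line.
        over_now = max(0.0, self.used_mb - self.quota_mb)
        over_before = max(0.0, before - self.quota_mb)
        return (over_now - over_before) * self.rate

meter = PostingMeter(quota_mb=10, overage_rate_per_mb=0.50)
print(meter.post(8))  # within quota -> 0.0
print(meter.post(4))  # 2 MB over quota -> 1.0
```

Note that this conserves nothing across accounts: a poster who opens several accounts gets several fresh quotas, which is exactly the pooling problem raised above.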
  Here's where reputations could come in.  You couldn't open a new account
anywhere without a good "reputation".  This could aid in balancing the load
of certain people.
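The reputation gate suggested here could look something like the sketch below: account creation checks a score shared across providers, so opening several accounts doesn't reset the limit. The registry, names, and threshold are all illustrative assumptions:

```python
# Hypothetical sketch of reputation-gated account creation.  Every
# provider consults the same reputation registry, so a poster with a
# bad record can't evade limits by signing up at a new ISP.

MIN_REPUTATION = 50  # assumed admission threshold

reputation_registry = {"alice": 80, "spammer": 10}

def can_open_account(identity):
    """Admit only identities whose shared score clears the bar."""
    return reputation_registry.get(identity, 0) >= MIN_REPUTATION

print(can_open_account("alice"))    # True
print(can_open_account("spammer"))  # False, at every provider
```

The design choice doing the work is that the score is attached to the identity, not the account, which is what makes account-pooling unprofitable.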

---------------------------------------
Greg Kucharo
sophi@best.com
"Eppur si muove" -Galileo
---------------------------------------
