From: Hal <hfinney@shell.portal.com>
To: sjb@universe.digex.net
Message Hash: 7c235845881d8d7884874ccced6c3ffe2171153bc938b7c966e2965d4c009378
Message ID: <199512052029.MAA08717@jobe.shell.portal.com>
Reply To: N/A
UTC Datetime: 1995-12-05 20:29:11 UTC
Raw Date: Tue, 5 Dec 95 12:29:11 PST
From: Hal <hfinney@shell.portal.com>
Date: Tue, 5 Dec 95 12:29:11 PST
To: sjb@universe.digex.net
Subject: Re: towards a theory of reputation
Message-ID: <199512052029.MAA08717@jobe.shell.portal.com>
MIME-Version: 1.0
Content-Type: text/plain
From: Scott Brickner <sjb@universe.digex.net>
>
> Hal writes:
> >Changing the market conventions (say, by introducing escrow agencies)
> >will change the weightings of the various factors that make up
> >utility. If I no longer have to trust the honesty of the person I am
> >trading with (because we have an escrow agency to help us make the
> >exchange) then the importance of his reputation for honesty goes down.
> >The result is that the "reputation" curves will change rather
> >dynamically and unpredictably as we consider different possible
> >structures in the market. This will make the analysis of them
> >intractable, I would think.
>
> Analytically, using an escrow agent doesn't change the utility
> function. It replaces the trading partner's honesty reputation
> estimate with the escrow agent's (which is presumably higher, or why
> use them?). This is just a parameter substitution.
>
> Whence comes the intractability?
By the "utility function" I was referring to Wei's model in which each
person has an idea of how much "utility" (a general summation of
personal value and usefulness) they would get from another person, as a
function of cost. The utility function takes cost as input and returns
"utiles" (or whatever) as output. So, with this model, using an escrow
agent would change the utility function; for a given cost, the utility
of a person to me would change (say, if the person involved were
thought to be dishonest, then the presence of escrow agents would make
him more useful to me). The utility function in Wei's model is a curve
where the Y axis is utility and the X axis is cost. Changing the
importance of honesty will change the position and shape of this
curve.
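
To make that concrete, here is a toy sketch in Python (the particular
numbers and curve shapes are just invented for illustration; they are
not part of Wei's model):

    # Toy utility-of-cost curve, and how an escrow agent might shift it.
    # All functional forms and constants here are made up.

    def utility(cost, honesty, escrow=False):
        # Base value I get from the deal at a given cost.
        base = 10.0 * cost ** 0.5 - cost
        # Without an escrow agent, a dishonest partner discounts the
        # whole deal; with one, his honesty matters much less to me.
        return base * (1.0 if escrow else honesty)

    # A partner thought to be only 40% honest:
    print(utility(4.0, 0.4))               # no escrow agent
    print(utility(4.0, 0.4, escrow=True))  # escrow agent available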
I think it would be more tractable to have a model in which honesty
played an explicit part. We might even make assumptions about the
mathematical relationship between honesty and overall utility - for
example, that utility to me would be monotonically increasing with
increased honesty of the other guy.
What I mean is something like this. Let t be the degree of trust
necessary for a business relationship to be consummated. For t=0, no
trust is needed, and the relationship is such that neither party takes
any significant risk - a cash sale, perhaps. For t=1, in some sense
total trust is needed, and a party can cheat the other with 100% safety.
Now let h(t) be the honesty reputation of a person, so that the utility
which people expect to receive from them gets multiplied by h(t). For a
person with a reputation for honesty, h(t) is close to 1 for all t. For a
person who seems dishonest, h(t) will go from 1 to 0 as t goes from 0 to
1.
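
As a toy example of what I have in mind (again, the linear form of
h(t) is just one I am making up for illustration):

    # Toy version of the h(t) idea: the utility I expect from someone
    # gets multiplied by an honesty factor that depends on how much
    # trust the deal requires.

    def h(t, dishonesty):
        # dishonesty = 0: h(t) stays at 1 for all t (solid reputation).
        # dishonesty = 1: h(t) falls from 1 to 0 as t goes from 0 to 1.
        return 1.0 - dishonesty * t

    def expected_utility(base_utility, t, dishonesty):
        return h(t, dishonesty) * base_utility

    # Cash sale (t=0) versus a deal requiring total trust (t=1),
    # with a partner who looks quite dishonest:
    print(expected_utility(100.0, 0.0, 0.75))   # prints 100.0
    print(expected_utility(100.0, 1.0, 0.75))   # prints 25.0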
This is all pretty hand-wavy, but the idea would be to come up with good
strategies to estimate h(t) from a person's behavior, and good ways to
choose what kind of behavior one should follow given the value(s) of t
which are prevalent in the market. This kind of analysis would lead you
to focus on how much trust is needed in a transaction.
The underlying utility function is based on such traditional factors as
productivity and reliability. It won't change as we consider the
variables of our analysis, because we have factored out the honesty and
trust issues so that they are more explicit. That's the kind of
direction I was suggesting.
Hal