Subject: Re: How accurate is Metcalfe's law? (Was: Ximian software)
Date: Fri, 4 Jan 2002 14:36:42 -0500

Seth Gordon writes:
>    Consider what happens when we can manufacture robots that are roughly
>    equivalent to a human being, for under 3 years salary.  Let us assume
>    that the knowledge of an existing one can be readily reproduced in new
>    copies, and that these machines learn new jobs about as fast as a
> For any sufficiently complex task that I want my computer to perform,
> there is a gap between What I Want Done and What I Told The Computer
> To Do.  A new programming language, language extension, or user
> interface can close the gap for a certain class of problems, but once
> previously-difficult programming problems become easy, it just gives
> people more time to work on the previously-unthinkable problems that
> have become merely difficult.

Employers have the IDENTICAL problem with employees right now.  Or have
you never seen someone try to follow instructions and wind up with a
result that the person giving them didn't want?

These hypothetical machines have, by assumption, the ability to be
interfaced with exactly as you would interface with a human employee:
talk with it, have it ask clarifying questions, and so on.  This is
fundamentally a user interface, yes, and it suffers from all of the
general catches and gotchas that any user interface to an external
resource has.

There is a difference of degree between this problem as it applies to
interacting with people and interacting with computers.  There is not a
difference in kind.

> So even if we have robots who *can do* everything that a human's
> unassisted brain can do, we will still have the problem of figuring
> out *what to tell them to do*.

It always amazes me that people refuse to accept that there is nothing
a human can do which is not, in principle, accomplishable by a
computer.  Certainly many things people do are not feasible with
current hardware.  But no one has come up with a fundamental law or
general principle demonstrating that a computer cannot simulate any
aspect of a human's mental processes.