14.0400 cybernetic totalism, creative error and Sirens' voices

From: by way of Willard McCarty (willard@lists.village.Virginia.EDU)
Date: 10/22/00


                   Humanist Discussion Group, Vol. 14, No. 400.
           Centre for Computing in the Humanities, King's College London
                   <http://www.princeton.edu/~mccarty/humanist/>
                  <http://www.kcl.ac.uk/humanities/cch/humanist/>
    
    
    
             Date: Sun, 22 Oct 2000 07:27:38 +0100
             From: Willard McCarty <willard.mccarty@kcl.ac.uk>
             Subject: cybernetic totalism, creative error and Sirens' voices
    
    Humanists may enjoy reading Jaron Lanier's online essay, One Half of a
    Manifesto, beginning at
    <http://www.edge.org/3rd_culture/lanier/lanier_index.html> and related
    material I'll cite in a moment. Lanier writes against what he calls
    "cybernetic totalism", for which he articulates the following "partial
    roster of component beliefs":
    
     >1) That cybernetic patterns of information provide the ultimate and best
     >way to understand reality.
     >2) That people are no more than cybernetic patterns.
     >3) That subjective experience either doesn't exist, or is unimportant
     >because it is some sort of ambient or peripheral effect.
     >4) That what Darwin described in biology, or something like it, is in fact
     >also the singular, superior description of all creativity and culture.
     >5) That qualitative as well as quantitative aspects of information
     >systems will be accelerated by Moore's Law.
     >And finally, the most dramatic:
     >6) That biology and physics will merge with computer science (becoming
     >biotechnology and nanotechnology), resulting in life and the physical
     >universe becoming mercurial; achieving the supposed nature of computer
     >software. Furthermore, all of this will happen very soon! Since computers
     >are improving so quickly, they will overwhelm all the other cybernetic
     >processes, like people, and will fundamentally change the nature of what's
     >going on in the familiar neighborhood of Earth at some moment when a new
     >"criticality" is achieved- maybe in about the year 2020. To be a human
     >after that moment will be either impossible or something very different
     >than we now can know.
    
    This essay is published by the Edge Foundation, <http://www.edge.org/>,
    whose mandate is "to promote inquiry into and discussion of intellectual,
    philosophical, artistic, and literary issues, as well as to work for the
    intellectual and social achievement of society". Edge originated in a group
    called The Reality Club (George Dyson, Freeman Dyson, Cliff Barney, Bruce
    Sterling, Rod Brooks, Henry Warwick, Kevin Kelly, Margaret Wertheim, John
    Baez, Lee Smolin, Stewart Brand, Daniel C. Dennett). Various members have
    commented on Lanier's piece. Allow me to quote the latter two-thirds of
    George Dyson's response (which Freeman Dyson compares to what he wrote at
    the end of Origins of Life), then a bit from Daniel Dennett's. First Dyson:
    
     >Back in the days when programs could be debugged but processing could not
     >be counted on from one kilocycle to the next, John von Neumann wrote his
     >final paper in computer theory: "Probabilistic Logics and the Synthesis of
     >Reliable Organisms from Unreliable Components" [in Claude Shannon and John
     >McCarthy, eds., Automata Studies (1956), pp. 43-99]. It makes no
     >difference whether you have reliable code running on lousy hardware, or
     >lousy code running on reliable hardware. Same results.
     >
     >What should reassure the technophiles, and unsettle the technophobes, is
     >our world of lousy code. Because it is lousy code that is bringing the
     >digital universe to life, rather than leaving us stuck in some programmed,
     >deterministic universe devoid of life. It is that primordial soup of
     >archaic subroutines, ambiguous DLL's, crashing Windows, and living
     >fossil operating systems that is driving the push towards the sort of
     >fault embracing template-based addressing that proved so successful in
     >molecular biology, with us -- and our computers -- as one of its strangest
     >results.
     >
     >Let us praise sloppy instructions, as we also praise the Lord.
    
    Dennett finds in Lanier's piece a tension quite similar to that in Joseph
    Weizenbaum's 1976 book, Computer Power and Human Reason:
    
     >are the Cybernetic Totalists just hopelessly wrong -- their dream is, for
     >deep reasons, impossible -- or are they cheerleaders we must not
     >follow because we/they might succeed? There is an interesting middle
     >course, combining both options in a coherent possibility, and I take it
     >that this is the best reading of Lanier's manifesto: the Cybernetic
     >Totalists are wrong and if we take them seriously we will end up creating
     >something -- not what they dream of, but something else that is evil....
     >Joseph Weizenbaum soon found himself drowning under a wave of fans, the
     >darling of a sloppy-thinking gaggle of Euro-intellectuals who struck
     >fashionable Luddite poses while comprehending almost nothing about the
     >technology engulfing them. Weizenbaum had important, reasoned criticisms
     >to offer, but all they heard was a Voice on Our Side against the Godless
     >Machines. Jaron, these folks will love your message, but they are not your
     >friends. Aren't your criticisms worthy of the attention of people who
     >actually will try to understand them?
    
    
    Yours,
    WM
    


