17.768 a brief history of humanities computing

From: Humanist Discussion Group (by way of Willard McCarty willard.mccarty@kcl.ac.uk)
Date: Fri May 07 2004 - 16:58:43 EDT

                   Humanist Discussion Group, Vol. 17, No. 768.
           Centre for Computing in the Humanities, King's College London
                         Submit to: humanist@princeton.edu

       [1] From: Alexandre Enkerli <aenkerli@indiana.edu> (22)
             Subject: Broad History of HC?

       [2] From: Willard McCarty <willard.mccarty@kcl.ac.uk> (92)
             Subject: anecdotal contributions to a history

       [3] From: Michael Hart <hart@beryl.ils.unc.edu> (9)
             Subject: Re: 17.764 a brief history of humanities computing

             Date: Mon, 05 Apr 2004 07:06:47 +0100
             From: Alexandre Enkerli <aenkerli@indiana.edu>
             Subject: Broad History of HC?

    Joe Raben's fascinating first-person account of the development of
    Humanities Computing reveals interesting trends and provides some of us
    (especially junior scholars like myself) with much-needed insight into the
    field's historiography. To oversimplify, the pattern seems to have gone
    from biblical studies to machine-readable texts, made in order to produce
    concordances, which in turn were incorporated into broader database designs.

    But for those of us who weren't in touch with these developments, it's
    rather difficult to put these things in context. Several questions come to
    mind. Wasn't there anything done with computers in the Humanities without
    textual content? Was literature the only field benefitting from computer
    use? What were the deeper roots of HC? What happened between 1970 and 1993,
    when the Internet really exploded through the Web? How did computer
    scientists in general become interested in the Humanities (if they ever did)?
    What were the first humanistic projects using computing itself as their
    main research topic?

    These may all sound exceedingly naive, but a thorough understanding of the
    field's development may not be as common as one would wish.

    Thank you in advance for your help.

    Alexandre Enkerli
    Ph.D. Candidate
    Department of Folklore and Ethnomusicology
    Indiana University

             Date: Mon, 05 Apr 2004 08:41:13 +0100
             From: Willard McCarty <willard.mccarty@kcl.ac.uk>
             Subject: anecdotal contributions to a history

    The computer scientist R. W. Hamming, writing about what mattered to him in
    the history of computing, entitled an article, "We Would Know What They
    Thought When They Did It", in A History of Computing in the Twentieth
    Century: A Collection of Essays, ed. N. Metropolis and J. Howlett (New
    York: Academic Press, 1980), pp. 3-18. Looking at what passes for a history
    of computing he notes the radical inadequacy of what we still tend to get,
    namely a history of firsts, then asks about its next stage, an intellectual
    history: what sort of history of ideas do we want? "We wish to learn how to
    do great things ourselves rather than merely to recall what others have
    done.... We want to grasp the idea of creation itself, so that we can learn
    to create for ourselves." Hence his title, and the question that motivates
    the sort of history we also need: what were people thinking they were doing
    back then? What were they after -- not in our terms but in theirs?

    Much of what I noticed, especially from the late 1970s on, was a mixture of
    pragmatics, late-blooming positivism and something else, which I'll call

    First the pragmatics. Humanists, as Northrop Frye remarked somewhere, have
    as a social group always been rather good at mastering whatever helpful
    technology offered itself; they could see the advantages. So could the
    people I worked with as a graduate student at Toronto. Production of
    editorial objects, such as editions, was to many at that end of things
    obviously to be helped. Partly this was driven by financial necessity --
    achieving a goal that would not be funded or would cost too much -- partly
    as I recall by the desire to control a complex, error-prone process, to get
    this process into one's own hands, do things folks at the press might not
    understand or get right -- and be able to feel one's way along. Computing
    obviously benefitted the craft of editing and encouraged its inventive
    side. It brought to the surface those who were inclined to get their ideas
    from masses of data and those who were committed to providing reference
    works, such as concordances, dictionaries and prosopographies, and so had
    to manage masses. My own first efforts (apart from wordprocessing) were to
    manage bibliographic resources with a database program simply so that I
    could find what I had laboriously recorded. My numerous dusty boxes of 3x5
    cards had turned into a graveyard for notes. But almost the first thing I
    then undertook was to do research that involved a massive amount of
    correlation -- putting together a synoptic account of the story of Theseus.
    Exactly when controlling the situation I was in became enlarging what I
    could do I cannot say, but I suspect there was a large overlap.

    So, I conclude, computing clearly had immediate appeal to those for whom
    the humanities have always meant direct, immediate encounter with cultural
    artifacts, for whom the motto could be (with apologies to William Carlos
    Williams), "no ideas but from data".

    Now for the positivism. Coming to the humanities from the sciences, as I
    did, and having spent years as a programmer in a physics research lab
    (various assembler languages, Fortran), I was rather sensitive to the false
    hopes of charmed humanists and to how the projected image of computing drew
    forth all sorts of rather strange attitudes toward the cultural artifacts
    of study. Crudely, I summarize this as, "At last we'll be able to prove X"
    (where X is a product of the imagination, such as an historical conjecture
    or a literary-critical theory; I defer to David Lodge's Morris Zapp for
    further expression of this attitude). In addition, those of us born
    gadgeteers were prone to be dazzled by the machinery, which as Peter
    Galison says, is very romantic. I recall, for example, encountering the
    Xerox Star for the first time, and it was love at first sight, from which
    fortunately I awakened. (The design was impressive, to be sure; what dealing
    with the company entailed was unmitigated disaster, unless you were a
    computer science department with a great deal of money, programming talent
    and pull.)

    Related to the positivism and supporting it was as usual the promotional
    rhetoric of computing, marketed chiefly as "productivity" tools. But on the
    ground the discrepancy between the promise and the reality was great. The
    journalist Jerry Pournelle called it the "Real Soon Now" syndrome. (To see
    what's happened to Jerry, go to http://www.jerrypournelle.com/ -- where he
    blogs!) The only sane way to cope, it seemed to me, was to blow the whistle
    on all that and ask about consequences: supposing the continual failure of
    computing to deliver what we wanted were due to something intrinsic, what
    would that be? The essence of an answer had been around for a long time,
    on the logico-mathematical side of computing, but perhaps not in a
    recognizable form. In any case, my point is that the frustrations were
    worth paying attention to, not just the gains in data-management.

    All this became real and immediate for me at the point at which, in a
    research project I was directing, I realized that the questions being
    raised of my source material (Ovid's Metamorphoses) by the attempt to
    computerize it for conventional purposes were far more interesting than the
    conventional questions I began by asking. These new questions came
    precisely from the cases that wouldn't fit whatever scheme I imposed -- and
    Ovid being Ovid, the elusiveness was clearly the whole point. So I
    abandoned the original purpose of the project (to support a conventional
    literary study) and joined the person I had employed to help me in asking
    the questions he was having to face. (I should note that he was a
    classicist but had more programming and editorial talent than I did.) That
    is when humanities computing as a research subject fully emerged for me.

    Allow me to put it to you that examining such moments in detail would turn
    out to be a profoundly interesting and important exercise in writing history.



    Dr Willard McCarty | Senior Lecturer | Centre for Computing in the
    Humanities | King's College London | Strand | London WC2R 2LS || +44 (0)20
    7848-2784 fax: -2980 || willard.mccarty@kcl.ac.uk

             Date: Mon, 05 Apr 2004 08:41:47 +0100
             From: Michael Hart <hart@beryl.ils.unc.edu>
             Subject: Re: 17.764 a brief history of humanities computing

    My apologies; had I downloaded the latest version of the
    Project Gutenberg graph this morning, I would have included:

    We should also add that the quality of the Project Gutenberg eBooks has been
    increasing over the same period, in terms of initial accuracy, format options
    and continuous error correction. In addition, Project Gutenberg now provides
    eBooks in 30 languages, with Project Gutenberg of Europe targeting over 50,
    and Project Gutenberg II offering 104 languages.

    And it would have reflected a correction of the typo on the last line.


    This archive was generated by hypermail 2b30 : Fri May 07 2004 - 16:58:47 EDT