10.0681 withering away & the overload

WILLARD MCCARTY (willard.mccarty@kcl.ac.uk)
Sun, 9 Feb 1997 09:10:56 +0000 (GMT)

Humanist Discussion Group, Vol. 10, No. 681.
Center for Electronic Texts in the Humanities (Princeton/Rutgers)
Centre for Computing in the Humanities, King's College London
Information at http://www.princeton.edu/~mccarty/humanist/

[1] From: Don Wilkins <dwilkins@ucr.campus.mci.net> (58)
Subject: Re: 10.0675 withering away & the heart

[Warning: this post is overlong, and I offer my apology!]

While we are still on the subject of Willard's provocative comments
about humanities computing, I would like to second Robert Tannenbaum's
remarks and respond to Todd Blayone's.

As a classicist using a Macintosh, I was stuck with Pandora (a program
to process the TLG CD) and quickly developed a love/hate relationship
with it. Eventually I came to the conclusion that I wanted and needed a
better program, and that I also needed to learn a serious programming
language after years of using HyperCard to write CAI programs for Greek.
This has proved a demanding, ongoing commitment: after several years of
work I continue to spend most of my spare time programming, or learning
more about programming with the advent of HTML and Java.

Among the many lessons I have learned (including "Everything takes
longer than you think" and "There is always one more bug") is the fact
that computing is so huge and diverse a discipline that to be good at
working in any one area you probably have to become a specialist at it.
I now know a good deal about the programming languages and the
idiosyncrasies of Macs, and I realize that relatively few professional
programmers have a lot of interest in or knowledge of the kind of text
processing with which I deal; conversely, it would take me a lot more
work to become proficient in any of the countless other areas of
academics or business that rely heavily on computers. So I think it is
unrealistic for classicists, and perhaps other academicians as well, to
think that they can get the program they really want by simply calling
in a support-staff programmer (at those universities which can afford
such staff) and telling him/her that they would like a program that will
do thus and so. Even when you have a good program, there is the constant
need to upgrade it; and in my case I often encounter research problems
that I need the computer to solve, but the code required is too esoteric
and elaborate to become a permanent part of the program.

As to Todd's remarks, I don't think I understand what he means by "IMHO,
however, 'humanities computing' will become less and less
relevant as a younger generation of thinkers -- products of
cyberculture -- begin to reinvent the humanistic disciplines." It seems
to me that younger generations will most likely use computing routinely
in the humanities, so if Todd's meaning is that computing will cease to
be an issue of discussion and debate and simply become an integral part
of the humanities, then I would agree with him. I think I do understand his
comment, "One might be foolish enough to suggest, however, that the
multitude of amateurs outside the centre are, at least on rare occasions,
less likely to miss the forest for the trees." However, I would suggest
that computing, properly used, will aid in providing a more accurate
view of both trees and forest. Too much of what we teach or assert seems
to be based on conjecture or general impressions. Obviously the computer
can serve to track down the evidence that we need to confirm or refute
hypotheses, but I also see it as a device to help us think through the
quantifiable aspects of our hypotheses, and in fact to expose various
aspects as quantifiable when we might mistakenly assume otherwise. Without
getting into more detail on this last point, let me just say that it is a
good exercise to ask oneself, given a little knowledge of programming, if
and how the computer can be programmed to solve a particular problem. Very
often I discover that the problem is at least theoretically computable,
whether I would actually want to write the program or not, and this
realization itself proves to be valuable.
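To make the exercise concrete, here is a hypothetical sketch (in modern Python, not anything from my own programs; the sample "text" and the particles counted are invented for illustration): suppose one suspects that an author favors certain Greek particles. A few lines suffice to turn that impression into numbers one can compare across texts.

```python
from collections import Counter

def particle_frequencies(text, particles):
    """Return occurrences per 1000 words for each particle in the text."""
    words = text.lower().split()
    counts = Counter(w for w in words if w in particles)
    total = len(words)
    return {p: 1000 * counts[p] / total for p in particles}

# Toy transliterated sample standing in for a real Greek text.
sample = "men de gar men oun de men kai de te"
print(particle_frequencies(sample, ["men", "de"]))
# → {'men': 300.0, 'de': 300.0}
```

Even a throwaway sketch like this forces one to state the hypothesis precisely -- what counts as a word, which forms count as the particle -- and that precision is often where the real insight lies.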

Don Wilkins
UC Riverside