corrected CATH88 report (308)

Willard McCarty (MCCARTY@VM.EPAS.UTORONTO.CA)
Sun, 15 Jan 89 19:34:41 EST


Humanist Mailing List, Vol. 2, No. 504. Sunday, 15 Jan 1989.

Date: Sat, 14 JAN 89 12:16:19 GMT
From: LOU@VAX.OXFORD.AC.UK
Subject: CATH88 report

The CATH88 report which I somewhat precipitately sent to Humanist last
week contained a number of minor errors. The more significant ones were:-
Peter Denley is not the Chairman of the AHC but its Secretary general;
IBM was not sponsoring the event for the first time; the HIDES project is
not primarily CTISS-funded. Two paragraphs contained wordprocessed
gobbledegook, in one case entirely distorting the sense of what was being
described (the paragraph describing Roger Martlew's paper). My humblest
apologies to all; by way of partial recompense a corrected version of the
report follows:-

Computers and Teaching in the Humanities, 1988
Conference Report


This was the second conference on the theme of Computers and Teaching
in the Humanities to be organised by the Office for Humanities
Communication and the University of Southampton, with the support of
the ALLC (Association for Literary and Linguistic Computing), CTISS
(Computers in Teaching Initiative Support Service) and the AHC
(Association for History and Computing), and some sponsorship from IBM.
It attracted a similar mixture of attendees to that of the preceding
conference, more or less evenly divided between academic staff from both
universities and polytechnics, with a third estate drawn from the growing
body of arts computing support staff. It differed from the previous conference
(also held at Southampton a year ago), however, in two respects. The first
conference had resembled a bazaar, with numerous parallel sessions
organised as workshops introducing specific applications areas with the aid
of a volume of essays (since published as IT in the Humanities, ed Rahtz,
Ellis Horwood, 1988). This conference built on the evident interest
generated by the first, laying a greater stress on the practical problems of
introducing computing tools to the undergraduate curriculum. It also had
a more unified programme, exemplified by the conference subtitle
`Redefining the humanities'. To many delegates, it seemed, the chief effect
of the introduction of computing had been to provoke a re-evaluation of the
methods and priorities of teaching in the humanities, quite
independent of any technological considerations.

The conference was opened by Peter Denley (Westfield College) who, as
secretary general of the Association for History and Computing, is well
placed to deliver a 'sociology of computing in the humanities'. He began by
describing the rise of humanistic scholarship itself in renaissance Italy,
stressing its emphasis on rhetoric and purity of language, together with its
importance as a way of both defining and sustaining the growth of a
secular educated elite. Movements as successful as renaissance
scholarship inevitably distort their successors' perceptions of them;
nevertheless, Denley argued, some of the problems currently faced by the
Humanities can be related to the change of direction implicit in the
reasoning of the first Humanists. It could be argued that emphasizing the
purity of classical Latin above the Latin vernaculars of the middle ages had
replaced practical linguistics by arid philology, while the emphasis on
classical education as vocational - on rhetoric as a necessary political
accomplishment - was clearly a two-edged sword. If the humanities were
purely vocational, what was the function of humanistic research? A new
agenda was needed, Denley argued, which recognised and reaffirmed the
fundamental importance of the humanities, rather than regarding them as
a useful collection of skills. As to computing in the humanities itself, he
suggested, it was time to take stock: the role of IT in the arts course
should be more than just to impart necessary skills, like word-processing.
Discipline-specific training was important, whether or not it used the
computer. Yet the structure of arts computing as it currently existed did
not always encourage new ideas: there was no career structure for the
strange hybrid characters who currently become arts computing advisors.
For Denley, history and computing go hand in hand: history, as a way of
handling information, demonstrates the complexity of knowledge, while a
database system enforces rigour of analysis, by requiring that events fit
into a structure. The Humanities, he concluded, needed urgently to
reassert their importance and their relevance in the evolution of
information processing.

A rather different perspective was offered by the next speaker, Tom
Stonier, Professor of Science and Society at Bradford. His message was an
evangelical one of unbridled prosperity for all, just around the corner. In
the nineties, he said, education will absorb more of the GNP than
anything else. Pursuing this rather odd economic metaphor, he pointed out
that human resources were the only sort of capital which could be made to
appreciate, by means of education. Like Denley, he felt that training for
skills alone was short sighted; unlike him, he was confident that education
and material progress would go hand in hand. Today's pupils would have
life expectancies of a century and enjoy enormous material wealth,
apparently caused by extensive use of robots and improved factory farming
techniques: there would be guaranteed income for all. Doubters in his
audience were exhorted to learn from the past: the purpose of history was
(of course) to enable us to forecast the future. We should abandon the
Protestant work ethic, stop making a living and learn how to live.
Computers are the greatest pedagogic devices since grandmothers. When
the information operatives take over, their level of education will
necessarily preclude totalitarianism. A materials-based society evolves
through competition, but an information-based one develops only by means
of cooperation. And so forth. Such millenarianism seemed to a number of
the audience not only foolish, but also dangerously foolish, given the
increasing marginalisation of the humanities to which Denley had already
drawn our attention; however, as a morale booster, this was a most
amusing and effective speech.

David Bantz (Dartmouth College) was considerably less charismatic but
perhaps more reasonable. His presentation promised to address the extent
to which educational problems were solved by computing methods and
whether computing methods might not undermine traditional humanistic
values, by being inimical to the `great conversation of ideas' which Wayne
Booth sees as characterising the humanities, by over-valuing reductionism
and calculation at the expense of reasoning. He made several sound
criticisms of the current state of the art in computer aided criticism and
computer aided learning systems (for example, the way that most historical
simulation systems permit of only one right answer, the `drill and kill' style
of CALL etc.), but had little positive or practical to propose other than to
point out that computing hardware should be regarded as an expense item
rather than a capital investment, since a four-year-old machine is useless.
We learned that at Dartmouth, as at Bradford, all students are required to
buy their own machines (Macintosh) and that 85% do, at a special price of
$1500-2400, not so expensive when set against tuition fees of $20,000 p.a.

David Miall (College of St Paul & St Mary, Cheltenham), in one of the
more thought provoking papers of the conference, talked of a crisis of belief
in the humanities. Like Marlowe's Faustus, the modern-day humanist feels
that all the learning at his disposal has failed to give him power. The
Humanities are not about the acquisition of knowledge but about transferable
skills, the purpose of which is to change people's feelings and raise their
consciousness. Miall then considered a variety of ways in which the
introduction of the computer in the classroom helps this by defamiliarising
a text, by interacting with and challenging affective models, and by
changing the nature of student/teacher relationships, of which he gave a
detailed and impressive discussion. The teaching of literature in particular,
he concluded, is concerned in the nineties with re-reading, rather than
reading, for which tools such as hypertext are eminently well suited.

This was followed by a rather weak paper on the use of a standard text
retrieval package called Personal Librarian, used by every student at the
Stevens Institute of Technology to access the 7 Mb of set texts used in a
course on the History of Science (Ed Friedman), and a rather stronger one
on the implications of hypertext for poetry teaching by John Slatin, from
the University of Texas at Austin. This was of interest more as a
demonstration of what tools such as Hypercard look like from the
perspective of an English literature specialist than for any concrete results
presented. Alan Dyer (Lanchester Poly) picked up the same theme in his
presentation, which concerned the way in which computing skills
necessarily spanned the division between the traditionally linear `readerly'
skills and visual or spatial skills. He described, and later presented, an
interactive hyperfiction produced by one of his students as an instance of
what could be achieved when creative people were offered suitably powerful
and easy to use tools.

Sebastian Rahtz (Southampton) gave a rapid but detailed description of
the Southampton/York Archaeological Simulation System, in which a
database of archaeological information is front-ended by something that
looks suspiciously like a computer game, but which reportedly enables
students to learn resource management. He also described the `arch_help'
system developed at York, in which a tailored form of the DEC mainframe
Help system is used to provide students with organised information about
courses, lectures, booklists and even accommodation details. Both systems
represent a shift in stress away from `teaching about the computer' to
`teaching with the computer'. Charles Henry (Columbia) initially treated us
to a brief survey of the pedagogic importance of visualisation in cognition
and memorising, from Pestalozzi to neural nets. His subsequent attempt to
use the insights gained in analysing the structure of the Old English epic
Beowulf was fascinating but too short to be convincing.

Arthur Stutt (OU) began by quoting Umberto Eco's definition of a novel as
a machine for generating interpretations. The artist, he argued, has always
been ready to apply technology: the special contribution of the computer
should be to facilitate processes otherwise impossible. Pointing to the
importance of argument in the humanities, he made a good case for extending
the traditional single explanation school of expert systems to cope with the
traditional formal stages of argument. He did not draw a parallel between
the renaissance view of rhetoric as an essential component of the
humanities on the one hand and, on the other, the need to teach
techniques of argument as, in Gardin's phrase, `propositions which
mutually support each other'.

R.A. Young (Dundee) also dealt with ways of formalising knowledge, but
from the point of view of the professional philosopher. He identified a
tension between the different attitudes to conceptual processes implicit in
the construction of formal logics by philosophers in the Russell tradition,
on the one hand, and the need to make expert systems that behave `as if'
intelligent, which characterises knowledge engineering, on the other. There
was a need for synthesis, not least because of traditional philosophy's
abilities to deal with inconsistencies and ethical issues generally
mishandled or ignored by the knowledge engineering paradigm.

Paul Davis (Ealing College) described a hybrid music system, and indeed
performed on one, after dinner. He gave a brief survey of various
approaches to the synthesis of music, stressing the importance of the
performer in designing appropriate interfaces for digital music systems and
asserted that music science was an area rather than a discipline. Coming
at the end of a long and intellectually demanding day, his presentation
seemed a little under-powered.

Lynette Hunter (Leeds) began by attempting a structural analysis of
contemporary computing mythology, in terms of the dominant myths of the
Western post-renaissance man-made world. The machine offered an
illusory promise of freedom from drudgery by its power over semiotics,
mediated by the magical powers of the shaman (or computing advisor) and
vicarious participation in the club culture of the technocracy. But (as
David Miall had already remarked) it conferred only the appearance of
power. She then described recent changes of emphasis in the computing
component of the Leeds arts courses. Reductionism and the myth of
exactitude were inimical to humanistic skills of analogy and metaphor. The
place of the computer was to help in marshalling facts and memory, and
so it fitted better into courses dealing with textual editing or bibliography,
where classification skills and principles of selection needed to be taught.

Alison Black (Reading) gave an interesting and well-presented paper on the
differing reactions and achievements of students introduced to designing
documents on paper and on screen. Her talk was effectively illustrated
with examples of projects undertaken by the students and by statistics
drawn from questionnaires aimed at assessing student reactions at
different stages of their exposure to the different methods of document
design. Her analysis of the way new technology affects working practices
was clear and convincing, as was her warning that whilst WYSIWYG
desktop publishing has a lot to offer the design student, we should not be
so dazzled by its superficial merits as to forget its limitations and to
abandon more traditional methods of document design.

Cell biology was the somewhat surprising subject of the demonstration
provided by Wendy Hall, the object of which was to present a hypertext
system developed at Southampton with Hypercard. This linked images
held on videodisc (some 54,000 images per side) with extracts from
relevant textbooks, adding sound and animation where appropriate.
Although this particular project was not humanities based, Wendy Hall
was quick to point out the general applicability of the technology and the
pedagogical methods behind it. In his paper `Videodiscs and the Politics of
Knowledge', Roger Martlew (Southampton) returned to a key theme of this
conference: the relative roles in the classroom of the teacher, student and
computer. Like David Miall and the DISH duo, he argued that traditional
styles of Humanities teaching impose specific roles on both lecturer and
student which computers had the potential either to fossilise or to
radicalise. He clearly felt that recent pedagogical developments in
secondary education were equally applicable at the tertiary level, and that
the lecturer `must cease to be a controller of knowledge, and must become
a manager of learning'. The link between Martlew's archaeological
videodisc and the politics of knowledge became slightly blurred in the talk,
but recourse to the abstract of his paper set us back on the right track
with the reminder that `the control of access to visual information in
archaeology confirms the lecturer's power over the educational process'.
The videodisc controlled by a lecturer could be used in the same didactic
way as the traditional `chalk and talk' methods of teaching; the videodisc
controlled by the student was equally possible, if the lecturer had the
courage to renounce power for the sake of pedagogy.

The last full session of the conference was concerned with three major
teaching packages. In the first, Frank Colson described and demonstrated
the HIDES CAL software package, used at Southampton as an important
part of the special subject component of the history degree. The software
runs on a network of PS/2s located in the University library, and presents
students with a structured walk through documentary sources, supported
by impressive graphics. It was claimed that students enjoyed using the
system, and that it also led to their making greater use of original (non-
computerised) sources. In the second, Susan Hockey and John Cooper
described, and Jo Freedman demonstrated, the `Oxford Text Searching
System', developed at Oxford with CTISS funding to encourage arts
undergraduates to use concordance and free text searching software in
their study of set texts. Finally, Nicholas Morgan and Richard Trainor
described (but did not demonstrate) some of the principles underlying the
development of the highly successful DISH project for teaching history at
Glasgow. They reiterated the changes in the teacher-student relationship
made possible by the use of computers: the transformation of the
instructor from teacher into guide and the resulting emphasis on
exploration, and on the diversity of insights resulting from a variety of
routes through the material offered.

The conference was closed by Nigel Gardner (now with ESRC) whose
valedictory address as head of CTISS indicated that the Initiative had been
less successful as an exercise in institutional change than in causing re-
assessment of the requirements of specific disciplines. The next round of
CTISS funding (announced at this conference by Gardner's successor,
Jonathan Darby) was thus aimed specifically at setting up topic-oriented
`centres of excellence', which would need to address more precisely such
matters as project management, staff training, resource control and
evaluative procedures. Gardner also suggested that there was a shift in the
role of computing centres which, if they were to survive at all in the world
of the individual work station, needed to re-emphasize their role in
providing administrative computing facilities, and support for
telecommunications and `learning resource centres' (what we used to call
libraries).

A somewhat sporadic general discussion followed this closing address, but
did not really bring together the two major themes that had run through
an unusually well-balanced and unified programme. The first is that, with
or without the presence of a computer terminal in the classroom, teaching
methods in tertiary education must move away from the traditional
master/disciple roles which the availability of hypertext systems and
videodiscs is beginning to expose and challenge. The second is that whilst
new technology has a lot to offer the humanities, particularly teaching in
the humanities, a great deal of caution and selection should be exercised
in the manner and degree to which it is applied. In retrospect, though little
was actually said about redefining the humanities as such (except by those
who wanted to annex computer science), quite a lot of thought had
evidently gone into redefining the teaching of the humanities.

Several speakers referred in passing to the copyright problems implicit in
using electronic materials for teaching purposes: this has been a recurrent
area of concern, and it is to be hoped that a special session at some future
conference will address it explicitly.

Lou Burnard and Judith Proud
(Oxford Text Archive)
----end