17.622 linguistic and cultural provincialism

From: Humanist Discussion Group (by way of Willard McCarty willard.mccarty@kcl.ac.uk)
Date: Mon Feb 09 2004 - 17:33:26 EST


               Humanist Discussion Group, Vol. 17, No. 622.
       Centre for Computing in the Humanities, King's College London
                   www.kcl.ac.uk/humanities/cch/humanist/
                        www.princeton.edu/humanist/
                     Submit to: humanist@princeton.edu

         Date: Mon, 09 Feb 2004 10:02:15 +0000
         From: "Liz Walter" <eawalter@arizona.edu>
         Subject: RE: 17.618 linguistic and cultural provincialism

<<how we cure the plague of monolingualism, the idea that knowledge is some
sort of permanent stuff that one can accumulate, and that therefore topics
can be "done" once and for all, needs a sharp look.>>

I believe that the "permanent stuff" should be seen not as knowledge but as
memory. Experience leading to (creating) memory is the only way to
understand knowledge. Language is a way of communicating those
memories. Memories are never done.

To eliminate the "plague" we might use a common language. In science it is
mathematics; physics and engineering, widely used, are dialects of that
scientific language. These provide a known, worldwide commonality for us
to work with. The culture of science may still not be widely adopted in
many places, but it is certainly known just about everywhere.

Those of us working with computing in the humanities/liberal arts
understand that there are many concepts (memories) which are difficult to
transmit digitally. Parallel processing may help with some of this, but
the work of mapping meaning into a mathematical model is ongoing. At a
time when the first Hopi dictionary is less than five years old, it will be
some time before even science can help us understand how to describe
our memories. Thank goodness it can never be "done".

-----Original Message-----
From: Humanist Discussion Group [mailto:humanist@Princeton.EDU] On Behalf
Of Humanist Discussion Group (by way of Willard McCarty
<willard.mccarty@kcl.ac.uk>)
Sent: Saturday, February 07, 2004 1:52 AM
To: humanist@Princeton.EDU

           Date: Sat, 07 Feb 2004 08:36:47 +0000
           From: Willard McCarty <willard.mccarty@kcl.ac.uk>
           Subject: our provincialism

Tito Orlandi, in Humanist 17.611, has rightly complained about a very old
problem in our new field: that we whose native tongue is the current lingua
franca (note that expression, please) remain largely trapped within the
bounds defined by language. Referring to the ongoing debate about
humanities computing science vs humanities computing, he asks, does "it
sound reasonable that an HC(S) scholar should know the humanistic culture
'at large', and not only one branch of it"? This is, of course, a
rhetorical question, but one that needs asking, and asking again and again.
But I wonder, in practical terms, what can be done about it, given the
academic resources we have? We can all imagine the alternatives and sort
through them. What is utterly unacceptable, I would suppose, is a dismissal
of the problem.

One of the problems, that is. A more vexing consequence of Babel is the
untranslatability of linguistic cultural idioms, including the academic. It
is a possibility, is it not, that work done in one academic culture may
simply not be relevant to work on the same topic in another because the
assumptions, means and terminology are too different -- i.e. that there are
really two topics, not one? In philosophy, for example, we know the
difficulties of bridging Anglo-American and Continental European traditions
-- take the case of Heidegger. Yes, this is a special case,
given Heidegger's intimate play with untranslatable aspects of the German
language, but they are only the beginning of the problem, for which see
George Steiner's masterful struggle to come to terms with Heidegger in his
book of that name. (Note that Steiner is completely fluent in English,
German and French at minimum.) More controversial, I suppose, are the
difficulties posed by the many attempts to bridge Anglo-American and French
literary critical traditions. The French physicist and philosopher of
science Pierre Duhem infamously distinguished between French and English
ways of thought in La Théorie physique (1914) when he proposed two
corresponding kinds of scientific mind and so two kinds of theory: abstract
and systematic (French, clearly) vs. the sort that relies on mechanical
models. Even if he was only pointing to the way people think they think,
their persistence in thinking that way is strong.

Furthermore, quite apart from the worthy question (and constant source of
guilt for a great many of us) of how we cure the plague of monolingualism,
the idea that knowledge is some sort of permanent stuff that one can
accumulate, and that therefore topics can be "done" once and for all, needs
a sharp look. If they cannot be, then what are we doing?

Yours,
WM

Dr Willard McCarty | Senior Lecturer | Centre for Computing in the
Humanities | King's College London | Strand | London WC2R 2LS || +44 (0)20
7848-2784 fax: -2980 || willard.mccarty@kcl.ac.uk
www.kcl.ac.uk/humanities/cch/wlm/


