[1] From: Jim Marchand <marchand@ux1.cso.uiuc.edu> (37)
Subject: Programming in Prolog
[2] From: Jim Marchand <marchand@ux1.cso.uiuc.edu> (43)
Subject: Vagueness / Programming
--[1]------------------------------------------------------------------
Date: Mon, 19 Dec 95 09:50:42 CST
From: Jim Marchand <marchand@ux1.cso.uiuc.edu>
Subject: Programming in Prolog
Professor **'s question on Prolog allows me to tout my forthcoming book, _The
Use of the Computer in the Humanities_ (Ballantine/Random House), where I
have the following to say in the Appendix on Programming Languages:
PROLOG. Another AI language is Prolog. If you remember our
discussion early in the book on symbolic logic, you can classify
Prolog's logic for the most part as class logic or the logic of
relations, a fairly radical departure from other programming
languages. That is, in the statement of facts, a relation is
displayed over a set of atomic units, e.g. `Joe likes fish'
becomes <likes(joe, fish)>, with the relation `likes' displayed over
`joe' and `fish'. It is especially good in handling relations such
as <father(terach, abraham)>; if you remember your relational logic,
all of this is ridiculously easy to handle, but it can also be very
powerful, though somewhat unwieldy at first. It reveals its class
logic origins when you have to write <planet(earth)> to say `earth
is a planet'. The bible is: W. F. Clocksin and C. S. Mellish,
Programming in Prolog, 4th ed. (Berlin: Springer, 1994), but you
will want to learn it from: Leon Sterling and Ehud Shapiro, The Art
of Prolog, 2nd ed. (Cambridge, MA: MIT Press, 1994). A useful
implementation of Prolog is Turbo Prolog; if you want to read a
ringing encomium of Prolog, get Herbert Schildt, Advanced Turbo
Prolog (Berkeley: Osborne McGraw-Hill, 1987). Prolog can be quite
useful in linguistics and artificial intelligence; it was adopted
by the Japanese for their Fifth Generation effort. Remember to
look for the FAQ in comp.lang.prolog; it lists a lot of "free"
Prolog implementations. There is also a www site:
URL=http://www.comlab.ox.ac.uk/archive/logic-prog.html.
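To make the notation concrete, here is a minimal sketch of the fact-rule-
query style (the father/2 facts follow Sterling and Shapiro's running
example; the sibling rule and the query are my own illustration):

    % Facts: a relation displayed over atomic terms.
    likes(joe, fish).
    father(terach, abraham).
    father(terach, nachor).
    planet(earth).

    % A rule: two persons are siblings if they share a father.
    sibling(X, Y) :- father(F, X), father(F, Y), X \== Y.

    % A query and its answer at the interpreter prompt:
    % ?- sibling(abraham, Who).
    % Who = nachor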
You will definitely want to look at the comp.lang.prolog newsgroup on Usenet,
plus the www site. I did a lot of programming in Prolog in the old days in AI.
It has poor string-manipulation capabilities, so it is not strong for ordinary
language (a small illustration follows below). In choosing a language, you
choose a metaphor (an extended metaphor). It is still good to start out with
flow charts (decision trees if you are a TG freak). BASIC is also good to
start with (I mean down-to-earth BASIC).
Speaking of BASIC, a good book to learn about logic (and logic programming)
from is Kemeny, Snell, Thompson, Introduction to Finite Mathematics; I prefer
the first edition.
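On the string point: in an Edinburgh-style Prolog a "string" is just a list
of character codes, so even trivial text operations become list processing.
A small sketch using the standard name/2 built-in (the upcase predicate is
my own illustration):

    % name/2 converts between an atom and its list of character codes:
    % ?- name(fish, L).
    % L = [102,105,115,104]

    % Even upper-casing one letter means arithmetic on its code:
    upcase(C, U) :- C >= 0'a, C =< 0'z, !, U is C - 32.
    upcase(C, C).

Everything beyond this (searching, splitting, collation) has to be written
as explicit recursion over such lists, which is why I say Prolog is weak for
ordinary language.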
!
Jim Marchand.
--[2]------------------------------------------------------------------
Date: Mon, 19 Dec 95 10:28:46 CST
From: Jim Marchand <marchand@ux1.cso.uiuc.edu>
Subject: Vagueness / Programming
The discussions of vagueness and programming seem to be going in the same
direction. We humanists are not used to well-defined concepts and are more
at home with vague, porous (IPA), ill-defined (anything), fuzzy (dialect,
language, genre), stippled-spectrum (Staffellandschaft, a staggered or
graded landscape), ideal-type
(anything) concepts. In fact, I find it unfortunate that we often treat our
concepts as if they were Aristotelian (well-defined, yes-no) rather than
ideal type. There are few entities in the humanities which can be defined
per genus proximum et differentiam (by nearest genus and specific
difference). This means that there is little which can be
dealt with algorithmically. Most programming languages require well-defined
entities and algorithms.
Whenever I have been called upon to help a colleague with a programming
problem (e.g. making a concordance, counting words, making a dialect map), I
find invariably that the colleague has only a vague notion of the task. "All
I want is for it to count the words." "What is a word?" "Whatever is
bounded by spaces (sounds like Fries)." "So <word?> is a word." "No, tell
it to ignore punctuation marks." Etc., etc. "Just make a list of the words
and their contexts." "What is the context?" "How much is the context?" (this
leads, horresco referens [I shudder to relate], to KWIC concordances, which
are usually quite
worthless for any careful work). I remember reading a nice article on "What
to tell the programmer" (in Wisbey's miscellany, I think).
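The negotiation is easy to make concrete. Under the "bounded by spaces"
definition, a word splitter over a list of character codes comes out as
below (a sketch; the predicate names are mine), and it happily returns
<word?> as a word, which is exactly where the haggling begins:

    % Definition 1: a word is whatever is bounded by spaces (code 32).
    space_words([], []) :- !.
    space_words([32|Cs], Ws) :- !, space_words(Cs, Ws).
    space_words(Cs, [W|Ws]) :-
        take_word(Cs, W, Rest),
        space_words(Rest, Ws).

    take_word([], [], []).
    take_word([32|Cs], [], [32|Cs]) :- !.
    take_word([C|Cs], [C|W], Rest) :- take_word(Cs, W, Rest).

    % ?- name('tell it, please', L), space_words(L, Ws).
    % Ws includes the word "it," comma and all.
    % "Ignore punctuation" is a different definition altogether,
    % needing a letter test such as:
    letter(C) :- C >= 0'a, C =< 0'z.
    letter(C) :- C >= 0'A, C =< 0'Z.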
Before getting into programming, the scholar should ask whether the problem
in hand is computable. One often hears: "The computer has shown that St.
Paul did not write 1 Corinthians," or the like. What happened was that
someone decided that certain criteria would/could identify an author
uniquely, e.g. verb/adjective ratio. The computer was then used to perform
the helot task of counting. There are several questions to be asked of such
a method. Is the verb/adjective ratio a good deciding characteristic? Can you
identify the verbs and adjectives of a text (it is certain that the computer
cannot do it unaided)? Attribution and athetization are not performable by a
computer, though it can be used as an aid.
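Once those decisions are made, the helot task itself is a few lines of any
language; in Prolog, for instance (a sketch: the tag/2 facts, which are the
whole scholarly problem, have to be supplied by a human reader):

    % tag(Word, Class): deciding these facts is the scholar's work.
    tag(wrote, verb).
    tag(sent, verb).
    tag(holy, adjective).

    count(Class, N) :- findall(W, tag(W, Class), Ws), length(Ws, N).

    ratio(R) :-
        count(verb, V),
        count(adjective, A),
        A > 0,
        R is V / A.   % note: integer division in some older systems

The counting is trivial; whether the resulting number decides authorship is
not a question the program can answer.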
Ever since I can remember in the computer field (starting in the early
50s), people have wanted to use the computer to draw stemmata, for example.
Since human beings cannot draw satisfactory stemmata, it is not likely that
computers will be able to, though if you believe in numerical taxonomy, you
are going
to need a computer. Most of Ross's "Philological Problems" are non-
computable.
It may be that we will someday have fuzzy computers; after all, we have
fuzzy set theory. They will not be digital; digital means "non-fuzzy".
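For what fuzzy set theory amounts to: membership becomes a grade between 0
and 1 rather than a yes-or-no matter. Even ordinary Prolog can state graded
membership as facts (the grades below are invented purely for illustration):

    % Crisp ("Aristotelian") membership is yes or no:
    planet(earth).

    % Fuzzy membership carries a grade in [0,1]:
    grade(alsatian, dialect_of_german, 0.7).
    grade(yiddish, dialect_of_german, 0.3).

    % The conventional fuzzy AND takes the minimum of two grades:
    fuzzy_and(G1, G2, G1) :- G1 =< G2, !.
    fuzzy_and(_, G2, G2).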
Having said all this, I still see the computer as a great handmaiden for
the humanities. We just need to understand its limitations and its
abilities; some of its limitations seem petty and ridiculous, but many of its
capabilities stagger the imagination.
!
Jim Marchand.