Humanist Discussion Group, Vol. 20, No. 35.
Centre for Computing in the Humanities, King's College London
www.kcl.ac.uk/schools/humanities/cch/research/publications/humanist.html
www.princeton.edu/humanist/
Submit to: humanist_at_princeton.edu
Date: Sat, 27 May 2006 09:46:05 +0100
From: lachance_at_chass.utoronto.ca
Subject: Re: 19.672 'information' in communications theory and in philosophy
Willard
You asked about works that broached the topic of the spread of the concept
and term "information". Rudolf Arnheim's small book _Entropy and Art: An
Essay on Disorder and Order_ (University of California Press, 1971) is
perhaps worth adding to the pile. Fair warning: it might hurt (but not
harm) the student who would want to zip through its 56 pages plus
illustrations and notes. This passage, though not characteristic of the
whole, will give you and other readers a taste of its very Socratic style:
<quote>
The absurd consequences of neglecting structure but using the concept of
order just the same are evident if one examines the present terminology of
information theory. Here order is described as the carrier of information,
because information is defined as the opposite of entropy, and entropy is
a measure of disorder. To transmit information means to induce order. This
sounds reasonable enough. Next, since entropy grows with the probability
of a state of affairs, information does the opposite: it increases with
its improbability. The less likely an event is to happen, the more
information does its occurrence represent. This again seems reasonable.
Now what sort of sequence of events will be least predictable and
therefore carry a maximum of information? Obviously a totally disordered
one, since when we are confronted with chaos we can never predict what
will happen next. The conclusion is that total disorder provides a maximum
of information; and since information is measured by order, a maximum of
order is conveyed by a maximum of disorder. Obviously, this is a
Babylonian muddle. Somebody or something has confounded our language.
</quote>
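For readers who like to see the arithmetic behind the passage, here is a
minimal sketch (my own illustration, in Python; the function names are
mine, not Arnheim's or Shannon's) of the two quantities the argument runs
together: the surprisal of a single improbable event, and the entropy,
that is the average surprisal, of a whole source.

    import math

    def self_information(p):
        """Surprisal of an outcome with probability p, in bits."""
        return -math.log2(p)

    def entropy(dist):
        """Shannon entropy of a distribution, in bits: average surprisal."""
        return sum(p * self_information(p) for p in dist if p > 0)

    # A fair coin, the most "disordered" two-outcome source: 1 bit per toss.
    print(entropy([0.5, 0.5]))     # 1.0

    # A heavily biased coin, a more "ordered" source: far less than 1 bit.
    print(entropy([0.99, 0.01]))   # ~0.08

    # The rare face of the biased coin is individually very informative...
    print(self_information(0.01))  # ~6.64 bits

    # ...but it almost never occurs, so the average information is small.

On one reading, the "Babylonian muddle" trades on sliding between the
information of a single improbable event and the average information of a
disordered source; the two come apart exactly where the paradox seems to
bite.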
It took me many a rereading to see just what pivots on the contention that
"with chaos we can never predict". We can predict. We can even predict
what will happen. We can predict what will happen next. What we cannot
predict is whether or not the prediction will be judged retrospectively as
being successful. Order has nothing to do with prediction. I can predict
heads or tails in a coin toss. I can only verify the match between
prediction and actuality after the coin toss. None of this challenges
Arnheim's description of the muddle surrounding the various uses of the
term "information". It does however open upon considerations of
temporality and computing.
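A toy simulation (again my own sketch, in Python, and only one way of
modelling the point) may make the distinction concrete: facing a chaotic
sequence we can issue a prediction before every single event; what the
chaos governs is only the retrospective hit rate.

    import random

    random.seed(0)  # reproducible run

    # A maximally "disordered" source: independent fair coin tosses.
    tosses = [random.choice(["heads", "tails"]) for _ in range(10_000)]

    # We can always predict; here the strategy is to call "heads" before
    # every toss. Chaos does not stop us from making the prediction.
    predictions = ["heads"] * len(tosses)

    # Whether any one prediction succeeded is knowable only after the
    # toss, by matching prediction against actuality.
    hits = sum(p == t for p, t in zip(predictions, tosses))
    print(hits / len(tosses))  # ~0.5: chaos limits verification, not prediction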
> Those here for whom the idea of information is of interest will
> likely want to read the following article:
>
> Fred Dretske, "The explanatory role of information", Philosophical
> Transactions: Physical Sciences and Engineering, vol 349, no 1689,
> special issue entitled "Artificial Intelligence and the Mind: New
> Breakthroughs or Dead Ends?" (15 October 1994): 59-70 [in JSTOR].
>
> Dretske's brilliant exposition is followed by snippets of discussion
> involving Andy Clark, Yorick Wilks, Daniel Dennett, R. Chrisley and
> L. J. Cohen.
>
> Of everything I have seen so far (an important qualification), this
> provides the clearest explanation of the concept of information in
> communications theory and the only philosophical bridge between the
> technical sense of this term and issues in the philosophy of mind
> directly relevant to humanities computing.