Humanist Discussion Group, Vol. 20, No. 137.
Centre for Computing in the Humanities, King's College London
www.kcl.ac.uk/schools/humanities/cch/research/publications/humanist.html
www.princeton.edu/humanist/
Submit to: humanist_at_princeton.edu
Date: Mon, 07 Aug 2006 08:41:55 +0100
From: lachance_at_origin.chass.utoronto.ca (Francois Lachance)
Subject: More on entropy, information and meanings
Willard,
More on Arnheim's confusing contrariness (follow-up to Humanist 20.035).
An off-list comment by one of Humanist's readers to the effect that
Arnheim got Shannon wrong led me to try to ascertain whether Arnheim
had indeed read Shannon. It looks like he is relying on Norbert Wiener
and characterizes Wiener as contributing to the confusion.
I believe what Arnheim missed in his reading of Wiener is the
relation between pattern and meaning. Reading Wiener from a less
contrarian position, one understands that messages are used to convey
information from one point to another; that is, they are a specialized
function of pattern. One could draw the following
schema:
pattern
information
message
meaning
The most likely reading of this arrangement is that pattern is to
meaning as information is to message. However, there may be some value
in considering a chiasmic relation: pattern is to message as
information is to meaning. At some point in certain situations, more
information doesn't increase meaning. I remain perplexed as to how to
relate pattern and message. This may also have been the rub for
Arnheim, though of course he doesn't say so explicitly.
In any event, here follows an excerpt from Arnheim, followed by fuller
versions of the Wiener passages he quotes.
Arnheim, Rudolf
Entropy and Art: An Essay on Order and Disorder
(University of California Press, 1971; rpt. 1974)
p. 20, note marked by (*)
<quote>
Is it sensible to call information and entropy inversely related
measures, as Norbert Wiener does when he
says that "the amount of information is a quantity which differs from
entropy merely by its algebraic sign
. . ." (p. 129)? The two measures could be reciprocal only if they
referred to the same property of sets of
items; but this they do not do, as I just pointed out. Entropy theory
never leaves the world of pure
chance, whereas information theory gets nowhere unless it does,
because only then can it arrive at
sequences varying in probability of occurrence. Its business is to
predict likelihood of occurrence in a
world in which sequences are not all equally likely to turn up.
Ignoring these differences leads to much
confusion. Wiener states, for example, that "a haphazard sequence of
symbols can convey no information" (p.
6). This is by no means true, as any victim of lotteries or games of
chance can testify. Information, as
defined by the theory, is not "the measure of the regularity of
pattern," but rather the contrary. Nore can
it be said that "regularity is to a certain extent an abnormal
thing." It can be normal or abnormal, that
is likely or unlikely to turn up, depnding on wheerh one is trying to
predict the next hundre objects
produced by an automobile factory or the next hundred items in a
white elephant auction. Helmar Frank ([Zur
Mathematisierbarkeit des Ordnungsbegriffs. Grundlagenstudien aus
Kybernetik und Geisteswissenschaft vol. 2,
1961] p. 40), as cited by Manfred Kiemle ([Ästhetische Probleme der
Architektur unter dem Aspekt der
Informationsästhetik, 1967] p. 30), has drawn attention to the
contradictions in Wiener's statements.
</quote>
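To see concretely what Arnheim is objecting to, here is a minimal sketch, assuming only Shannon's standard entropy formula H = -sum p*log2(p); the two sources and their probabilities are invented for the illustration and come from neither author. It compares a maximally haphazard source with a highly regular one:

    import math

    def entropy(probs):
        """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero terms."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A "haphazard" source: four symbols, all equally likely.
    uniform = [0.25, 0.25, 0.25, 0.25]

    # A highly "regular" source: one symbol overwhelmingly dominates.
    regular = [0.97, 0.01, 0.01, 0.01]

    print(entropy(uniform))   # 2.0 bits, the maximum for four symbols
    print(entropy(regular))   # about 0.24 bits; near-certainty yields little information

On Shannon's measure the unpredictable source scores highest, which is the sense in which Arnheim insists that the measure is the contrary of a measure of regularity.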
I have not been able to consult the Frank or Kiemle. However, I did
look up the relevant passage in Norbert
Wiener's The <i>Human</i> Use of Human Beings [Only upon this
re-reading did I notice the italicized term
in the title].
P. 129
The full sentence plus some from Wiener reads: "The amount of
information is a quantity which differs from
entropy merely by its algebraic sign and a possible numerical factor.
Just as entropy tends to increase
spontaneously in a closed system, so information tends to decrease;
just as entropy is a measure of
disorder, so information is a measure of order. Information and
entropy are not conserved, and are equally
unsuited to being commodities."
P. 6-7
The full sentence plus some from Wiener reads: "When this question
was asked, it became clear that the
problem of measuring the amount of information was of a piece with
the related problem of the measurement
of the regularity and irregularity of a pattern. It is quite clear
that a haphazard sequence of symbols or
a pattern which is purely haphazard can convey no information.
Information thus must be in some way the
measure of the regularity of a pattern, and in particular of the sort
of pattern known as <i>time
series</i>. By time series, I mean a pattern in which the parts are
spread in time. This regularity is to a
certain extent an abnormal thing. The irregular is always commoner
than the regular. Therefore, whatever
definition of information and its measure we shall introduce must be
something which grows when the <i>a
priori</i> probability of a pattern or a time series diminishes."
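Wiener's closing sentence can be checked against the standard definition of the information carried by a single outcome, -log2(p): as the a priori probability p diminishes, the measure grows. A small sketch, again my own illustration rather than anything in Wiener:

    import math

    def self_information(p):
        """Information, in bits, of an outcome whose prior probability is p."""
        return -math.log2(p)

    for p in (0.5, 0.1, 0.01, 0.001):
        print(p, round(self_information(p), 2))
    # 0.5   -> 1.0  bits
    # 0.1   -> 3.32 bits
    # 0.01  -> 6.64 bits
    # 0.001 -> 9.97 bits: the rarer the pattern, the more information its occurrence carries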
Wiener appears to be quite clear.
-- Francois Lachance, Scholar-at-large http://www.chass.utoronto.ca/~lachance ~~~ to be surprised by machines: wistly and sometimes wistfully