4.0400 Computers -- Names for and Nature of (4/109)
Elaine Brennan & Allen Renear (EDITORS@BROWNVM.BITNET)
Tue, 21 Aug 90 17:55:23 EDT
Humanist Discussion Group, Vol. 4, No. 0400. Tuesday, 21 Aug 1990.
(1) Date: Tue, 21 Aug 90 08:55:38 DNT (19 lines)
From: Hans Joergen Marker <DDAHM@vm.uni-c.dk>
Subject: Re: 4.0395 Why "computers"...
(2) Date: Tue, 21 Aug 90 12:59:11 -0400 (32 lines)
From: amsler@flash.bellcore.com (Robert A Amsler)
Subject: What call them `computers'
(3) Date: Tuesday, 21 Aug 1990 08:24:35 EDT (16 lines)
From: "Patrick W. Conner" <U47C2@WVNVM>
Subject: 4.0395 Why "computers"...
(4) Date: Tue, 21 Aug 90 15:08:11 EDT (42 lines)
From: Chris Gowlland <ST402868@BROWNVM>
Subject: Re: 4.0395 Why 'computers'
(1) --------------------------------------------------------------------
Date: Tue, 21 Aug 90 08:55:38 DNT
From: Hans Joergen Marker <DDAHM@vm.uni-c.dk>
Subject: Re: 4.0395 Why "computers"; Jim Sledd (2/74)
Jim O'Donnell asks for alternatives to the name "computer" in use in
languages other than English. In Danish an alternative name was
developed: datamat. The most common name for a computer in Danish is
the English word computer, but the English word is still not fully
assimilated into the language, i.e. it is still given an English rather
than a Danish pronunciation. The word datamat is widely used and
understood; I think it is the second most common name for a computer
in Danish, but personally I feel that using datamat for computer gives
a text an odour of intellectualism that may be unwanted in a particular
text. Further, I think the word datamat is not particularly Danish
anyhow; it plays on its similarity to Latin-inspired words such as
automat.
Hans Jørgen Marker
DDAHM at NEUVM1
(2) --------------------------------------------------------------44----
Date: Tue, 21 Aug 90 12:59:11 -0400
From: amsler@flash.bellcore.com (Robert A Amsler)
Subject: What call them `computers'
James O'Donnell raises a good point in that the choice of name reflects
the time period in which a machine is introduced and the expectations
of its developers. However, we didn't call them `computers' initially;
they were `calculators' and then `electronic calculators'. They also
were `electronic data processing machines', which actually was more
accurate for today's purposes--but somehow became associated with
doing payroll calculations and hence tainted as a name.
Today I do not know what we would call them if they JUST
appeared--probably something like `bi-stable logic-based symbol
manipulators' (BLSMs) and eventually some more pronounceable name
would appear (remember that ENIAC, UNIVAC, etc. were early computer
names reflecting the VACuum tubes of which they were made).
I would however take issue with the imposition of 0/1 on computers being
solely a mathematician's characterization. It is the essence of
digital computation and the basic common characteristic of all
modern computing technology. I.e., engineers basically look for
materials (e.g. silicon, gallium) or means of microscopically
storing two distinct states (e.g. bubble memory) to use as the basis
for computation. Two-state devices are essential since all our
hardware is based on that form of computation. (There is a fundamental
insight here: with two states one can represent everything from numbers
in any base to letters to grey-scale and color images, etc. That may
have been hard to accept, but it is now well accepted in some quarters.
This might be a significant idea: just as energy and matter have the
quantum, information now has the bit.)
(3) --------------------------------------------------------------24----
Date: Tuesday, 21 Aug 1990 08:24:35 EDT
From: "Patrick W. Conner" <U47C2@WVNVM>
Subject: 4.0395 Why "computers"; Jim Sledd (2/74)
Jim O'Donnell says <that everything in the machine is done with 0's
and 1's is not something intrinsic to the nature of silicon and the
movement of electricity, it is rather a mathematician's conceptualization
imposed on silicon and electrons>. Surely, he's wrong. Electricity
is either carried through a circuit, or it isn't. That is, the switch
is on or off (0 or 1). The way to have it partly on and partly off
is to have two circuits, one on and one off, combined, and there is born
the chip. At least, that's the way I learned it. I'm fairly certain
that two or three things are real in this imperfect world, and an
electrical switch is one of them. That's why God's first words were
Fiat lux, and then he flipped the switch.
--Pat Conner
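A small illustrative sketch, in Python rather than silicon, of the same
on/off picture: each signal is simply 0 or 1, and combining two such
signals through gates already yields richer behaviour.

    # Each wire or switch carries a two-state signal: off (0) or on (1).
    OFF, ON = 0, 1

    def AND(a, b):
        # On only when both inputs are on.
        return a & b

    def OR(a, b):
        # On when at least one input is on.
        return a | b

    # Two on/off circuits taken together already distinguish four cases,
    # and chains of such gates are what a chip is built from.
    for a in (OFF, ON):
        for b in (OFF, ON):
            print(a, b, "->", "AND", AND(a, b), " OR", OR(a, b))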
(4) --------------------------------------------------------------48----
Date: Tue, 21 Aug 90 15:08:11 EDT
From: Chris Gowlland <ST402868@BROWNVM>
Subject: Re: 4.0395 Why 'computers'
I'm not sure what the objection is to the use of 1's and 0's within
computing. My understanding is that using them does not in any way
restrict one to a certain number of characters, like those in the
standard ASCII set -- you can extend the number of characters more
or less indefinitely by simply lengthening the descriptor for the
characters.
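A toy sketch in Python of that point (a made-up illustration, not any
actual coding standard): each additional binary digit in the descriptor
doubles the number of characters that can be distinguished.

    # With n binary digits one can name 2**n distinct characters, so the
    # repertoire grows by lengthening the code, not by abandoning 0s and 1s.
    for bits in (7, 8, 16, 24):
        print("{:2d}-bit codes distinguish {:>10,} characters".format(bits, 2 ** bits))
    # 7 bits give the 128 characters of standard ASCII;
    # every extra bit doubles the number of characters available.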
Last year I was working at the National Central Library in Taipei
on Taiwan, and one of the jobs I had to do was to act as rapporteur
at a conference about the use of computers in Chinese language
processing (with a special focus on bibliographic work).
Researchers in Taiwan and Hong Kong claimed to be working on
a standard set of codings that would allow them to describe
precisely every character ever written in Chinese, Japanese, and
Korean, including all the variant simplified forms and those which
are found only once... probably something over 70,000 distinct
characters (I did have a more precise figure, but don't have it to
hand here).
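For rough scale (my own arithmetic, not a figure from the conference):
a 16-bit code distinguishes only 65,536 characters, so a repertoire of
roughly 70,000 already needs at least 17 bits per character.

    import math

    repertoire = 70000                       # the rough figure cited above
    print(2 ** 16)                           # 65536: a 16-bit code falls just short
    print(math.ceil(math.log2(repertoire)))  # 17: the minimum bits per character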
It would probably have been a relatively simple matter to have a
much larger set of defined characters (covering all the major
alphabetic scripts etc.) when they were first deciding how to set
up the ASCII format... but I wouldn't have thought that the
limitations of the medium as it currently exists are necessarily
the result of choosing to use binary coding. I too am not
happy about the fact that there are no consistent formats for
sending the Roman alphabet plus diacritics, and I appreciate
that it must be much more frustrating for those who would like
to communicate using Greek, Arabic, Hebrew, etc. But how much
is enough? Should a "wish list" for a really comprehensive
system for character interchange include the International Phonetic
Alphabet too? How about Gothic script? How about every one of
the different alphabetic scripts found in South Asia? It's
surely just a matter of what is initially decided.
Enough. Incidentally, the ordinary Chinese word for a computer
is "dian-nao", which in syllabic etymological translation means
"electronic brain".