Humanist Discussion Group, Vol. 17, No. 210.
Centre for Computing in the Humanities, King's College London
www.kcl.ac.uk/humanities/cch/humanist/
www.princeton.edu/humanist/
Submit to: humanist@princeton.edu
[1] From: Willard McCarty <willard.mccarty@kcl.ac.uk> (12)
Subject: 0 and 1
[2] From: Stewart Arneil <sarneil@uvic.ca> (36)
Subject: Re: 17.205 beginning with 0 or with 1?
[3] From: Norman Gray <norman@astro.gla.ac.uk> (42)
Subject: Re: 17.205 beginning with 0 or with 1?
--[1]------------------------------------------------------------------
Date: Fri, 29 Aug 2003 06:00:00 +0100
From: Willard McCarty <willard.mccarty@kcl.ac.uk>
Subject: 0 and 1
On the question of starting with 0 or 1, raised by Francois Lachance in
Humanist 17.205, allow me to recommend Karl Menninger, Number Words and
Number Symbols: A Cultural History of Numbers, trans. Paul Broneer (MIT
Press, 1969), which is still in print. As I recall, Menninger does a good job
explaining the great intellectual achievement in the invention of zero, for
example.
Yours,
WM
Dr Willard McCarty | Senior Lecturer | Centre for Computing in the
Humanities | King's College London | Strand | London WC2R 2LS || +44 (0)20
7848-2784 fax: -2980 || willard.mccarty@kcl.ac.uk
www.kcl.ac.uk/humanities/cch/wlm/
--[2]------------------------------------------------------------------
Date: Fri, 29 Aug 2003 06:00:25 +0100
From: Stewart Arneil <sarneil@uvic.ca>
Subject: Re: 17.205 beginning with 0 or with 1?
Hi Francois,
Although I'm sure much ink has been spilled on this, it's just the
difference between labelling something and counting it. All computer
systems address memory by a numeric label, which might as well start at 0
(and there are also technical reasons why it does). Some
programming environments present to the user a 1-based (as opposed to a
0-based) interface, but that's strictly to accommodate the (understandable)
preference of some people to start counting at 1 and then to label each
item being counted with the same number.
So, for example, if you have an array with no elements, the array's length
will be zero (that's the count of the number of elements), and of course as
there are no elements there is nothing to address. If you have an array
with one element, the array's length will be 1 (the count of the number of
elements) and the address of that one element is typically array[0] (the
label the computer uses to address that element) except in those systems
that present to the programmer a 1-based interface, in which case the
address of that one element is array[1]. Obviously, as a programmer you
need to know which system is used by the language you're programming in.
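To make that concrete, here is a small JavaScript sketch (JavaScript being
the 0-based example you mention); the variable names are just illustrative:

    var empty = [];         // an array with no elements
    empty.length;           // 0 -- the count of elements; nothing to address
    var one = ["first"];    // an array with one element
    one.length;             // 1 -- the count of elements
    one[0];                 // "first" -- the label of that one element is 0
    // In a 1-based environment the same element would be addressed as one[1].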
>Could some one explain why certain languages begin with zero and others
>with one?
>
>I have some vague impression that it is related to the treatment of
>arrays and Cartesian coordinates where the origin is represented by the
>pair (0,0). However my vague impression doesn't explain why certain other
>systems begin with 1. Would appreciate an explanation by anyone in the
>know.
>
>Thanks
>
>--
>Francois Lachance, Scholar-at-large
>http://www.chass.utoronto.ca/~lachance
--
Stewart Arneil
Head of Research and Development,
Humanities Computing and Media Centre, University of Victoria, Canada
--[3]------------------------------------------------------------------
Date: Fri, 29 Aug 2003 06:08:51 +0100
From: Norman Gray <norman@astro.gla.ac.uk>
Subject: Re: 17.205 beginning with 0 or with 1?
Greetings,
On Thu, 28 Aug 2003, Humanist Discussion Group (by way of Willard McCarty <willard.mccarty@kcl.ac.uk>) wrote:
> I found an echo of the whole numbers/natural numbers theme. In prepping
> for a project, I note that Javascript begins counting with zero.
>
> Could some one explain why certain languages begin with zero and others
> with one?
One answer is that Javascript does its arrays like that because it's supposed to look like Java; and Java does that because it looks like C. Others like Pascal and Delphi start numbering with 1 because Algol did; it did that because Fortran did; and Fortran started with 1 because maths does. Others such as Lisp do `the first one' and `the rest', and avoid the issue (sneaky!).
This is of course a monstrous distortion[1].
A complementary explanation is that, in languages which are `close to the machine' like C, it's natural and obvious to begin counting with the bit-pattern corresponding to the lowest number -- all bits zero, representing number zero. Fortran was intended to let people represent mathematical expressions, with arrays taking the place of indexed expressions, a_1, a_2, .... In general, humans tend to start that sort of count with 1. Mathematicians (I'm tempted to say `on the other hand') aren't so consistent, and Fortran can start numbering its arrays with any integer.
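To see why 0 is the natural starting point at that level, here is a sketch of
the offset arithmetic, with a made-up base address and element size:

    // Hypothetical numbers, just to show how an index becomes an address:
    var base = 1000;                 // address where the array's storage begins
    var size = 4;                    // bytes per element
    function addressOf(i) { return base + i * size; }
    addressOf(0);                    // 1000 -- the first element sits at the base itself
    addressOf(3);                    // 1012 -- the fourth element, three steps along
    // A 1-based scheme has to compute base + (i - 1) * size instead,
    // so starting the count at 0 is the convention closest to the machine.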
> I have some vague impression that it is related to the treatment of
> arrays and Cartesian coordinates where the origin is represented by the
> pair (0,0).
Arrays representing graphical images -- the two-dimensional arrays I believe you're thinking of here -- are commonly handled in languages such as C, so the cause you're adducing here is probably in fact the effect of the language choice.
So the real answer is that the way arrays are numbered (and thus where counting naturally starts in other counting contexts within a program) is indeed arbitrary, and the language designer will settle on the conventions which will most appeal to the community the language is aimed at.
Best wishes,
Norman
[1] http://www.levenez.com/lang/
--
---------------------------------------------------------------------------
Norman Gray                        http://www.astro.gla.ac.uk/users/norman/
Physics and Astronomy, University of Glasgow, UK    norman@astro.gla.ac.uk