Actually it is a convention, one that dates from the deep dark ages when programming was done on switchboards and computers seldom had more than 4 bits of data bus width. Desperate to make the 4-bit data element as efficient as possible, the scientists who designed the computers and the languages (it wasn't really engineers) decided to start at the base case and save a bit. Since they also developed the programming languages, the convention was passed on to those languages even after the data path width became significant.
While it may seem annoying today, it actually does save a bit and doubles the range of values that can be indexed, which can be significant when working at or near the data path boundary, especially since a bit is often lost to the sign; without the convention you would lose two bits, cutting the range of your index by a factor of four. Since data path extensions usually jump to the next power of 2, over-extending a byte wastes 8 bits, over-extending a 2-byte word wastes 16, over-extending the now popular 32-bit word wastes 32 bits, and so on. When memory was exceedingly expensive this was seen as far more important than it is today, when memory, while still not free, is much more readily available.
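As a rough sketch of that boundary argument (my own illustration, in C, not anything from the original discussion): with a fixed-width index, the 0 slot is the difference between covering the whole range and falling one element short of it.

    #include <stdint.h>
    #include <stdio.h>

    /* Illustrative sketch of the data path boundary argument.
     * An 8-bit unsigned index can hold the values 0..255, so
     * 0-based indexing addresses 256 elements; 1-based indexing
     * reaches only 255, and the 256th element forces a widening
     * to the next power of 2: a 16-bit index. */
    int main(void) {
        uint8_t max_index = UINT8_MAX;  /* 255 in both schemes */

        printf("0-based, 8-bit index: %u elements\n",
               (unsigned)max_index + 1);   /* indices 0..255 -> 256 */
        printf("1-based, 8-bit index: %u elements\n",
               (unsigned)max_index);       /* indices 1..255 -> 255 */
        return 0;
    }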
Ask not why we do it, ask why we are STILL doing it when the economics have shifted so radically from the reasons it was done in the first place.
This reminds me of a story. A lady who has a popular radio show, or was mentioned on one, I don't remember which, once told the story of the "roast". As she was growing up, her mother had always cut off a portion of the roast on holidays and put it into the bottom of the roasting pan, and she, being a consummate mimic, had taken this as an important part of the ritual of food preparation and had dutifully cut off the same portion and put it into the bottom of her own roaster without really thinking about it. When her daughter grew old enough to ask questions and asked why she did it, she had to admit that she didn't know, only that her mother had always done it. So she asked her mother, who replied, "Oh, that. Well, when you were growing up I had a roaster that was too small to fit the whole roast, so I used to cut off a part of it to make it fit. I don't do it anymore now that I have a larger roaster."
@Graeme: You are wrong! Stop spamming this network!
@Malhar: It's about the difference between "to count" and "to enumerate". If you count an array you have 4 objects. If you enumerate the same array you get the numbers, or indices, 0-3.
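A minimal sketch of that distinction (array and values invented for illustration):

    #include <stdio.h>

    /* Count vs. enumerate: the array holds 4 objects (the count),
     * but enumerating it yields the indices 0 through 3. */
    int main(void) {
        int items[] = {10, 20, 30, 40};
        size_t count = sizeof items / sizeof items[0];  /* count: 4 */

        for (size_t i = 0; i < count; i++)              /* indices: 0..3 */
            printf("index %zu -> %d\n", i, items[i]);

        printf("count = %zu\n", count);
        return 0;
    }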
Another example would be counting years. By convention, as 31.12.2010 ends, it becomes the year 2011. But isn't it more logical that throughout 31.12.2010 it is still the year 2010, and that a count of 2011 completed years is only reached at the end of 31.12.2011? Look at the transition from the year 1999 to 2000.
Actually it is the 0-based indexing that creates the trap that causes the programmer to make "off by one" errors.
From elementary school we are taught to count from 1 to 10. Then when we become programmers we are told that 0-based indexing is better. Better for whom?
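A sketch of the trap being described (array invented for illustration): a programmer whose habit is to count from 1 naturally writes the 1-based loop, which on a 0-based array skips the first element and runs one past the end.

    #include <stdio.h>

    #define N 4

    int main(void) {
        int scores[N] = {7, 3, 9, 5};

        /* The "natural" 1-based loop a count-from-1 habit produces:
         *
         *     for (int i = 1; i <= N; i++)
         *         printf("%d\n", scores[i]);
         *
         * It skips scores[0] and reads scores[4], one past the end
         * of the array: the classic off-by-one trap. */

        /* The 0-based loop the convention actually requires. */
        for (int i = 0; i < N; i++)
            printf("%d\n", scores[i]);

        return 0;
    }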
@Chi: That you would make a logic argument based on leap years, I find amusing.
Hmm... You say that I am wrong and that I am spamming the network. I say I am trying to have an intelligent conversation with hidebound pedagogues who seem not to understand the limitations of their own thinking. Am I wrong to point out that they are hidebound, or pedagogic in their statements? How many times have we made excuses for "logic" without really looking at what its limitations are?
The "Off by one error" can be seen as apology for the convention, but it wasn't the reason that the convention was created, merely a good excuse for keeping the convention once it was imbedded in everything programmers do.
In fact I'm talking about a decade. The decade starts with 01.01.2001 and _not_ at the end of 31.12.1999, as convention has it.
Also, I find it harder to count from 1 to 10 than, for example, from 1 to 4. And the number 4 has some meaning: 4 directions, 4 postures, water at 4 degrees, and 4 as the break-even of addition and multiplication (2 + 2 = 2 × 2).
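A small sketch of the 1-based grouping behind that decade argument (the helper is my own invention): with no year 0, decade N covers the years 10(N-1)+1 through 10N.

    #include <stdio.h>

    /* Which 1-based decade a year falls in. With no year 0,
     * decade N covers years 10*(N-1)+1 .. 10*N, so 2000 still
     * belongs to the decade that began in 1991. */
    static int decade_of(int year) {
        return (year - 1) / 10 + 1;
    }

    int main(void) {
        printf("1999 -> decade %d\n", decade_of(1999)); /* 200 */
        printf("2000 -> decade %d\n", decade_of(2000)); /* 200: same decade */
        printf("2001 -> decade %d\n", decade_of(2001)); /* 201: new decade  */
        return 0;
    }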
Well, I certainly can't comment on how you count... my views often don't count.
But what amused me about the fact that you were talking about a leap year is that leap years exist because of the assumption that the period in which the earth rotates on its axis has some bearing on how long it takes to revolve around the sun.
The calendar makers tried to force the calculations to jibe, but they were off by about a quarter of a day per year, so they cheated and put a correction line in the calendar: add one day every 4 years, skip it every 100, and add it back every 400.
That means that every four years there is an "off by one" error, and so on, for anybody who assumes the calendar is stable from year to year.
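For what it's worth, that correction as it stands in the Gregorian calendar looks like this (function name invented for the sketch):

    #include <stdbool.h>
    #include <stdio.h>

    /* The Gregorian "off by one" correction: add a day every
     * 4 years, skip it every 100, restore it every 400. */
    static bool is_leap_year(int year) {
        return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
    }

    int main(void) {
        printf("1900: %s\n", is_leap_year(1900) ? "leap" : "common"); /* common */
        printf("2000: %s\n", is_leap_year(2000) ? "leap" : "common"); /* leap   */
        printf("2024: %s\n", is_leap_year(2024) ? "leap" : "common"); /* leap   */
        return 0;
    }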