Why do computers, operating systems, and programming languages so often count from zero rather than one? The convention isn’t arbitrary: in memory, an array index is really an offset from the array’s starting address, so the first element naturally sits at offset zero. Read on as we dig into the historical and practical reasons behind zero-based numbering.
Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.
The Question
SuperUser reader DragonLord is curious about why most operating systems and programming languages count from zero. He writes:
What historical reasons exist for this, and what practical advantages does counting from zero have over counting from one?

Why indeed? As widespread as the practice is, surely there are practical reasons for its implementation.
The Answer
SuperUser contributor Matteo offers the following insights:
If an array is stored at a given position in memory (it’s called the address), the position of each element can be computed as

element(n) = address + n * size_of_the_element

If you consider the first element the first, the computation becomes

element(n) = address + (n - 1) * size_of_the_element

Not a huge difference, but it adds an unnecessary subtraction for each access.
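To make that arithmetic concrete, here is a minimal C sketch (our illustration, not part of Matteo’s answer) showing that a zero-based index is exactly the offset used to locate an element, while one-based indexing pays for an extra subtraction on every access:

```c
#include <stdio.h>

int main(void) {
    int a[5] = {10, 20, 30, 40, 50};

    /* Zero-based: the index n IS the offset from the array's address,
       so element n lives at address + n * sizeof(int). */
    for (int n = 0; n < 5; n++) {
        printf("a[%d] = %d at %p\n", n, *(a + n), (void *)(a + n));
    }

    /* One-based indexing would need an extra subtraction on every
       access: address + (n - 1) * sizeof(int). */
    for (int n = 1; n <= 5; n++) {
        printf("element %d = %d\n", n, *(a + (n - 1)));
    }
    return 0;
}
```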
Edited to add:
The usage of the array index as an offset is not a requirement, just a habit. The offset of the first element could be hidden by the system and taken into consideration when allocating and referencing elements. Dijkstra published a paper, “Why numbering should start at zero” (pdf), where he explains why starting with 0 is a better choice. Starting at zero allows a better representation of ranges: a sequence of N elements is written as the half-open interval 0 ≤ i < N, so the length of a range equals its upper bound minus its lower bound, and adjacent ranges meet at a shared boundary without overlapping.
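As a small illustration of Dijkstra’s half-open-interval argument (again our sketch, not from the original answer), notice how zero-based, half-open ranges make lengths and splits fall out naturally:

```c
#include <stdio.h>

/* Print the half-open range [lo, hi). With zero-based indexing the
   number of elements printed is simply hi - lo, and [0, n) covers an
   n-element array exactly. */
static void print_range(const int *a, int lo, int hi) {
    for (int i = lo; i < hi; i++)
        printf("%d ", a[i]);
    printf("\n");
}

int main(void) {
    int a[6] = {1, 2, 3, 4, 5, 6};
    int n = 6, mid = n / 2;

    print_range(a, 0, n);    /* whole array: length n - 0 = n      */
    print_range(a, 0, mid);  /* first half                         */
    print_range(a, mid, n);  /* second half: meets the first at mid,
                                no gap and no overlap at the seam  */
    return 0;
}
```

Splitting at any midpoint never repeats or skips an element, which is exactly the property Dijkstra highlights.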
If you’re looking to delve deeper into the answer, the Dijkstra paper is an informative read.

Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.