
C is a big 0

Name: Anonymous 2017-08-04 4:47

https://softwareengineering.stackexchange.com/questions/110804/why-are-zero-based-arrays-the-norm

1-based counting was ``the norm'' for thousands of years. Years begin at 1. The Bible counts chapters and verses from 1. Programming languages before C started arrays from 1 or a user-defined index.

Only a few answers mention that some programming languages let you start from 1. That page should be filled with answers saying ``1-based arrays used to be the norm and C hackers came along and forced everyone to use the C way because they get confused if you let them choose the index.'' Stupid questions do not bother me, but wrong answers do. Stack Overflow and Stack Exchange are spreading wrongness into the world. They are reducing the amount of truth on the Internet.

They all say that arrays count from 0 and nobody can change it because it would ``confuse'' people. This is the C mentality. They want to force their 0 on thousands of years of human history and on every non-C-based programming language. They want everyone else to cater to them because they are too dumb. Pascal programmers can learn and use C. They don't like it, but they can get used to it. C programmers don't want to use Pascal because it's not C.

Stop catering to the idiots. They are not good programmers if they get confused by simple concepts like array base. Kids using QBasic and Pascal can understand it, but these C ``expert hackers'' can't. We should stop dumbing down our languages and debasing important computer science concepts because some people are stupid.

Name: Anonymous 2017-08-12 15:42

0 makes sense as the very start of the array because it is the additive identity. Twice as far along the array as the start is still the start (2 × 0 = 0), and it makes sense to reflect this in the number you choose to index it.

When converting a row/column pair to a single index into the array, it makes more sense to use y*stride+x (everything 0-based) or (y-1)*stride+(x-1) (1-based coordinates mapped to a 0-based index) than the forms a 1-based result requires, (y-1)*stride+x or y*stride+x+1.

In 0-based arrays, an index is before the start of the array if and only if it is negative. In 1-based arrays, 0 is before the start too.

Many people have made the pointer argument so I will omit that.

0-based arrays allow for interesting conveniences, e.g. the cardinality of an array is also the index you write to when adding another element to it (in resizeable array abstractions).

I don't care what you Pascal loonies think is "the right way", but the 0 way is far more convenient in every conceivable respect.

I suspect that the reason non-mathematicians and other lay people chose 1 to represent the first year in the calendar, the first day in the month, etc., is that these were invented before zero was. The Gregorian calendar is a tiny adjustment to the Julian calendar, which appeared 43 years before the concept of zero was first recorded. (Recall also that there is no Roman numeral to represent zero.) The rest is down to tradition, which is hard to break.
