
C is a big 0

Name: Anonymous 2017-08-04 4:47

https://softwareengineering.stackexchange.com/questions/110804/why-are-zero-based-arrays-the-norm

1-based counting was ``the norm'' for thousands of years. Years begin at 1. The Bible counts chapters and verses from 1. Programming languages before C started arrays from 1 or a user-defined index.

Only a few answers mention that some programming languages let you start from 1. The page should be filled with answers saying ``1-based arrays used to be the norm and C hackers came along and forced everyone to use the C way because they get confused if you let them choose the index.'' Stupid questions do not bother me, but wrong answers do. Stack Overflow and Stack Exchange are spreading wrongness into the world. They are reducing the amount of truth on the Internet.

They all say that arrays count from 0 and nobody can change it because it would ``confuse'' people. This is the C mentality. They want to force their 0 on thousands of years of human history and on every non-C-based programming language. They want everyone else to cater to them because they are too dumb. Pascal programmers can learn and use C. They don't like it, but they can get used to it. C programmers don't want to use Pascal because it's not C.

Stop catering to the idiots. They are not good programmers if they get confused by simple concepts like array base. Kids using QBasic and Pascal can understand it, but these C ``expert hackers'' can't. We should stop dumbing down our languages and debasing important computer science concepts because some people are stupid.

Name: Anonymous 2017-08-12 20:34

>>50
You're basically reiterating the Stack Overflow answers. The thread is about how C is dumbing people down and how Stack Overflow is full of these dumbed-down people who have never used arrays that don't start at 0.

I believe you should be able to choose the lower bound for your array, and that 1 makes more sense as a default because the number of elements and the last index are the same (unlike C's ``one byte past the end'' rule, which standardizes off-by-one counting and prevents trapping errors as early as possible). It is only C and C-based programmers who say choice is bad because it confuses them. If you want to see what is actually confusing, compare the definition of arrays in ISO C and ISO Pascal. Compare the whole languages while you're at it.
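
Here is a minimal C sketch of the point above (the arrays and values are made up for illustration): with a 0 base the element count and the last index disagree by one and the loop bound names a position past the array, while faking a 1 base keeps the count and the last index equal.

    #include <stdio.h>

    int main(void)
    {
        /* 0-based, half-open interval [0, 5): 5 elements, but the last
           valid index is 4 and the loop bound names a position one past
           the end of the array. */
        int a[5] = {10, 20, 30, 40, 50};
        for (int i = 0; i < 5; i++)
            printf("%d\n", a[i]);

        /* Faked 1-based, closed interval [1, 5]: waste slot 0 so the
           element count and the last index are both 5. */
        int b[6] = {0, 10, 20, 30, 40, 50};
        for (int i = 1; i <= 5; i++)
            printf("%d\n", b[i]);

        return 0;
    }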

> I suspect that the reason non-mathematicians and other lay people chose 1 to represent the first year in the calendar,
Mathematicians counted from 1. FORTRAN counted from 1 because it was made for mathematicians and scientists (now it lets you choose the index). Computer scientists counted from 1. You will see one-based counting and one-based indexing in any CS paper from the 60s and 70s. This was when all of the innovation happened. They did so many things that we couldn't or don't want to do today because the C and UNIX semantics and way of thinking make it too complicated, if not impossible.

Einstein once said ``We cannot solve our problems with the same level of thinking that created them,'' but the tragedy is that a lot of problems were already solved before C came around to create them. Universities are ignoring the solutions because they're older than the problems. The solutions are also very simple, and people who like complex solutions don't like hearing that changing the loop and the array to make them more like Fortran, Algol, Pascal, and BASIC (all languages older than C) can eliminate a lot of problems instantly.

Since all of these smart people chose to count from 1 with closed intervals, and the ``same level of thinking'' that brought us C, C++, Java, and these other garbage languages is based on 0 and half-open intervals, I think counting from 1 is superior. Call that an ``appeal to authority,'' but I think the way they thought about computer science concepts has a lot to do with why they made such better programming languages.

> the first day in the month, etc., is because these were invented before zero was invented.
They chose 1 because they are counting days from the beginning of the month. The first day is 1, the second day is 2, etc.

> (Recall also that there is no Roman numeral to represent zero)
There is. It's simply an empty field. Roman numerals come from counting objects, and when there are no objects, there are no marks.

> The rest is down to tradition, which is hard to break.
C programmers have no respect for tradition unless it is Bell Labs hacker tradition. They don't even respect counting. The choice of 1-based arrays and traditional (counting with a step) loops was based on thousands of years of mathematical and human tradition.
