Well, as I write this, Tek-Tips went down. Maybe this is an omen?
Now that the site is back up, I'll refrain from including the offending Javascript (ColdFusion) code I was handed when it died on me. LOL! And no, I don't think it was a coding error.
Gosh, I wish I could recall in which thread I was doing the particular whining that you refer to, MasterRacker. And yes, "whining" is my word, not yours.
I can rail on about C and C-derived languages' syntaxes (and semantics) at length, from a number of different angles. But this thread has taken a particular direction already, and in the interest of time I'll try to stick fairly close to it.
One of the nastiest things about C is its basic syntax. Some of this has already been covered eloquently above by CajunCenturian, along with several points regarding some really good features of C-like syntax.
A lot of what we now take for granted in C was really done as it was in order to make it easy for the compiler to generate something close to good PDP-11 machine code - without a heavy optimizing compiler. As many of us know, C was an evolution from the earlier B and BCPL programming languages. More of the history is available at:
Goofy constructs like the iteration statement syntax and the pre/post increment/decrement statement syntax are fairly clear examples. I actually like the inc/dec statements, but I wish they had been done more generally and less machine-dependently, as for example in Burroughs Algol developed in the 1960s (and still alive today by the way):
A:=A+1
Can be represented by:
A:=*+1
And:
A:=A+(B/C)
Can be represented by:
A:=*+(B/C)
The last example may seem less useful, but shows that incrementing by other than 1 is easily done as in:
A:=*+3;
A:=*-SEGLENGTH
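Just for comparison - and this is only a quick illustrative sketch, with made-up variable names - here is roughly how those same updates get spelled in C and its descendants. The compound-assignment operators cover the general case, while inc/dec stays its own special little beast:

#include <stdio.h>

int main(void)
{
    int a = 0, b = 12, c = 4, seg_length = 8;

    a = a + 1;       /* the long form                      */
    a++;             /* the special-cased increment        */
    a += 1;          /* compound assignment, general form  */

    a += b / c;      /* roughly Algol's A:=*+(B/C)         */
    a += 3;          /* A:=*+3                             */
    a -= seg_length; /* A:=*-SEGLENGTH                     */

    printf("%d\n", a);
    return 0;
}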
But the thing that gives me fits in C-style languages is the way compound statements are handled.
You'd almost think these Bell guys were nothing but untutored hacks. It's like they'd seen Fortran, taken a peek at Algol, and then constructed a sort of "cargo cult" copy of what they thought they'd seen.
So they said "hey, let's have semicolons." And "hey, let's have BEGIN/END compound statements and blocks" but "I'm too freakin lazy to type, let's use {/} instead of BEGIN/END." There seemed to be a general love of obscurity among these guys, which made them feel plenty 1337 using funky characters, but this is no big deal in itself.
How is:
A[SomeIndex+2]=A[SomeIndex+2]+((SomeBool)?3:17)
Better than:
A[SOMEINDEX+2]:=*+(IF SOMEBOOL THEN 3 ELSE 17)
... I ask ya? And yes, Burroughs Algol can use mixed case, it just isn't case-sensitive. But we won't go there right now.
These Bell guys seemed to have totally missed the point that the semicolon character is a statement separator and not a statement terminator (such as Cobol's use of the period). As a result we end up with a rule in C-style languages that reads something like:
"You need a semicolon when you need one. You can't have one when you can't have one."
Things got so ugly in Netscape's Javascript, with all its hack practitioners, that when ECMA firmed up the definition of ECMAScript (the current reference for implementations such as Javascript and JScript) they put in some funny rules. ECMAScript processors must basically be built so that, when the script doesn't "make sense," the processor backs up and takes a stab at what the author might have meant to write if they knew the language rules.
"7.8.1 Rules of automatic semicolon insertion
"When, as the program is parsed from left to right, a token (called the offending token) is encountered that is not allowed by any production of the grammar and the parser is not currently parsing the header of a for statement, then a semicolon is automatically inserted before the offending token if one or more of the following conditions is true:
"1. The offending token is separated from the previous token by at least one LineTerminator.
"2. The offending token is }.
"When, as the program is parsed from left to right, the end of the input stream of tokens is encountered and the parser is unable to parse the input token stream as a single complete ECMAScript Program, then a semicolon is automatically inserted at the end of the input stream."
:
:
The full gory details can be seen at:
But all of it ends up at one conclusion: C is a nasty, poorly designed little gutter language. It is very unfortunate that newer languages were derived from it without cleaning up many of the worst little nasties C embodies.
However, C is a fact of life that is hard to ignore. Like Microsoft's dominance - we may not like it all the time, but fighting it is like... spitting... into the wind. I write a goodly amount of C, C++, Java, and Javascript/JScript myself. I'm not a big fan, but it pays the bills, right?
My complaint isn't so much with C et al. as it is with cleaning up all of the nasty, hack code written by people who don't at least take the time to understand the language properly and write decently formatted, reasonable code. Maybe I'm biased because I don't deal with a lot of throwaway code. I deal with stuff that was written years ago and patched and repatched multiple times. Some examples are around a decade old (and still growing hairier in the ears as we sit here).
My real gripe with C and sons is that it allows (and encourages) people to write some really poor quality source code. And since they can't read their own code a year or two later, I have to stop and help them unravel it myself because "I just can't figure out where I'm going wrong here."
I see this with Cobol, Fortran, Algol, Pascal/Delphi, VB, PL/1, Ada, various scripting languages (save me from Perl, please!) - but the biggest headaches are always with stuff done using C and sons. Thankfully APL is all but dead.
If I had to choose (and had the opportunity), I think Delphi/Kylix is the best choice among commercial languages right now. Borland has gotten some things right. Too bad it represents such a small niche in the market.
I'll refrain from going off on a tangent about the abuse of pointers, lack of bounds-checking, random-assed implicit type coercion, clunky explicit memory management or goofy garbage-collection schemes, and the general "drunk on objects" philosophy of so many C++/Java programmers right now.
I hope the rant was at least amusing, and even more that some of it was informative. To quote Roger Murtaugh in those Lethal Weapon movies, maybe "I'm gettin' too old for this s..."
;-)