What's nice about humans is that if someone uses the wrong form, like a feminine form for a table, people around them will still understand what they're talking about. Some might think the language is being ruined, but that's OK.
When it comes to programming languages, things are different. A programming language is supposed to be an exact science: if you don't use the syntax correctly, the code won't execute.
Or so we'd like to think.
var i = 0;
do {
    i++;
} while (i < 2);
alert("done:" + i);
Now, let's make a tiny change (like using the feminine form in Hebrew, which is only slightly different from the masculine form):
var i = 0;
do {
    i++;
}; while (i < 2);
alert("done:" + i);
What changed? Notice the semicolon (;) before the "while" statement. Try this in your favorite browser. If you still get "done:2" as before, I urge you to replace your favorite browser immediately.
This piece of code shouldn't work. It has a syntax error. The semicolon is "unexpected".
Internet Explorer ignores it, and the code works as before. Other browsers (Firefox, Chrome) halt the script with a syntax error.
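You don't even need a browser to see the two behaviors side by side; here's a small sketch (the eval() wrapper is my own illustration, not from the original snippet) that feeds the malformed code to the engine and records what happens:

var source = "var i = 0; do { i++; }; while (i < 2);";
var result;
try {
    eval(source);
    result = "no error"; // old IE's JScript engine would take this path
} catch (e) {
    result = e.name;     // a spec-compliant engine refuses to parse it
}
alert(result);           // "SyntaxError" everywhere except old IE

Running this in any modern engine (including today's Edge) reports "SyntaxError", which is exactly the strict behavior I'd expect from an exact science.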
I'm willing to forgive people who don't follow Hebrew syntax down to the last rule, but when it comes to code, I'm unforgiving. Moreover, I'm unforgiving toward the people who implemented a "fuzzy" interpreter for a programming language. This is supposed to be an exact science.
Let me give you another example. In this case, I'm not sure which implementation I agree with:
alert([1, 2, 3, ].length);
alert([1, , 2, 3].length);
Firefox and Chrome print "3" and then "4", while Internet Explorer prints "4" twice. At first glance, each array has one "empty" element, so you might expect their lengths to be the same. But Firefox and Chrome drop that element when it is a single trailing comma at the end of the array — and here the ECMAScript specification is on their side: a single trailing comma in an array literal does not add to the array's length, whereas a comma elided in the middle does create a hole that counts.
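You can poke at the difference between a trailing comma and a genuine hole yourself; this sketch (using alert(), in keeping with the examples above) shows how a spec-compliant engine treats each case:

var a = [1, 2, 3, ]; // single trailing comma: ignored by the spec
var b = [1, , 2, 3]; // elision in the middle: a hole at index 1

alert(a.length);     // 3
alert(b.length);     // 4
alert(1 in b);       // false - the hole is truly absent, not set to undefined
alert(b[1]);         // reading it still yields undefined

The "1 in b" check is the interesting part: a hole is not an element whose value is undefined, it's an index that was never assigned at all.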