shouldn't be presented as an issue. It should be presented in terms of solving the abstraction between the humans and the machines. SyntacticSugar
is more often good, as it improves productivity.
Bytes are SyntacticSugar for Bits.
Assembly Language is the SyntacticSugar for Machine Code.
C is a SyntacticSugar for Assembly Language.
C++ is a SyntacticSugar for C.
Java is a SyntacticSugar for C++.
C# is a SyntacticSugar for C/C++, with a hint of Java and Pascal.
Objective C is a SyntacticSugar for C, with Smalltalk-style messaging.
Google Web Toolkit is SyntacticSugar for JavaScript.
JSON can be considered SyntacticSugar for XML.
HTML can be considered SyntacticSugar for Rich Text Files or LaTeX.
SyntacticSugar, per se, is not bad.
Really, it all gets compiled down to 1's and 0's eventually anyway :)
If "all programming languages are syntactic sugar", what about programming languages that are never compiled, only interpreted? Certainly it would be possible to design a language in which there wasn't a consistent 1-to-many mapping. For instance, imagine an operator whose meaning varied depending on the time of the program's execution: in the morning it means >, in the evening it means <.
Further, what about Pseudocode then? Some people use very rigorous pseudocode (that could almost be a real language suitable for compilation), yet that code is never executed by a binary computer. However, it can still serve to communicate your intentions to another human.
What about languages based solely on mathematical notation? Or logical notation? They exist. Is all of math and logic just syntactic sugar? Is ALL language just syntactic sugar? I mean, suppose we define a subset of English, limit it so that we don't use things like contractions or improper grammar (in fact, limit the grammatical constructs), and then call that a programming language. It's still clearly English.
Is English then syntactic sugar? Something about that statement makes me think we're wrong here.
[All language is syntactic sugar, both written and spoken. Words are mere symbols that represent complex ideas. When any idea takes too many words to express, we create a new word that stands for that complex expression, which then becomes the definition of that word. Now the word alone will suffice; it becomes syntactic sugar for the idea it now expresses. The process of creating syntactic sugar is abstraction, the key to speaking, thinking, and programming.]
Surely after all that's been said on this and related pages, it's time to come up with a really specific definition for "syntactic sugar", rather than just take extreme positions like yours while taking the definition for granted.
[There's nothing extreme about my position.]
- Your comments dismiss what I'm saying. That's inappropriate. I suggested that it's time to make a definition. You blew that off.
- I said "extreme" because you said "all language is..." -- it's an extreme position to make any sweeping statement about all language. Linguistics and cognitive science are still in a relatively early stage of understanding the nature of language, and any such generalization is in danger of ignoring all that.
[I see that as no more a sweeping statement than "all water is wet". I wasn't trying to dismiss what you were saying; I just don't think all sweeping generalizations are extreme.]
You are implicitly taking the position that "syntactic sugar" is a synonym for "symbolic expression". I don't think that is the most widely assumed definition.
[No I'm not, nor did I say that. I didn't even mention symbolic expressions. I said words are symbols, which they are.]
- Tell me the distinction you are making between "symbol" and "symbolic expression".
[I was using symbol in the common English usage of the word, not in the programming sense. Words are symbols, but that doesn't extrapolate to syntactic sugar equalling a symbolic expression, at least not to me.]
Also, you said "mere", which is a danger signal. Words like "mere" and "just" are dismissive, implicitly claiming their referents are insignificant, without direct examination of whether they are or not.
[Don't over analyze it.]
- It's not "over analyzing", it's the simple, literal truth, as I first noticed when I was 15 years old, and as many others have pointed out since. Tell me the difference between the meaning of "Words are mere symbols" versus "Words are symbols".
[Context. I didn't mean words are nothing but mere symbols ever, I simply meant it in the context of this topic, maybe it was a bad choice of words.]
Also, I find it interesting that "all languages [are] syntactic sugar..." What, pray tell, is English candy-coating? It's a fairly well-accepted fact that people who speak different languages think differently when they "think" in that language. Bilingual people everywhere confirm this. So language is not just "syntactic sugar" for thought. What exactly is this "root form" of communication that Japanese, English, Cantonese, and Portuguese are all covering up? Without language, anything beyond the most basic of communication is not possible.
[Different languages offer different abstractions; so what? How is that at all relevant? People who can't speak the same language can still communicate; they just revert to weaker syntactic sugar like motioning and pointing, nowhere near as efficient as words, but effective nonetheless.]
- But they can't communicate as much with gestures (actual sign language is a different question, and irrelevant since it's just another language like English), by a factor of about a million. You can't ask another programmer how to implement a linked list using gestures.
[That all depends on context too; people at meat markets quite often communicate just fine with nothing more than gestures. But I don't see how that relates to the topic at all; could you explain more?]
You're going to have a hard time convincing me that abstraction is syntactic sugar, because by every definition of syntactic sugar I've seen, it can't represent new concepts, just present current ones in a more compact or convenient fashion.
[Abstraction is the process of simplifying something by removing irrelevant details. Syntactic sugar is essentially the same. Function declarations are sugar for the more complex process of pushing variables onto the stack; void fun(arg) is an abstraction. Take any syntactic sugar you want: it's an abstraction for something, for some process that you don't want to have to do or think about, so you invent sugar.]
- True! But this contradicts other things you said above, now that you have given a more reasonable definition for "syntactic sugar". All language is not a matter of simplifying by removing irrelevant detail, it is the opposite, it is a means of expressing an arbitrary amount of detail beyond what can be expressed without language.
[I didn't say language is "only" syntactic sugar, so I don't see any contradiction. If I said all humans are animals, that doesn't mean that all humans can't also be mammals. Language is sugar for thoughts and ideas; it's easier for me to say "that red Mustang" than to describe the entire car. Language compresses information and allows better communication. It's sugar to be able to just declare a function, because if I had to explain pushing variables onto a stack every time I wanted to express that idea, I'd never get anything done.]
If you're saying this is the case, how do we teach our children concepts like abstraction? To claim concepts like this are fundamental to the human brain and therefore a priori is problematic, because not everyone is equally good at it, and some people can barely think in the abstract.
Everyone can think abstractly quite easily; some are just better at it than others. Children learn like everything else learns: through experience. When does a child understand time? They figure out the concept via experience. Watch them scream at the checkout line when you take the candy bar away to pay for it, because they don't understand that you'll give it back. But eventually they figure it out; they start thinking into the future; they grasp time. I'd venture to say abstraction is a fundamental concept to all brains, human or not.
- is obviously not a bad thing, although overused sometimes.
- demonstrates that we code for humans. If it were not the case, we'd all just use 1's and 0's (makes for nice small keyboards :)
Nonsense. There is a difference in facilities a language may provide, between those which provide trivially-different ways of expressing behavior and those which provide deep semantics and significantly simplify the code.
What are we arguing about?
Do the people on opposing sides of this argument mean the same thing by "syntactic sugar"?
All languages are eventually translated to machine code, and therefore all language constructs could be termed (extremely useful) "sugar" around the operations of the machine.
A popular notion, but untrue. True syntactic sugar is free of semantics. The parts of a language furthest from "syntactic sugar" are those where the language implementation performs valuable semantic checking (e.g. for type errors) and/or semantic transformations, and the grammar supports those semantics.
The whole point of the term "syntactic sugar" is to denigrate comparatively useless syntactic structures, not to denigrate all of them; that devalues the term to the point of uselessness.
The increment operator x++ in C is syntactic sugar; it is merely shorthand for x = x + 1.
- Actually, both expressions have a value, but they are different.
int x = 5;
int y = (x++);
results in x == 6 and y == 5.
int x = 5;
int y = (x = x + 1);
results in x == 6 and y == 6.
In other words, x++ has very specific semantics that are different from x = x + 1. You might be able to argue that ++x is syntactic sugar, because it does mean the same thing as x = x + 1, although it does have a higher level of precedence than either the = or + operator.
If I remember correctly, the things I have read about optimizing C code say that the post-increment, pre-increment, and binary-addition operators are all represented differently when they become machine-code. I have never specifically checked for this behavior though.
This reasoning misses the point. If C did not have the ++ operator (nor the += operator) then you would write x = x + 1, and so in that sense it is merely a shorthand syntactic sugar, even though, yes, once the shorthand was introduced, some subtleties came along with it.
You are taking the stance that, for something to be sugar/shorthand, it has to pass the LiskovSubstitution test, but I don't see why that is required.
Actually I was demonstrating that the example given does not meet the criteria "... free of semantics."
Ah. You're certainly right about that. Probably that means that my phrase "free of semantics" was excessive.
it all gets compiled down to 1's and 0's eventually
I apologize if I am interpreting a lighthearted comment as a serious statement, but the point of programming languages is to aid in getting the correct 1s and 0s written.
In my observation there are generally two kinds of developers: one set who tends to think linguistically, and the other who tends to think in terms of data structures, or at least a bunch of bins (variables) and ropes (pointers). The latter will tend to look at code and think of an AbstractSyntaxTree as a data structure to be operated on by the interpreter/compiler, while the former will see the language as a language directly. The latter tend to dismiss syntax as a means to an end, while the former tend to see the language as an important communication facilitator between developers and machines (or other team members).
A song from MaryPoppins comes to mind: a spoonful of sugar helps the medicine go down.
Programming is syntactic sugar, get out your soldering irons and do it all in hardware as it was meant to be :-p