- compilers and interpreters
- virtual machines
- lexical analysis
- parsing
- context-free grammars and regular languages
- symbol tables
- code generation
- scheduling algorithms
- resource allocation
- memory hierarchies
- filesystems (including LoggingFileSystems)
- Algorithms and Complexity
- Database Theory
- Semantics of Programming Languages
- Computer security theory (e.g. CapabilitySecurityModel)
- cryptography
- Computational models, including functional models like the LambdaCalculus as well as process calculi such as CommunicatingSequentialProcesses
- network analysis such as InternetTrafficIsFractal

- The diligent reader may find the answer in EwDijkstra's manuscripts, maybe not in such a "user friendly" form as it may evolve on a healthy wiki. -- CostinCozianu

- If you know about Algorithms and Complexity, you won't promise a customer an optimal scheduling algorithm or any other computationally intractable result. You'll understand the notion of good-enough solutions to intractable problems (e.g. redefining the problem slightly to make it tractable).
- If you know about Algorithms and Complexity, you won't choose a bubble sort for a 100,000,000 entry list. It occasionally happens that the libraries and built-in sorted collections in some languages don't use an algorithm appropriate to your problem: for example, you're provided a run-of-the-mill quicksort, but you have a 99%-sorted linked list of 100,000,000 items.
- If you know about compilers, you might know why someone wrote a C loop like this: "while(*ptr++ = *ptr2++) {}", and why it's probably no longer a worthwhile optimization. This kind of code is commonly found in the real world, written by programmers who don't understand what modern compilers do.
- If you know about scheduling, you might understand why your multithreaded program sometimes deadlocks and sometimes doesn't.
- If you know about resource allocation, you might understand why using garbage collection finalizers to close file handles is a bad idea.
- If you know about virtual machines and compilers, you might understand why "java -server" behaves differently from "java -client", which even the average programmer is likely to run into at one time or another.
- If you know about parsing, you can write programs that process complex streams and text files of data, and you'll know when ambiguities arise and how to deal with them.
- If you know about database theory, you know how to normalize databases, you know the importance of constraints, you know when to denormalize if necessary and the implications thereof, and you know how to phrase efficient, optimizable queries.
- If you know about filesystems, you can repair corrupted filesystems.

It was written,

The dictionary definition:

- The application of scientific and mathematical principles to practical ends such as the design, manufacture, and operation of efficient and economical structures, machines, processes, and systems.

This feeds a notion that things are changing so fast that one can be ever-developing and never produce a finished product. During development, new innovations make the application archaic before it can be productionized. This is why software is often rushed into production before it is fully tested and finished: so that at least the previous innovations can be used for a little while before they are deprecated and replaced within a few months (soon to be weeks, then days).

See also IsComputerScience JanuaryZeroSix CategoryScience CategoryDiscussion

View edit of September 14, 2007 or FindPage with title or text search