Have software engineering
ideas stagnated since the 70's? By the 70's we had:
Most new ideas are just rehashes or refinements of these. Nothing really new has come since. So get out your plaid pants and afro wigs and shake your booty.
Methinks you're going to play the game called 'NoTrueScotsman' regarding what counts as a 'NewIdea?'. There is no idea that is not built upon or derived from other ideas, and it is impossible to express a new idea except in terms of older and better-understood ideas. I see, below, that many of your bold counters rely upon the notion that the idea has antecedents. To be frank, so does everything on your list. Put away your afro wig and plaid pants... put on a toga.
-- Any true Scotsman knows that it's not an afro, it's a SeeYouJimmyHat?. :) By the way, I'm not in bold to SHOUT, just to distinguish myself from the person who was posting in italics. (More shortly, including hopefully some ViolentAgreement, the other WikiCliche? we can't go without.)
InformalHistoryOfProgrammingIdeas suggests "patterns" and UML are revolutionary. Patterns are merely an attempt to classify certain coding idioms. Giving names to something does not by itself make it new. (Plus, some people, such as PaulGraham, feel that patterns are signs of a lacking language or paradigm.) And UML diagrams have similarities to ideas that existed for decades. It is merely an attempt at standardization.
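Graham's point can be made concrete with a small, invented example: in a language with first-class functions, the Strategy pattern stops being a "pattern" at all and collapses into ordinary argument passing. (The function names and prices below are made up for illustration.)

```python
# Strategy "pattern" without the pattern: the pricing strategy is just a
# function passed as an argument, not a class hierarchy.

def total_price(items, pricing):
    """Sum the prices of items, using `pricing` as the strategy."""
    return sum(pricing(item) for item in items)

def standard(item):
    return item["price"]

def discounted(item):
    return item["price"] * 0.9  # 10% off

items = [{"price": 10.0}, {"price": 20.0}]
print(total_price(items, standard))   # → 30.0
print(total_price(items, discounted))
```

In a language without first-class functions, each strategy would need its own class implementing a common interface — which is exactly the boilerplate Graham argues reveals a gap in the language.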
Some might disqualify these as they are deemed not formal and fundamental enough, but then one could easily go the opposite way and exclude most of the list at the top, saying that only electricity and math are formal enough.
Perhaps the post-70s period seems to suffer an "idea slump" because we're still too close to it to identify the ideas that are becoming, or will become, influential.
And also because contemplated ideas may take years, even centuries, to become implemented ideas, and because the implementation of an idea may take on facets and twists not fully contemplated by the many original ideas that become part of its realization. A perfect illustration of this can be seen in the ideas and sketches of LeonardoDaVinci regarding manned flight and submersible ships. An IdeaImplemented? is most often a conglomeration of more than one IdeaConceived?.
But I have not even seen any decent candidates.
Only because you reject candidates that build upon, derive from, or advance ideas from pre-70s. You could do the same for pre-1870s. Every idea has antecedents.
Perhaps it could be said that the '40s through the '70s was when the key software engineering ideas we use today were identified, described, and recognized as fairly distinct and powerful ideas. Darwinian evolution was indeed hinted at many times before Darwin, but he "opened the book" on it. Same with Dr. Codd (relational).
[Very true. The ElderDays established the essential foundations of computing. The "no, ideas have not ceased" list is one of relatively narrow iterative refinement and commercial application of ElderDays foundations -- or outright obviousness. There is not an item in that list that does not rely on ElderDays foundations, and there are few (if any) ideas in the list that are equal in generality or significance to any ElderDays idea.]
[Every industry goes through a "Golden Age" of significant research, invention, and innovation, where genuinely new territory is explored. In the automotive industry, for example, its ElderDays were roughly from the late 1800s to the late 1930s. Virtually everything currently found in a modern car was either theoretically or practically explored at that time. Apparently-modern innovations like airbags, anti-lock braking, emissions controls, electronic fuel injection, lightweight materials, hybrid power, etc., are all iterative refinements of pre-existing foundational work, and only appear to be significant to those unfamiliar with the history of the foundations that made them possible. Current automotive "innovation" is little more than application of basic ideas, the majority of which are nearly a century old. Similarly, modern computing is little more than application of the basic ideas established in the ElderDays.]
By this reasoning there have been no significant ideas in math and physics for hundreds of years. I think you have to refine your condition for 'significance' somewhat, and you will realize that it's not as easy as you think. -- .gz
[There is a clear and obvious distinction between fundamental theoretical research and trivial technological development. In physics, for example, no one would confuse the development of (say) string theories with the invention of a better mouse trap. Yet, above, we have clear "mouse trap" entries like CascadingStyleSheets and ReallySimpleSyndication. Neither required extensive research or any significant intellectual effort. They might be somewhat clever applications of existing technology, but that does not make them important in any foundational sense. In scientific terms, they are trivial. This is not meant to diminish their industrial significance, but to regard CascadingStyleSheets as on par with (say) the RelationalModel is as ludicrous as treating the discovery of general and special relativity as on par with developing a slightly smaller portable MP3 player, or treating the invention of the internal combustion engine as equivalent to making a better windshield wiper. In terms of pervasive foundational significance, there is an order of magnitude difference between the work done in the ElderDays and the bag of gadgets listed above. Admittedly, there are some fuzzy areas -- like the "advances in TypeTheory" noted above.]
Many ideas of the 70s have borne fruit that is visible today, and a great many more have fallen by the wayside and been forgotten except in computer lore (UseNet, anyone?). You can expect the same of many ideas of the 80s, 90s, and today -- some will advance beyond what you currently imagine, and others will fade away. Heck, it isn't too late for old 70s ideas to fade away. I expect that the RelationalModel will be surpassed by ideas already present, including RDF and much of what is listed in WhatIsData... if only because computer agents (the future of 'Web3.0') must know what a 'tuple' means by context if they are to perform any sort of learning or DataMining across vast stores of information.
The author of the bracketed argument above is also grossly underestimating the relative 'intellectual effort' and 'ideas' that went into practical implementations of such things as RemoteProcedureCalls? and ReallySimpleSyndication. Fact is, a ballpark notion of where you want to be is just one idea... it takes finding and implementing detailed ideas to get there... ideas on how to combine one idea with another.
Such advances will probably gain real use no earlier than twenty years down the line. Users of languages evolve far more slowly than the languages and language theory currently do, and code-base inertia prevents any rapid change. However, I expect that the LanguageOfTheFuture will let you use any damn syntax you please (from Befunge to Occam, and even graphical programming like SmallTalk) and will be tightly integrated with the OS of the future. Look at LanguageOfTheFuture if you want a list of 'ideas' a long, long way from implementation. Just glancing at one -- the ExoKernel -- shows an idea that's been around since circa 1994 but not yet in common use, one that very well might be the future design-path of all OperatingSystems. And there are wild ideas, too, like KillMutableState. We don't know where all of them are going, which will turn out to be duds... and which will disappear for fifty years only to appear again when another, newer idea makes them practical.
The separation of data and presentation was well implied at least as far back as 1967, by D. L. Childs in "Description of a Set-Theoretic Data Structure". I don't deprecate the notion of CSS or the broader principle it represents. I simply reject the notion that it is anything new or innovative, or that it represents the starting point of separating data and presentation. Such notions have been implicit best practice for decades, with early browsers and HTML actually being a step backward in terms of separating data and presentation.]
[You young'ns seeking worthwhile, significant, and theoretical innovations would be wise to review some history, starting with reading some of the classic papers in computer science. You'll be surprised to see how little is new, and how much is simply old wine in new bottles.] A new bottle IS a new idea. An idea is just a way of looking at, packaging, or combining other ideas, after all.
[As for RDF, or the content of WhatIsData, having any significance at all, let alone surpassing the RelationalModel... I'll believe it when any of said content becomes the basis of a working multi-terabyte OLTP database or is echoed in a Knuth volume. Until then, I'm not holding my breath.]
I would propose a list of post-70s ideas that would look like this (from my not-so-small references repository):
just ask for more -- .gz
I'll add wait-free atomic containers (heaps, lists, sets, maps, etc.). Those will be damn important when data is distributed widely on nodes across a network... since waiting on a lock held by a node that stops talking is ridiculous and insane. Oh, and I'll add all the newer theories regarding Network Survivability (which constitutes far more than failure tolerance... it constitutes resistance to -attack- and -natural disaster-) and Disruption Tolerant Networks. Many of those ideas will be necessary even in software engineering and protocols of the future, since overlay networks will become more and more common.
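To make the contrast with lock-based waiting concrete, here is a minimal sketch of the idea behind such containers: a Treiber-style stack whose push and pop optimistically build the new state and retry a compare-and-swap, rather than blocking on a lock that a stalled node might hold forever. (This toy simulates CAS in Python purely for illustration; real implementations use the hardware CAS instruction, and a Treiber stack is lock-free rather than strictly wait-free.)

```python
import threading

class CASCell:
    """Toy compare-and-swap cell. Real CAS is a single atomic hardware
    instruction; the internal lock here only simulates that atomicity."""
    def __init__(self, value=None):
        self._value = value
        self._guard = threading.Lock()

    def load(self):
        return self._value

    def compare_and_swap(self, expected, new):
        """Atomically set the value to `new` iff it is still `expected`."""
        with self._guard:
            if self._value is expected:
                self._value = new
                return True
            return False

class _Node:
    __slots__ = ("item", "next")
    def __init__(self, item, nxt):
        self.item, self.next = item, nxt

class LockFreeStack:
    """Treiber stack: push/pop retry a CAS instead of waiting on a lock,
    so one stalled participant never blocks the others."""
    def __init__(self):
        self._head = CASCell(None)

    def push(self, item):
        while True:  # retry loop -- a failed CAS means someone else progressed
            old = self._head.load()
            if self._head.compare_and_swap(old, _Node(item, old)):
                return

    def pop(self):
        while True:
            old = self._head.load()
            if old is None:
                return None  # empty stack
            if self._head.compare_and_swap(old, old.next):
                return old.item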
This list is more worthy than one that considers such things as WebCams and Blogs to be examples of fertile idea-smithing and productive intellectual toil. Some day, portions of your list might become as influential and foundational as the work done in the ElderDays.
Time as Judge
I reject your position. This page isn't PostSeventiesWorthyIdeaSlump?. And you are not the correct judge as to whether a particular idea is worthy... only society and time will determine whether an idea sees use or becomes influential. And the vast majority of ideas cannot be foundational. WebCams and Blogs could just as easily influence future ideas (including the manner in which associated or similar software is engineered) as those from TypeTheory or any given model or approach to computation or information storage or processing.
It took about 15 or 20 years before most of the listed ones were clear shiners. Thus, if this pattern continues, '80s ideas should start being revered by now. But they are not. Was the '80s just a coincidental gap? (I know I try to forget that decade :-)
Maybe not coincidental. I imagine that the 80s was a period of idea assimilation (that, and the microcomputer was a new hit wonder), and the 90s were loaded with ideas regarding how to use the tool new to the majority of humanity: TheInternet. Both times were loaded with ideas and derivative ideas, making practical some of the concepts merely fancied in the seventies and earlier. Your dismissal of the intellectual effort that went into such products and their influence today is in error, but such ideas are harder to point at or put in a list. The devil is in the details when you're the one implementing them.
[My "dismissal", as you call it, is not one of error vs. non-error, but simply a reiteration of an observation of evolution in an enduring industry. See, for example, http://lambda-the-ultimate.org/node/2059
The term "Elder Days" or "Golden Age" or whatever exists because it is commonly recognised that the computer industry, like many other industries, goes through distinct phases:
- "Elder Days" -- a phase of numerous pervasive innovations and breakthroughs, academic fertility, research, and theoretical development, but limited access (or interest) except to an intellectual elite. Such a period is rich in novel ideas, where "novel" is unambiguously recognised as such. This is clearly an "idea" phase, characterised by academic growth.
- post-"Elder Days" -- a period of persistent technical implementation and wide-scale accessibility, but relatively little theoretical research, innovation or novel ideas. However, the definition of "novel" (or "ideas"!) may be subject to debates like this one. This is clearly a "building" phase, characterised by commercial growth.
- Stasis -- a period of stability, entrenchment and iterative refinement, with relatively little technical or theoretical development. This is a "maintenance" phase, characterised by commercial consolidation and academic stagnation.
If computing follows the pattern of the automotive industry -- which has arguably reached stasis -- it will reach a point of negligible new development, where minor tweaks are heralded as breakthroughs by marketing departments, but no one (outside of Marketing or naive observers) would claim the ideas are, well, ideas. Of course, some theoretical innovation or discovery (to a limited degree, work on these always continues) may be sufficiently revolutionary to spark a new "Elder Days", and thus renew the cycle.
It must be emphasised that I do not deprecate any of these phases, nor do I attempt to put one above another in some fashion; I merely wish to highlight the fact that there is a PostSeventiesIdeaSlump (end of "Elder Days"), but it has been balanced by a PostSeventiesImplementationBoom (post-"Elder Days"). There is clearly a qualitative difference between the nature of the ideas spawned in the ElderDays vs. most of the "ideas" since. While it's difficult to articulate precisely what makes the development of (say) the B-Tree or Prolog different from developing a Web camera or CSS in terms of "idea-ness", that difference is unquestionably there.]
"but no one (outside of Marketing or naive observers) would claim the ideas are, well, ideas." -- So you're saying that those who argue with you are either in Marketing or are naive observers? How very kind of you.
There are, of course, qualitative differences in how the ideas are applied and among those who recognize the ideas. However, I'm not all that convinced there is a qualitative difference in the development of the ideas, or their idea-ness. Nor am I convinced that the world of computation science has entered a stasis, at least among fields involving automated theorem proving and type theory, network survivability, and disruption tolerance. It seems to have entered a new phase of pre-implementation work on OS design, compiler optimizations, and HCI, as people try to get past rather stagnant forms of the same (i.e. there's a lot of talk in these fields about what ought to be done, but little actual change).
As far as phases go... development of new ideas slows down when building atop older ideas only because it takes people a long time to master the old ideas. There's a lot of educational territory to cover before you reach the frontier. What you seem to be looking for aren't new ideas, but new 'revolutionary breakthroughs' -- ideas (e.g. models and theories) that, by themselves, open entirely new fields (new frontiers) of study, even if they aren't at all practical until someone starts advancing them through the more normal evolutionary development.