Why Lisp

The most amazing thing is what Lisp ended up being, out of pure serendipity. JohnMcCarthy was going after a universal function that could compete with, or replace, a TuringMachine.

But when S.R. Russell noticed that eval could serve as an interpreter for LISP, and promptly coded it -- against JohnMcCarthy's advice and wishes -- we ended up with a very special programming language, in which code is always data and data can be code, and where a simple interpreter and a handful of primitives define a powerful language:

 (define (eval exp env)
   (cond ((self-evaluating? exp) exp)
         ((variable? exp) (lookup-variable-value exp env))
         ((quoted? exp) (text-of-quotation exp))
         ((assignment? exp) (eval-assignment exp env))
         ((definition? exp) (eval-definition exp env))
         ((if? exp) (eval-if exp env))
         ((lambda? exp)
          (make-procedure (lambda-parameters exp)
                          (lambda-body exp)
                          env))
         ((begin? exp)
          (eval-sequence (begin-actions exp) env))
         ((cond? exp) (eval (cond->if exp) env))
         ((application? exp)
          (apply (eval (operator exp) env)
                 (list-of-values (operands exp) env)))
         (else
          (error "Unknown expression type - EVAL" exp))))

 (define (apply procedure arguments)
   (cond ((primitive-procedure? procedure)
          (apply-primitive-procedure procedure arguments))
         ((compound-procedure? procedure)
          (eval-sequence
           (procedure-body procedure)
           (extend-environment (procedure-parameters procedure)
                               arguments
                               (procedure-environment procedure))))
         (else
          (error "Unknown procedure type - APPLY" procedure))))

Understanding this "universal function" is equivalent to understanding a programming language that would take 400 pages to describe.

There seem to be about 400 helper functions missing from this program, though - it appears someone has abstracted them away! Actually, no. But think about it for a little longer...

To prove that McCarthy had little intention of doing this, take a look at what he wrote (http://www-formal.stanford.edu/jmc/history/lisp/lisp.html):

"Another way to show that LISP was neater than TuringMachines was to write a universal LISP function and show that it is briefer and more comprehensible than the description of a universal Turing machine. This was the LISP function eval[e,a], which computes the value of a LISP expression e - the second argument a being a list of assignments of values to variables. (a is needed to make the recursion work). Writing eval required inventing a notation representing LISP functions as LISP data, and such a notation was devised for the purposes of the paper with no thought that it would be used to express LISP programs in practice. Logical completeness required that the notation used to express functions used as functional arguments be extended to provide for recursive functions, and the LABEL notation was invented by Nathaniel Rochester for that purpose. D.M.R. Park pointed out that LABEL was logically unnecessary since the result could be achieved using only LAMBDA - by a construction analogous to AlonzoChurch's Y-operator, albeit in a more complicated way."

Which goes to show that he had no vision or plan to create the monster that he did: it just happened out of pure luck. (PaulFeyerabend lives!) This statement is especially enlightening:

"Writing eval required inventing a notation representing
LISP functions as LISP data, and such a notation was
devised for the purposes of the paper with no thought
that it would be used to express LISP programs in practice."

Thank God someone took his original notation seriously and literally... otherwise, the LispLanguage as we know it might never have existed, and humanity would have been deprived of its power, flexibility and simplicity.

In terms of impact, GeneticProgramming, most AutonomousAgent technology, ArtificialIntelligence and all of its branches (LogicProgramming, Rule-Oriented, NaturalLanguageProcessing, ArtificialNeuralNetworks, GameProgramming, etc.), Hybrid or Multi-Paradigm programming, AspectOrientedProgramming, MetaObjectProtocols, and a great deal of programming language research might never have been possible if it weren't for Lisp.

Literally, we would not be where we are in programming if it weren't for a crazy guy who took a programming language research paper literally and did a very dumb thing:

wrote some code in a short amount of time.

The rest was generated by this little "historical accident".

Once code is data, there is no need for separate ASTs (AbstractSyntaxTrees). Lisp notation is basically a condensed form for writing ASTs.
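To make that concrete, here is a minimal CommonLisp sketch (the variable name *exp* is invented for illustration): a quoted expression is an ordinary list, so the usual list operations walk the "AST" directly, and eval runs it:

```lisp
;; A quoted expression is ordinary list data...
(defparameter *exp* '(+ 1 (* 2 3)))

(first *exp*)   ; => +          (the operator, as a symbol)
(second *exp*)  ; => 1
(third *exp*)   ; => (* 2 3)    (a nested "AST node" -- just another list)

;; ...and the very same data can be executed as code.
(eval *exp*)    ; => 7
```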

What that means is that Lisp is the natural choice when you want to modify the meaning of the language, or do things like meta-programming. Think about it: it is easy to generate data from a program, right? Well, if code is data, then it is just as easy to generate code. Things like GeneticProgramming are easy with Lisp.
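As a small CommonLisp sketch of that idea (make-expression is invented for illustration, not part of any GP library): a program can construct a new expression as a list, rewrite it, and evaluate the result -- the basic move a GeneticProgramming system makes when it mutates and re-runs candidate programs:

```lisp
(defun make-expression (op)
  "Build a fresh piece of code -- just a list -- around the given operator."
  `(,op 10 2))

(eval (make-expression '+))     ; => 12

;; "Mutating" a program is just rewriting a list before evaluating it.
(let ((exp (make-expression '+)))
  (setf (first exp) '*)         ; mutate + into *
  (eval exp))                   ; => 20
```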

Also, parsing "other languages" is easy with Lisp. It is no coincidence that most ACLs (AgentCommunicationLanguage?s) are Lisp-based, because the "verb/data exchanges" can simply be interpreted on either side. There is no need for SAX-like parsers that waste cycles on serialization and deserialization -- simply take the stream and interpret it. And if it is not exactly Lisp, get inspired by Lisp, create an eval-like function, and interpret the "new language". In that sense, eval is like a "universal SAX parser". Wouldn't you like to have one of those?
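For instance (a minimal CommonLisp sketch; the message format is invented): READ already acts as that universal parser -- one call turns a textual message into structured data, with no grammar or parser generator in sight:

```lisp
(defun parse-message (text)
  "Turn a textual s-expression message into a Lisp list using READ."
  (with-input-from-string (stream text)
    (read stream)))

(let ((msg (parse-message "(tell :sender agent-a :content (price widget 42))")))
  (values (first msg)                  ; => TELL     (the "verb")
          (getf (rest msg) :sender)))  ; => AGENT-A
```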

You still have to serialize and deserialize the byte stream, unless you're sending raw object code over the network.

On the meta-programming side, imagine this: you can add any level of filtering, execute extra actions, change the interpretation, or give new meanings to anything that is interpreted or "evaluated", just by changing eval. So if you want to keep meta-data or do meta-processing on your programs, there is nothing as simple as Lisp.
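As a toy illustration in CommonLisp (traced-eval is a made-up name; a real system would hook its own evaluator rather than wrap the built-in one), adding a meta-behaviour such as logging is just a matter of interposing on evaluation:

```lisp
(defun traced-eval (exp)
  "Evaluate EXP like EVAL, but log each top-level expression first."
  (format t "~&evaluating: ~S~%" exp)
  (eval exp))

(traced-eval '(+ 1 2))   ; logs "evaluating: (+ 1 2)", returns 3
```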

In Lisp, for example, it is incredibly easy to do AspectOrientedProgramming, because you can control at every point what happens at the join points and cut points of functions... everything goes through eval.
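A minimal before/after-advice sketch in CommonLisp (add-logging is invented for illustration; a production system would use something richer): because a function is reachable through its symbol, you can interpose on every call without touching the callee:

```lisp
(defun add-logging (name)
  "Wrap the function named NAME so every call is logged -- poor man's advice."
  (let ((original (symbol-function name)))
    (setf (symbol-function name)
          (lambda (&rest args)
            (format t "~&before ~S ~S~%" name args)        ; "before" advice
            (let ((result (apply original args)))
              (format t "~&after ~S => ~S~%" name result)  ; "after" advice
              result)))))

(defun square (x) (* x x))
(add-logging 'square)
(square 5)   ; logs before/after, returns 25
```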

The AspectJ tool does whole-program analysis and transformation during the Java compilation phase; the aspects are guaranteed to be applied throughout the program. This would be harder to accomplish in typical interactive Lisp development. (And anyway, the "define-method" facilities only work for CLOS calls.) To achieve the policy features provided by AspectOrientedProgramming, a Lisp programmer is more likely to add a new control structure, along the lines of "with-" or "call-with-".

If that wasn't enough, Lisp comes with a myriad of nice features such as lambdas (anonymous first-class functions), closures (functions that capture their defining environment), powerful macros (that can generate anything: data, functions, classes, patterns, or even other macros), GenericFunctions (methods that cut across multiple classes in CLOS) and, by extension, MOPs; and with a wide variety of powerful Open Source Lisp source code scattered throughout the planet. (The Lisp community was perhaps the first community to embrace Open Source.)
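For example, closures in a few lines of CommonLisp -- each call to make-counter captures a fresh lexical environment, so every counter keeps its own independent state:

```lisp
(defun make-counter ()
  "Return a closure that increments and returns its own private counter."
  (let ((n 0))
    (lambda () (incf n))))

(let ((c1 (make-counter))
      (c2 (make-counter)))
  (funcall c1)    ; => 1
  (funcall c1)    ; => 2
  (funcall c2))   ; => 1  (independent state)
```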

In my opinion, Lisp is the most powerful and flexible language; in fact, I would say that once you understand Lisp, other languages seem very constraining, underpowered and unnecessarily complicated.

Also, Lisp programmers were probably the first "agile developers" on the planet: as far back as the mid-60s, many of the practices that we know today as "agile practices" or even "extreme programming" were already in use.

Moved to AgileLisp since the page already exists and may attract a different audience anyway.

Sorry for my longish post -- I could rave and rant all day about the powers of Lisp.

I'll leave you with a thought. Here is a simple example that extends Lisp to do Object-Oriented programming. Try to do something similar in C, C++, Java or even Smalltalk ;-)

 (defmacro define-class (class inst-vars class-vars &body methods)
   "Define a class for object-oriented programming."
   ;; Define constructor and generic functions for methods
   `(let ,class-vars
      (mapcar #'ensure-generic-fn ',(mapcar #'first methods))
      (defun ,class ,inst-vars
        #'(lambda (message)
            (case message
              ,@(mapcar #'make-clause methods))))))

 (defun make-clause (clause)
   "Translate a message from define-class into a case clause."
   `(,(first clause) #'(lambda ,(second clause) .,(rest2 clause))))

 (defun ensure-generic-fn (message)
   "Define an object-oriented dispatch function for a message,
   unless it has already been defined as one."
   (unless (generic-fn-p message)
     (let ((fn #'(lambda (object &rest args)
                   (apply (get-method object message) args))))
       (setf (symbol-function message) fn)
       (setf (get message 'generic-fn) fn))))

 (defun generic-fn-p (fn-name)
   "Is this a generic function?"
   (and (fboundp fn-name)
        (eq (get fn-name 'generic-fn) (symbol-function fn-name))))
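This code is from Norvig's ParadigmsOfArtificialIntelligenceProgramming, and it relies on two more small helpers from that book (rest2 and get-method, reproduced below on the assumption they are not loaded). With those in place, usage looks like this; the account class is the book's own example:

```lisp
;; The two small helpers define-class relies on (from PAIP):
(defun rest2 (x)
  "The rest of a list after the first two elements."
  (rest (rest x)))

(defun get-method (object message)
  "Ask an object for the closure implementing the given message."
  (funcall object message))

;; Usage: a class with instance variables NAME and BALANCE, two methods.
(define-class account (name &optional (balance 0)) ()
  (withdraw (amt) (if (<= amt balance)
                      (decf balance amt)
                      'insufficient-funds))
  (balance  ()    balance))

(let ((acct (account "A. User" 100)))
  (withdraw acct 40)   ; => 60
  (balance acct))      ; => 60
```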

-- MikeBeedle

I still fail to understand, though, how a LispCompiler? can interpret data *cough* code which is created at runtime, at the same speed as original compile-time code. I suppose I could look it up somewhere. -- MatthewAstley, DeleteWhenCooked

Well, if it is CommonLisp, the compiler is part of the runtime. So you can dynamically create functions, compile them (or not), and run them just like any other function...
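For example, in CommonLisp COMPILE is an ordinary function, so code constructed at runtime can be native-compiled on the spot and runs just as fast as anything compiled ahead of time:

```lisp
(let* ((source '(lambda (x) (* x x)))   ; code built at runtime, as data
       (fn (compile nil source)))       ; compile it to native code, right now
  (funcall fn 12))                      ; => 144
```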

Thanks. I thought it was more of a "translate to CeeLanguage, pass through GCC" kind of thing (see CeeAsAnIntermediateLanguage).

While there are projects to implement CL on top of a C compiler (see ECL), that behaviour is the exception, not the rule. There are, however, very good CL compilers (see CMUCL for a free one) which will compile CL to AssemblyLanguage, generally within a factor of 2 of a good optimizing C compiler. This is pretty amazing, considering how much more power you get in CL compared to C. To put it another way: a lot of people like languages such as Tcl/Perl/Python because their abstractions let them code much faster than they would in C. However, the "abstraction penalty", if you want to call it that, is sometimes quite high, often two or even three orders of magnitude. So often people optimize by going back and rewriting critical bits in C or C++, which is a bit of a mess. With CL, you get more power and abstraction than these languages, and with appropriately written code you can have very little penalty. Why this hasn't sunk in for more people I am not sure (actually, that's a fib --- I have some ideas). [WhyNotLisp?]

(I'm tempted to drag this ThreadMode chunk off to the CommonLispWiki. Comments? How about http://www.cliki.net/WhatCompilersDo) my comment: how about not creating StudlyCappedPages? on CLiki? LispIsNotCamelCase - they stick out like StickyOutThings? in our otherwise quite readable page naming convention

Anyway, back to compilation. I understand (ish) the "text to list of atoms" reading step. Does the compilation before execution resemble in any way the JustInTimeCompilation of ByteCode? Maybe at this point I should just start hacking.

See also: ImplementingLisp, CommonLispWiki
