Programming languages devised by DouglasHofstadter to demonstrate, IIRC, GoedelsIncompletenessTheorem and the ChurchTuringThesis. Hofstadter described them as follows: *BlooP, FlooP, and GlooP are not trolls, talking ducks, or the sounds made by a sinking ship -- they are three computer languages, each one with its own special purpose.* FlooP belongs to that class of languages (?TuringTarpit), such as UnLambdaLanguage, which provide a bare minimum of facilities while remaining TuringComplete; BlooP, deliberately, stops just short of TuringCompleteness. Trying to do something non-trivial in such a language can be an interesting exercise.
Here is a sample program. Note that **<=** is the assignment operator, and **output** represents the return value of the routine:

 define procedure ''factorial'' [N]:
 block 0: begin
   if N < 0, then: quit block 0;
   output <= 1;
   cell(0) <= 0;
   loop N times:
   block 1: begin
     cell(0) <= cell(0) + 1;
     output <= output * cell(0);
   block 1: end;
 block 0: end.

The syntax is admittedly verbose, and the facilities are primitive. The only data type is a (potentially unlimited-precision) unsigned integer, and the only operators are addition, multiplication, less-than/greater-than comparison, and equality. You can do subtraction, if you're willing to work a little:

 define procedure ''minus'' [m,n]:
 block 0: begin
   if m < n, then: quit block 0;
   loop at most m + 1 times:
   block 1: begin
     if output + n = m, then: abort loop 1;
     output <= output + 1;
   block 1: end;
 block 0: end.

BlooP does not have unbounded loops, and so can only implement what are known as PrimitiveRecursive functions, i.e. functions that can be expressed as combinations of loops with fixed iteration limits. It's still possible to do interesting work in BlooP, though. The following code returns the next PrimeNumber after *n*:

 define procedure ''divides?'' [d,n]:
 block 0: begin
   output <= 0;
   kd <= d;
   loop at most n times:
   block 1: begin
     if kd = n, then: output <= 1;
     kd <= kd + d;
   block 1: end;
 block 0: end.

 define procedure ''prime?'' [n]:
 block 0: begin
   output <= 1;
   d <= 1;
   loop at most n - 2 times:
   block 1: begin
     d <= d + 1;
     if divides?[d,n], then: output <= 0;
   block 1: end;
 block 0: end.

 define procedure ''prime-beyond'' [n]:
 block 0: begin
   output <= n + 1;
   loop at most n times:
   block 1: begin
     if prime?[output], then: abort loop 1;
     output <= output + 1;
   block 1: end;
 block 0: end.

FlooP adds the **mu-loop**, which is equivalent to Java's **while(true) {...}**, and so can solve problems outside BlooP's reach, such as computing the AckermannFunction.

There are still problems FlooP cannot solve, and Hofstadter proposed a mythical language, GlooP, that could solve them. But then he concludes, *In fact it is widely believed that there cannot be any more powerful language for describing calculations than languages that are equivalent to FlooP. This hypothesis was formulated in the 1930's by two people, independent of each other: AlanTuring ... and AlonzoChurch, one of the eminent logicians of this century. It is called the "ChurchTuringThesis". If we accept this thesis, we have to conclude that GlooP is a myth -- there are no restrictions to remove in FlooP, no ways to increase its power by "unshackling" it, as we did BlooP.*

B/FlooP, IIRC, does not support recursion. If it did, the beer song could be implemented without subtraction:

 define procedure ''beer'' [i, limit]:
 block 0: begin
   if i < limit, then:
   block 1: begin
     output <= beer[i+1, limit];
     print [i, "Bottles of beer."];
   block 1: end;
   print [i, "Bottles of beer on the wall,"];
   print [i, "Bottles of beer,"];
   print [i, "You take one down, pass it around,"];
   if i = 1, then: print ["No more bottles of beer on the wall."];
 block 0: end.

-- David Brantley

The JavaLanguage didn't exist when GoedelEscherBach was published, but the numbered blocks correspond nicely to Java's labelled statements. **quit block** *n* and **abort loop** *n* could be implemented with a labelled **continue** and **break**, respectively. The only local variables are **output** and a (possibly unlimited) array called **cell**, reminiscent of the unnamed local variables in the PlanKalkuel.

A PerlLanguage implementation is available at the RetrocomputingMuseum.

Hofstadter's syntax calls for a pair of single quotes on either side of the procedure name in its declaration. The Wiki engine converts the name to italics.

Surely BlooP is not TuringComplete. It can't emulate FlooP, for example, and the HaltingProblem is solvable for BlooP. *It's a one-liner, even: print "halts" :-)*

 - While BlooP isn't Turing-complete, it could emulate all FlooP programs which complete within a specified large amount of time on a finite-speed computer. This may include all programs that interest anyone but mathematicians.
 - Suppose that the computer can complete a billion (10^9) operations per second, and you want to emulate any FlooP program which terminates in less than 31 years (a bit less than a billion seconds, in which 10^18 operations are possible). You could replace all mu-loops with "loop at most 1000000000000000000 times:". The translated BlooP program would only diverge from the FlooP original if the FlooP program required more than 10^18 operations (taking over 31 years).
 - If you want to get ridiculous, you could loop at most 10^(10^9) times, and get a "practically infinite" loop. Close enough for me, anyway. :-) -- Bloop

*"Practically infinite" is right - you've made a nice analogy between practice and theory, Bloop. If in practice the program uses all 10^18 operations, then in practice I would say the program ran forever! In practice it didn't halt, and (presumably) in theory neither did the FlooP program it was simulating. -- LexSpoon*

Nevertheless, DouglasHofstadter comments somewhere that it is quite hard to design a language in which one can do useful work, yet which is not Turing Complete. BlooP is such a language. This is what makes it interesting and remarkable.

*In practice, we get impatient with our Turing-complete non-terminating loops anyway, and press BREAK or something. Especially if we don't know how long the program was going to take. We just don't know what N is going to be until we *do* press BREAK. Then there's the risk that you might press BREAK immediately before the thing would have printed its answer.*

 - See David Turner's paper "Total Functional Programming" (google for it).
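The bounded-loop translation discussed in the comments above can be sketched in Python (an illustrative stand-in for BlooP/FlooP; the function names and the particular bound are my own, and trial division uses Python's % operator for brevity): a FlooP-style mu-loop search for the next prime, next to its BlooP-style version with a fixed iteration limit.

```python
def is_prime(n):
    # Trial division; Python's % stands in for a BlooP divides? loop.
    return n >= 2 and all(n % d != 0 for d in range(2, n))

def prime_beyond_floop(n):
    candidate = n + 1
    while True:                  # FlooP mu-loop: no bound declared up front
        if is_prime(candidate):
            return candidate
        candidate += 1

def prime_beyond_bloop(n, bound=10**6):
    candidate = n + 1
    for _ in range(bound):       # BlooP: "loop at most bound times"
        if is_prime(candidate):
            return candidate
        candidate += 1
    return 0                     # bound exhausted: only here do the two diverge
```

The two functions agree whenever the FlooP search finishes within `bound` steps; for this particular problem, Bertrand's postulate guarantees a prime between *n* and 2*n*, which is why the page's prime-beyond can get away with "loop at most n times".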

If we added PimcPiflPire to FlooP, we might have GlooP. Of course, it is impossible to implement PimcPiflPire on any hardware that we can either build or simulate...

The four basic operations used in BlooP/FlooP are:

- Adding two natural numbers
- Multiplying two natural numbers
- Checking if two numbers are equal
- Checking if one number is less than or greater than another number
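These four primitives plus bounded loops are enough to build the missing operations. As a sketch (Python standing in for BlooP syntax; only addition, equality, and comparison from the list above are used), here is the page's ''minus'' procedure:

```python
def minus(m, n):
    # Natural-number subtraction without a subtraction operator.
    if m < n:                 # comparison: BlooP quits block 0 with output = 0
        return 0
    output = 0
    for _ in range(m + 1):    # "loop at most m + 1 times"
        if output + n == m:   # addition and equality do the real work
            break             # "abort loop 1"
        output += 1
    return output
```

The bound m + 1 is safe because the answer can never exceed m.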

It is not true that Hofstadter's formulation of BlooP included only an unsigned integer type. He also describes a Boolean type with values YES and NO, used in predicate procedures distinguished by the "?" suffix on the procedure name.
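In Python terms (a hedged stand-in: "?" is not legal in Python identifiers, so divides? and prime? become divides_p and prime_p, with True/False for YES/NO, and a small n < 2 guard added that the page's prime? lacks), the predicate procedures look like:

```python
def divides_p(d, n):
    # divides? [d, n]: does d divide n evenly?
    kd = d
    for _ in range(n):        # "loop at most n times"
        if kd == n:
            return True       # YES
        kd = kd + d
    return False              # NO

def prime_p(n):
    # prime? [n], with an extra guard for n < 2
    if n < 2:
        return False
    for d in range(2, n):     # "loop at most n - 2 times", d from 2
        if divides_p(d, n):
            return False
    return True
```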

CategoryProgrammingLanguage

Last edited November 4, 2009.