
Hofstadter on Lisp (1983)

373 points | 1 year ago | gist.github.com
susam (1 year ago)

In case anyone else is confused by what the functions named "oval" and "snot" mean in the following example:

  > (cond ((eq (oval pi) pie) (oval (snot pie pi)))
  (t (eval (snoc (rac pi) pi))))
I realised after a few seconds that they are meant to be "eval" and "snoc". The above code should instead be written as:

  (cond ((eq (eval pi) pie)
         (eval (snoc pie pi)))
        (t (eval (snoc (rac pi) pi))))
This article has been a fascinating read, by the way. Kudos to the maintainer of the Gist post. I am also sharing these corrections as comments on the Gist post.

EDIT #1: Downloaded a copy of the original Scientific American article from https://www.jstor.org/stable/24968822 and confirmed that indeed the functions "oval" and "snot" are misspellings of "eval" and "snoc".

EDIT #2: Fixed typo in this comment highlighted by @fuzztester below.

fuzztester (1 year ago)

>confirmed that indeed the functions "oval" and "snot" are misspellings of "eval" and "snot".

Correction of your correction:

confirmed that indeed the functions "oval" and "snot" are misspellings of "eval" and "snoc".

And I guess snoc is cons reversed and rac is car reversed.

susam (1 year ago)

> Correction of your correction

Thanks! Fixed.

> And I guess snoc is cons reversed and rac is car reversed.

Indeed! That's exactly how those functions are introduced in the article. Quoting from the article:

> The functions rdc and snoc are analogous to cdr and cons, only backwards.
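To make the reversed names concrete, here is a rough Python sketch (Python rather than Lisp, purely for illustration) of car/cdr/cons alongside their mirror images rac/rdc/snoc, treating Python lists as Lisp lists; the argument order of snoc here is an assumption, not taken from the article:

```python
def car(xs):   # first element of a list
    return xs[0]

def cdr(xs):   # everything after the first element
    return xs[1:]

def cons(x, xs):  # prepend x to the front
    return [x] + xs

# The mirror-image operations discussed above:
def rac(xs):   # last element (car, "only backwards")
    return xs[-1]

def rdc(xs):   # everything before the last element (cdr, backwards)
    return xs[:-1]

def snoc(x, xs):  # append x to the end (cons, backwards)
    return xs + [x]

print(rac([1, 2, 3]))      # 3
print(rdc([1, 2, 3]))      # [1, 2]
print(snoc(4, [1, 2, 3]))  # [1, 2, 3, 4]
```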

Xen9 (1 year ago)

hinkley (1 year ago)

OCR maybe?

lionkor (1 year ago)

maybe LLM "reformat/rewrite this"?

bsder (1 year ago)

This article simply reinforces that the primary problem with the popularity of Lisp was people explaining Lisp.

This article, like every other Lisp article, tells pre-teen me nothing that he could use. Nobody ever demonstrated how much easier task X is in Lisp over asm/C/Pascal/etc.

By contrast, current me could have told pre-teen me "Hey, that spell checker that took you 7 months to write in assembly? Yeah, it's damn near trivial in Lisp on a microcomputer with bank-switched memory that nobody ever knew how to utilize (it makes garbage collection completely deterministic even on a woefully underpowered CPU). Watch."

I want to weep over the time I wasted doing programming with the equivalent of tweezers, rice grains and glue because every Lisp article and textbook repeated the same worn out lists, recursion and AI crap without ever demonstrating how to do anything useful.

troupe (1 year ago)

Common Lisp: A Gentle Introduction to Symbolic Computation might be useful for the context you are describing. https://www.cs.cmu.edu/~dst/LispBook/

_19qg (1 year ago)

Practical Common Lisp https://gigamonkeys.com/book/

bsder (1 year ago)

Didn't exist back then. Likewise SICP first edition was 1996.

I did have a copy of "LISP: A Gentle Introduction to Symbolic Computation" by Touretzky in 1986. It wasn't really that much better than any of the articles. It never explained why using Lisp would be so much easier than anything else even for simple programming tasks.

Had some of the Lisp hackers deigned to do stuff on the piddly little micros and write it up, things would look a whole lot different today.

Maybe there was a magazine somewhere doing cool stuff with Lisp on micros in the 1980-1988 time frame, but I never found it.

_19qg (1 year ago)

Generally I agree with what you are saying. I live outside the US, so this stuff from the late '70s / early '80s was very remote. Fortunately we had a well-connected university where I got in contact with some of the more interesting stuff in the mid '80s.

The book I found most useful in the early days as an introduction to Lisp and programming with it was LISP by Winston & Horn. The first edition was from 1981 and the second edition from 1984. I especially liked the third edition.

https://en.wikipedia.org/wiki/Lisp_(book)

Lisp on microcomputers in the early '80s was mostly not useful - that was my impression. I saw a Lisp for the Apple II, but it was very barebones. Next was Cambridge Lisp (a version of Standard Lisp) on the Atari ST. That was more complete, but programming with it was a pain. Still, I found the idea of a very dynamic & expressive programming language and its interactive development style very interesting. The first useful implementations on smaller computers I saw were MacScheme and Coral Lisp, both for the Macintosh, mid '80s...

There were articles about Lisp in the Byte magazine early on, but having access to the software mentioned was difficult.

The early use cases one heard of were: computer science education, functional programming, generally experimenting with new ideas for writing software, natural language processing, symbolic mathematics, ... None of this was especially attractive to a wider audience. David Betz's XLisp later made Lisp more accessible, and it was then used in AutoCAD as an extension language: AutoLisp.

Luckily, starting in the mid '80s, I had access at the university to the incoming stream of new research reports, and there were reports about various Lisp-related projects, theses, etc.

keithwinstein (1 year ago)

The first edition of SICP came out in the fall of 1984 (a year after these Hofstadter columns). This fall is the 40th anniversary!

bsder (1 year ago)

I stand corrected on that. Thanks.

abecedarius (1 year ago)

Hofstadter's followup article had a more interesting example.

When I first got a Byte magazine as a pre-teen, one of the articles was Lisp code for symbolic differentiation and algebraic simplification. I really couldn't follow it but felt there was something intriguing there. Certainly it wouldn't have been easier in Basic.

(Byte September 1981, AI theme issue. Later I was able to tell the code was not so hot...)

I didn't really get into Lisp until the late 80s with XLisp on a PC, and SICP. Worth the wait!

oaktowner (1 year ago)

I just love his writing so much -- he captures what I felt when I discovered Lisp. As a kid learning programming in the 80s, I had already done some BASIC, Fortran, Pascal and COBOL in high school and early college. There were differences, of course, but they had some fundamental commonality.

At UC Berkeley, however, the first computer science class was taught in Scheme (a dialect of Lisp)...and it absolutely blew me away. Hofstadter is right: it feels the closest to math (reminding me a ton of my math theory classes). It was the first beautiful language I discovered.

(edit: I forgot to paste in the quote I loved!)

"...Lisp and Algol, are built around a kernel that seems as natural as a branch of mathematics. The kernel of Lisp has a crystalline purity that not only appeals to the esthetic sense, but also makes Lisp a far more flexible language than most others."

Jeff_Brown (1 year ago)

Have you tried Haskell? It feels much closer to math to me. Definitions, not procedures. It even looks like math.

medo-bear (1 year ago)

Maybe Haskell is more like Bourbaki math, whereas Lisp is more like Russian-style math (à la Vladimir Arnold). I prefer the latter tbh, and I come to programming from a maths background. We are all different. Lisp to me is yet to be surpassed in terms of ergonomics when transferring my thoughts into computer code.

xanderlewis (1 year ago)

Interesting. How would you characterise each (Bourbaki and Russian-style)?

oaktowner (1 year ago)

No! After about 10 years of writing software professionally, I moved over to product management, and my time spent coding decreased drastically (in the last 15 years, only some Python to show my kids a thing or two).

But I'd love to try! Maybe I'll take an online class for fun.

Jeff_Brown (1 year ago)

I can't recommend it highly enough. You're already familiar with laziness from Lisp, but purity is another head-trip. It made me a better programmer in any language, and even a better software architect before I've written a line of code.

And algebraic data types make it possible to make your code conform to reality in ways that classes can't. Once you're exposed to them, it's very much like learning about addition after having been able to multiply for your whole life. (In fact that's more than a metaphor -- it's what's happening, in a category theoretic sense.)

Haskell has other cool stuff too -- lenses, effect systems, recursion schemes, searching for functions based on their type signatures, really it's a very long list -- but I think laziness, purity and ADTs are the ones that really changed my brain for the better.
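A rough Python approximation of the sum-type idea (illustrative names; Haskell's ADTs are far more ergonomic than this sketch): the type declares that a Shape is exactly a Circle or a Rect, and the handling code treats each variant explicitly.

```python
from dataclasses import dataclass
from typing import Union
import math

@dataclass
class Circle:
    radius: float

@dataclass
class Rect:
    width: float
    height: float

# A sum type: a Shape is a Circle OR a Rect, and nothing else.
Shape = Union[Circle, Rect]

def area(s: Shape) -> float:
    # Every variant is handled explicitly; no hidden "other" case.
    if isinstance(s, Circle):
        return math.pi * s.radius ** 2
    if isinstance(s, Rect):
        return s.width * s.height
    raise TypeError(f"not a Shape: {s!r}")

print(area(Rect(2, 3)))  # 6
```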

sourcepluck (1 year ago)

Have you tried Coalton? It's a Common Lisp library that adds Haskell-esque (or near-Haskell) type wonders and smoothly interoperates with your Common Lisp code.

Your comment is great though, consider me convinced. I've done a bit of messing with Lisp, but I really would like to try writing something in Haskell, or slog through a book or two, some day.

pmarreck (1 year ago)

Do you ever code just for fun?

tightbookkeeper (1 year ago)

Personal anecdote: I got a lot more out of lisp that stuck with me than Haskell. Occasionally I say "oh this is a monad" or think about a type signature, but that's about it.

nxobject (1 year ago)

At the risk of diverging from the original post, I also think that calling it "math" might make things a bit murky (and this is coming from someone who wanted to be an algebraic topologist!)

It _is_ an elegant and minimal expression of a style of programming that is ubiquitous among dynamically-typed, garbage-collected languages. And it's a "theory" in the sense that it seems complete, and that you can think of ways to solve problems in Scheme and translate them into other dynamically-typed languages and still end up with an elegant solution. Emphasis on the elegant (since minimal, wart-free, consistent and orthogonal, etc.).

Scheme was a simplification and a "cleaning up" compared to conventional Lisps of the time (lexical scoping, single shared namespace for functions and variables etc.)

furyofantares (1 year ago)

40 years ago, and 20 years into the field:

> February, 1983

> IN previous columns I have written quite often about the field of artificial intelligence - the search for ways to program computers so that they might come to behave with flexibility, common sense, insight, creativity, self awareness, humor, and so on.

This is very amusing to me because it reads like a list of things LLMs truly stink at. Though at least they finally represent some nonzero amount of movement in that direction.

zyklu5 (1 year ago)

You must interact with more interesting people than I because to me LLMs have demonstrated as much "common sense, insight, creativity, self awareness, humor" as the average person I run into (actually maybe more but that makes me sound crazy to myself).

ska (1 year ago)

Eliza effect is alive and well, also.

ableal (1 year ago)

Way back someone observed that the problem was not computers thinking like people, but people thinking like computers.

I believe we've been getting a bit of the latter.

ska (1 year ago)

His research group has a long history of trying to tackle these problems. Some interesting reading, even if much of it hasn't (yet?) panned out.

kevindamm (1 year ago)

This article, and the two companion articles it mentions, can be found in the book "Metamagical Themas" [0] in chapters 17-19, along with all the other articles that appeared in this Scientific American series.

[0]: https://www.goodreads.com/book/show/181239.Metamagical_Thema...

(the book's title is the name of the article series, which originated as an anagram of "Mathematical Games," the article series that Martin Gardner authored, also published in Scientific American, and which Hofstadter then took over)

antitoi (1 year ago)

I love this book. Highly recommended for all lovers of Gödel, Escher, Bach (his classic).

onemoresoop (1 year ago)

This book is indeed very beautiful to read and look at; it has a lot of fascinating illustrations.

susam (1 year ago)

> Attempting to take the car or cdr of nil causes (or should cause) the Lisp genie to cough out an error message, just as attempting to divide by zero should evoke an error message.

Interestingly, this is no longer the case. Modern Lisps now evaluate (car nil) and (cdr nil) to nil. In the original Lisp defined by John McCarthy, indeed CAR and CDR were undefined for NIL. Quoting from <https://dl.acm.org/doi/pdf/10.1145/367177.367199>:

> Here NIL is an atomic symbol used to terminate lists.

> car [x] is defined if and only if x is not atomic.

> cdr [x] is also defined when x is not atomic.

However, both Common Lisp and Emacs Lisp define (car nil) and (cdr nil) to be nil. Quoting from <https://www.lispworks.com/documentation/HyperSpec/Body/f_car...>:

> If x is a cons, car returns the car of that cons. If x is nil, car returns nil.

> If x is a cons, cdr returns the cdr of that cons. If x is nil, cdr returns nil.

Also, quoting from <https://www.gnu.org/software/emacs/manual/html_node/elisp/Li...>:

> Function: car cons-cell ... As a special case, if cons-cell is nil, this function returns nil. Therefore, any list is a valid argument. An error is signaled if the argument is not a cons cell or nil.

> Function: cdr cons-cell ... As a special case, if cons-cell is nil, this function returns nil; therefore, any list is a valid argument. An error is signaled if the argument is not a cons cell or nil.
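The two behaviours can be modelled with a small Python sketch (a loose analogy only; None stands in for nil and Python lists for conses):

```python
NIL = None  # stand-in for Lisp's nil / the empty list

def cl_car(x):
    """Common Lisp / Emacs Lisp style: (car nil) evaluates to nil."""
    if x is NIL or len(x) == 0:
        return NIL
    return x[0]

def mccarthy_car(x):
    """Original-Lisp style: car is undefined (an error) for nil."""
    if x is NIL or len(x) == 0:
        raise TypeError("car: argument is not a cons")
    return x[0]

print(cl_car(NIL))     # None (i.e. nil)
print(cl_car([1, 2]))  # 1
# mccarthy_car(NIL)    # would signal an error, as in McCarthy's Lisp
```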

susam (1 year ago)

I was curious what it is like in Maclisp. Here is a complete telnet session with Lars Brinkhoff's public ITS:

  $ telnet its.pdp10.se 10003
  Trying 88.99.191.74...
  Connected to pdp10.se.
  Escape character is '^]'.


  Connected to the KA-10 simulator MTY device, line 0

  ^Z
  TT ITS.1652. DDT.1548.
  TTY 21
  3. Lusers, Fair Share = 99%
  Welcome to ITS!

  For brief information, type ?
  For a list of colon commands, type :? and press Enter.
  For the full info system, type :INFO and Enter.

  Happy hacking!
  :LOGIN SUSAM
  TT: SUSAM; SUSAM MAIL - NON-EXISTENT DIRECTORY
  :LISP

  LISP 2156
  Alloc? n


  *
  (status lispversion)
  /2156
  (car nil)
  NIL
  (cdr nil)
  NIL
  ^Z
  50107)   XCT 11   :LOGOUT

  TT ITS 1652  Console 21 Free. 19:55:07
  ^]
  telnet> ^D Connection closed.
  $

dokyun (1 year ago)

I recall reading that in early versions of Maclisp, taking the CAR or CDR of NIL worked differently: taking its CAR would signal an error, as you would expect, but taking its CDR would return the symbol plist of NIL, since internally the operation of CDR on the location of a symbol would access its plist, and that's how it was commonly done before there was a specific form for it (it actually still worked that way in Lisp Machine Lisp, provided you took the CDR of the locative of a symbol).

Apparently the behaviour of the CAR and CDR of NIL being NIL was from Interlisp, and it wasn't until the designers of Maclisp and Interlisp met to exchange ideas that they decided to adopt that behaviour (it was also ostensibly one of the very few things they actually ended up agreeing on). The reason they chose it was because they figured operations like CADR and such would be more correct if they simply returned NIL if that part of the list didn't exist rather than returning an error, otherwise you had to check each cons of the list every time. (If somebody can find the source for this, please link it!)

pfdietz (1 year ago)

But of course cadr still has to check each access, to see if it's of type (or cons null). So I don't see what was saved.

kazinator (1 year ago)

What's saved is that your code just calls that function, rather than open-coding that check.

Suppose you wrote this code in more than two or three places:

  (if (and (consp x) (consp (cdr x)))
      (car (cdr x)))
you might define a function for that. Since there is cadr, you don't have to.

Also, that function may be more efficient, especially if our compiler doesn't have good CSE. Even if x is just a local variable, there is the issue that (cdr x) is called twice. A clever compiler will recognize that the value of x has not changed, and generate only one access to the cdr.

The function can be coded to do that even in the absence of such a compiler.

(That is realistic; in the early lifecycle of a language, the quality of library functions can easily outpace the quality of compiler code generation, because the library writers use efficient coding tricks, and perhaps even drop into a lower level language where beneficial.)

If x itself is a complex expression:

  (if (and (consp (complex-expr y)) (consp (cdr (complex-expr y))))
      (car (cdr (complex-expr y))))
we will likely code that as:

  (let ((x (complex-expr y)))
    ...)
The function call gives us all that for free: (cadr (complex-expr y)). The argument expression is evaluated once, and bound to the formal parameter that the function refers to, and the function body can do manual CSE not to access the cdr twice.
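The point can be sketched in Python (a hypothetical helper; the discussion above is about Lisp's built-in cadr): the checks live in one function, and a complex argument expression is evaluated exactly once when it is passed in.

```python
def cadr(x):
    """Second element of a list-like value, or None (nil) if the
    structure isn't deep enough; both checks are done here, once."""
    if not isinstance(x, list) or len(x) < 2:
        return None
    return x[1]

# Callers no longer open-code the two cons checks:
print(cadr([1, 2, 3]))  # 2
print(cadr([1]))        # None
print(cadr(None))       # None
```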

dkarl (1 year ago)

The use of car and cdr is such a surprisingly concrete implementation detail in the birth of a language that was designed to be mathematical. The most basic and famous operators of the "List Processor" were created to operate not on lists but on conses, an element in a particular machine representation that Lisp uses to build data structures! Not only are conses not always interpreted as lists, but a very, very important list, the base case for recursive functions on lists, is not represented by a cons.

Sixty years later, most Lisp programs are still full of operations on conses. A more accurate name for the language would be "Cons Processor!" It's a reminder that Lisp was born in an era when a language and its implementation had to fit hand in glove. I think that makes the achievement of grounding a computer language in mathematical logic all the more remarkable.

fuzztester (1 year ago)

maybe related to the need to conserve CPU registers in machines of the time?

https://en.m.wikipedia.org/wiki/CAR_and_CDR

In any case, astute observation

er, ASTute ;)

lisper (1 year ago)

> Modern Lisps now evaluate (car nil) and (cdr nil) to nil.

Scheme doesn't. Taking the CAR or CDR of nil is an error.

susam (1 year ago)

Does Scheme even have NIL in the sense that other Lisps like CL or Elisp have? I mean in Common Lisp, we have:

  CL-USER> (symbolp nil)
  T
  CL-USER> (atom nil)
  T
  CL-USER> (listp nil)
  T
Similar results in Emacs Lisp. But in MIT Scheme, we get:

  1 ]=> nil

  ;Unbound variable: nil
Of course, we can use '() or (define nil '()) to illustrate your point. For example:

  1 ]=> (car ())

  ;The object (), passed as the first argument to car, is not the correct type.
But when I said NIL earlier, I really meant the symbol NIL that evaluates to NIL and is both a LIST and ATOM. But otherwise, yes, I understand your point and agree with it.

lisper (1 year ago)

> Does Scheme even have NIL in the sense that other Lisps like CL or Elisp have?

No. It has an empty list, which is a singleton atomic value whose type is not shared with any other object, and it has a boolean false value, which is distinct from the empty list. A user can create a symbol named NIL, but that symbol has no characteristics that distinguish it from any other symbol. You can, of course, bind NIL to either the empty list or boolean false (or any other value) but it can only have one value at a time (per thread).

dokyun (1 year ago)

I don't believe so, standardly. Guile Scheme added the value `#nil', which is equivalent to NIL and distinct from #f and the empty list, but this was done in order to support Emacs Lisp.

pmarreck (1 year ago)

I'm not a LISPer but this just seems more correct to me, since stricter is usually more correct.

Ruby (not a Lisp, but bear with me) started to do this more correctly IMHO: nil began throwing errors if you tried to do things with it, BUT it would still be equivalent to false in boolean checks.

lisper (1 year ago)

It depends on what you are trying to optimize for. There is a benefit to punning the empty list and boolean false. It lets you shorten (not (null x)) to just x, and that is a common enough idiom that it actually makes a difference in real code. And there is a benefit to being able to say or type "nil" instead of "the empty list" because "nil" is shorter. But yeah, for modern production code, I agree that stricter is better, all else being equal.
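Python happens to make the same pun, which gives a feel for the convenience being described (a loose analogy, not Lisp semantics): an empty list is simply falsy, so the explicit emptiness test can be dropped.

```python
xs = []

# The explicit style, analogous to (not (null x)):
if not len(xs) == 0:
    print("non-empty")

# The punned style, analogous to testing x directly in Common Lisp:
if xs:
    print("non-empty")
else:
    print("empty")  # this branch runs for []
```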

anthk (1 year ago)

Elisp and CL do.

sph (1 year ago)

Sadly this is not the case with Scheme and it makes for very unergonomic code, especially for a newbie like me.

Which is a shame, because I prefer (Guile) Scheme to Common Lisp.

pfdietz (1 year ago)

I'm very tied to Common Lisp, but I'm perfectly fine with the idea of a lisp in which car and cdr would be undefined on nil. Also, I'd be fine with a lisp in which () is not a symbol. I don't think these features of Common Lisp are essential or all that valuable.

sph (1 year ago)

They are not essential, but they make code that operates on lists more compact and pleasant to write.

In Scheme my code is littered with

  (if (null? lst)
      ;; handle empty case here
      ...)
Simply because otherwise car throws an error. This whole section is often unnecessary in CL.

guenthert (1 year ago)

Now, which is more pleasant to read (arguably the more important question for all but the most primitive of applications)?

BoiledCabbage (1 year ago)

> Sadly this is not the case with Scheme and it makes for very unergonomic code,

How so? If car of nil returns nil, then how does a caller distinguish between a value of nil and a container/list containing nil?

The only way is to check whether it's a cons pair or not. So if you have to check whether it's a cons pair, then you're doing the same thing as in Scheme, right?

I may be missing something, but isn't it effectively the same amount of work? Potentially you need to check for nil and also check whether it's a pair.

kazinator (1 year ago)

> how does a caller distinguish between a value of nil and a container/list containing nil

Very easily; but the point is that it's very often easy to design things so that the caller doesn't have to care.

For instance, lookup in an associative list can just be (cdr (assoc key alist)).

If the key is not found, assoc returns nil, and so cdr returns nil.

Right, so when we use this shortcut, we have an ambiguity: does the list actually have that key, but associated with the value nil? Or does it not have the key.

Believe it or not, we can design the data representation very easily such that we don't care about the difference between these two cases; we just say we don't have nil as a value; a key with a value nil is as good as a missing key.

This situation is very often acceptable. Because, in fact, data structures are very often heavily constrained in what data types they contain. Whenever we assert that, say, a dictionary has values that are strings, there we have it: values may not be nil, because nil is not a string. And so the ambiguity is gone.

A nice situation occurs when keys are associated with lists of values. A key may exist, but be associated with an empty list (which is nil!). Or it may not exist. We can set things up so that we don't care about distinguishing these two. If key K doesn't exist then K is not associated with a list of items, which is practically the same as being associated with an empty list of items. If we split hairs, it isn't, but in a practical application things can be arranged so it doesn't matter.
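A Python sketch of the alist idiom described above (function names mirror the Lisp ones; the list-of-pairs representation is illustrative): a missing key and a key whose value is the empty list both come back falsy, and callers are arranged so that this is the only thing they ever check.

```python
def assoc(key, alist):
    """Return the first (key, value) pair whose key matches, else None."""
    for pair in alist:
        if pair[0] == key:
            return pair
    return None

def lookup(key, alist):
    """Model of (cdr (assoc key alist)): falsy when the key is absent."""
    pair = assoc(key, alist)
    return None if pair is None else pair[1]

friends = [("alice", ["bob", "carol"]), ("dave", [])]
# In Lisp the empty list IS nil, so the next two cases are literally
# identical; in this Python model both are merely falsy, which is all
# a well-arranged caller ever tests:
print(lookup("alice", friends))       # ['bob', 'carol']
print(bool(lookup("dave", friends)))  # False (empty list of friends)
print(bool(lookup("erin", friends)))  # False (no entry at all)
```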

susam (1 year ago)

> How so? If car of nil returns nil, then how does a caller distinguish between a value of nil and a container/list containing nil?

How about this?

  CL-USER> (null nil)
  T
  CL-USER> (null '(nil))
  NIL
  CL-USER>

pmarreck (1 year ago)

is there a term to describe the language design choice (reminds me of SQL, btw, where it is equally bad IMHO) where doing things to nil just returns nil without erroring? I want to call it "bleeding nils/NULLs" if there isn't another term yet.

As stated, I think this design choice is terrible, especially if nil isn't equivalent to false in boolean comparisons (as it is in Ruby and Elixir- with Elixir actually providing two types of boolean operators with slightly different but significant behavior; "and" will only take pure booleans while "&&" will equate nil with false). It might mean cleaner-written code upfront but it's going to result in massively-harder-to-debug code because the actual error (a mishandled nil result) might only create a visible problem many stack levels away in some completely different part of the code.

michaelcampbell (1 year ago)

> where doing things to nil just returns nil without erroring

Just call it a Result::Failure monad, say you meant to do that, and confuse legions of programmers for decades.

pmarreck (1 year ago)

LOL. I mean, that's in essence what it's trying to do, right?

lisper (1 year ago)

There really should be two different kinds of cons cells, one for "proper" linked lists and another for general-purpose consing. The difference is that the cdr of the first kind of cons cell (I'll call it a PL-cons) can only be NIL or another PL-cons, not anything else. This would eliminate vast categories of bugs. It would also make the predicate for determining whether something is a proper list run in constant time rather than O(n). (There would still be edge cases with circular lists, but those are much less common than non-proper lists.)
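A rough Python sketch of the idea (class and function names are made up for illustration): the constructor enforces that the cdr of a PL-cons is nil (None) or another PL-cons, so properness becomes a type invariant and the proper-list test is O(1).

```python
class PLCons:
    """A cons cell restricted to proper lists: cdr must be None or PLCons."""
    def __init__(self, car, cdr):
        if not (cdr is None or isinstance(cdr, PLCons)):
            raise TypeError("cdr of a PL-cons must be nil or another PL-cons")
        self.car = car
        self.cdr = cdr

def proper_list_p(x):
    """Constant time: the invariant makes walking the list unnecessary."""
    return x is None or isinstance(x, PLCons)

lst = PLCons(1, PLCons(2, None))    # the proper list (1 2)
print(proper_list_p(lst))           # True
print(proper_list_p("not a list"))  # False
# PLCons(1, 2) would raise TypeError: 2 is not a valid PL-cons cdr
```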

smrq (1 year ago)

I certainly know the Lisp information in this article already, but it's still a fun read. Hofstadter just has a charming way with words.

I found this bit extra amusing:

>It would be nice as well as useful if we could create an inverse operation to readers-digest-condensed-version called rejoyce that, given any two words, would create a novel beginning and ending with them, respectively - and such that James Joyce would have written it (had he thought of it). Thus execution of the Lisp statement (rejoyce 'Stately 'Yes) would result in the Lisp genie generating from scratch the entire novel Ulysses. Writing this function is left as an exercise for the reader.

It took a while, but we got there. I don't think 2024's AI is quite what he had in mind in 1983, but you have to admit that reproducing text given a little seeding is a task that quite suits the AI of today.

taeric (1 year ago)

I do think LISP remains the major language that can encompass the strange loop idea he explored in his work. I know LISP is not the only homoiconic language, but it is the biggest that people know how to use where the "eval" function doesn't take in a string that has to be parsed.

I hate that people are convinced LISP == functional programming, writ large. Not that I dislike functional programming, but the symbolic nature of it is far more interesting to me. And it amuses me to no end that I can easily make a section of code that is driven by (go tag) sections, such that I can get GOTO programming in it very easily.

nine_k (1 year ago)

Another (properly functional) homoiconic language that enjoyed mainstream adoption briefly in the '00s is XSLT. Its metaprogramming features were rather widely used; that is, producing an XSLT from XSLT and maybe some more XML, instead of hand-coding something repetitive, was rather normal.

The syntax was a bigger problem than Lisp's syntax, though.

It's not easy to produce a language with a syntax that's good as daily use syntax, but is also not unwieldy as an AST. Lisp is one of the few relatively successful examples.

pmarreck (1 year ago)

I don't know how many other languages use it but I've long admired Elixir's approach to giving devs access to the AST using its basic types in order to write macros:

https://hexdocs.pm/elixir/macros.html

It is certainly possible to implement this sort of thing in other languages, I think, depending on the compilation or preprocessing setup

medo-bear (1 year ago)

Possible doesn't mean "requires the same amount of effort".

pmarreck (1 year ago)

That's fair. I think it's a big win, though. Macros, when the situation calls for it, are amazing. For example, I believe most of the UTF8 handling code in Elixir was done via macros which brought down the amount of code that had to be maintained by quite a bit.

apex_sloth (1 year ago)

Thanks for this little flashback to when I had to write XSLT for Apache Cocoon as my student job

AnimalMuppet (1 year ago)

> The syntax was a bigger problem than Lisp's syntax, though.

Yeah. XML and S expressions are pretty close to functionally equivalent. But once you've seen S expressions, XML is disgustingly clumsy.

chubot (1 year ago)

They have a different model -- one is better for documents, and one is better for programs/data

XML and HTML are attributed text, while S-expressions are more like a homogeneous tree

If you have more text than metadata, then XML and HTML are more natural than S-expressions

e.g. The closing </p> may seem redundant, until you have big paragraphs of free form text, which you generally don't in programs

nine_k (1 year ago)

SGML was intended for sparse markup in mostly plaintext files. From it grew HTML that is markup-heavy, and XML which is often 100% markup. What made sense for rare markup nodes became... suboptimal when applied in a very different role.

jll29 (1 year ago)

1. GML => SGML => XML

2. rm *

3. JSON

4. rm -rf /

pfdietz (1 year ago)

"Any data can be turned into Big Data by encoding it in XML."

fuzztester (1 year ago)

Wow.

Also:

XML: eXtremely Murky Language

or Mindblowing

pjmlp (1 year ago)

For a while that is how I made my website dynamic: by writing everything in XML and linking XSLT stylesheets. However, the future ended up not being XHTML, and eventually I rewrote those stylesheets in PHP.

It doesn't win any prize, nor is it content worthy of an "I rewrote X in Y" blog post, but it does the job.

throwaway19972 (1 year ago)

Not to mention specifically with Scheme and continuation-oriented programming, the line between functional and non-functional programming becomes so blurry as to become nearly meaningless.

fuzztester (1 year ago)

The definition of functional programming is itself quite blurry, says Chris Lattner (of Swift, LLVM, Mojo), in this talk I posted here recently:

https://news.ycombinator.com/item?id=41822811

medo-bear (1 year ago)

Even the definition of a Lisp is blurry when we zoom in to find the separation boundary

brucehoult (1 year ago)

Lambda: the ultimate GOTO

bbor (1 year ago)

I love and relate to any impassioned plea on SWE esoterica, so this seems like as good of a place as any to ask: What, in practice, is this deep level of "homoiconic" or "symbolic" support used for that Python's functools (https://docs.python.org/3/library/functools.html) doesn't do well? As someone building a completely LISPless symbolic AGI (sacrilege, I know), I've always struggled with this and would love any pointers the experts here have. Is it something to do with Monads? I never did understand Monads...

To make this comment more actionable, my understanding of Python's homoiconic functionality comes down to these methods, more-or-less:

1. Functions that apply other functions to iterables, e.g. filter(), map(), and reduce(). AKA the bread-n-butter of modern day JavaScript.

2. Functions that wrap a group of functions and routes calls accordingly, e.g. @singledispatch.

3. Functions that provide more general control flow or performance conveniences for other functions, e.g. @cache and partial().

4. Functions that arbitrarily wrap other functions, namely wraps().

Certainly not every language has all of these defined in a standard library, but none of them seem that challenging to implement by hand when necessary -- in other words, they basically come down to conveniences for calling functions in weird ways. Certainly none of these live up to the glorious descriptions of homoiconic languages in essays like this one, where "self-introspection" is treated as a first-class concern.

What would a programmer in 2024 get from LISP that isn't implemented above?

taeric1 year ago

I'm basically a shill for my one decent blog post from a while back. :D

https://taeric.github.io/CodeAsData.html

The key for me really is in the signature for "eval." In python, as an example, eval takes in a string. So, to work with the expression, it has to fully parse it with all of the danger that takes in. For lisp, eval takes in a form. Still dangerous to evaluate random code, mind. But you can walk the code without evaluating it.
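A rough Python sketch of that contrast (my own illustration, not from the linked post): `eval` has to execute the string, while `ast.parse` hands back a tree you can walk without running anything — the closest Python gets to holding a Lisp form as data.

```python
import ast

source = "(1 + 2) * x"

# eval(source) would need a binding for x and would execute the code.
# ast.parse just gives us a data structure we can inspect without running it.
tree = ast.parse(source, mode="eval")

# Collect every operator and variable name purely by walking the tree.
ops = [type(node.op).__name__ for node in ast.walk(tree)
       if isinstance(node, ast.BinOp)]
names = [node.id for node in ast.walk(tree) if isinstance(node, ast.Name)]

print(ops)    # ['Mult', 'Add']
print(names)  # ['x']
```

The Lisp version skips the parsing step entirely: a quoted form already *is* the tree.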

bbor1 year ago

HackerNews never disappoints. Thanks for taking the time to share, that was exactly what I was looking for! Would endorse this link for any lurkers.

The LISP (elisp?) syntax itself gives me a headache to parse so I think I'll stay away for now, but I'll definitely be thinking about how to build similar functionality into my high level application code -- self modification is naturally a big part of any decent AGI project. At the risk of speaking the obvious, the last sentence was what drove it home for me:

    It is not just some opaque string that gets to enjoy all of the benefits of your language. It is a first class list of elements that you can inspect and have fun with. 
I'm already working with LLM-centric "grammars" representing sets of standpoint-specific functions ("pipelines"), but so far I've only been thinking about how to construct, modify, and employ them. Intelligently composing them feels like quite an interesting rabbit hole... Especially since they mostly consist of prose in minimally-symbolic wrappers, which are probably a lot easier for an engineer to mentally model--human or otherwise. Reminds me of the words of wonderful diehard LISP-a-holic Marvin Minsky:

  The future work of mind design will not be much like what we do today. ...what we know as programming will change its character entirely-to an activity that I envision to be more like sculpturing.
  To program today, we must describe things very carefully because nowhere is there any margin for error. But once we have modules that know how to learn, we won’t have to specify nearly so much-and we’ll program on a grander scale, relying on learning to fill in details.
In other words: What if the problem with Lisp this whole time really was the parentheses? ;)

source is Logical Versus Analogical or Symbolic Versus Connectionist or Neat Versus Scruffy: https://onlinelibrary.wiley.com/doi/full/10.1609/aimag.v12i2...

taeric1 year ago

Glad you liked the post. I didn't do any effort to make the elisp readable, so please don't let that fully put you off the topic! :D

I keep meaning to expand on the idea. I keep not doing so. I have higher hopes that I can get back to the rubik's cube code. Even there, I have a hard time getting going.

medo-bear1 year ago

> In other words: What if the problem with Lisp this whole time really was the parentheses? ;)

I am yet to find a syntax style more ergonomic than s-expressions. Once you appreciate the power of structural code editing your view of s-expressions is likely to change

wk_end1 year ago

The syntax of Lisp is made up of the same fundamental data types as you use when writing Lisp programs. `(+ 1 2 3)` is both a Lisp expression that evaluates to 6 and also a list containing four items, the symbol `+` and the numbers 1, 2, and 3.

In general, we can say that the Lisp language is very good at manipulating the same data types that the syntax of Lisp programs is made from. This makes it very easy to write Lisp programs that swallow up Lisp programs as raw syntax, analyze Lisp programs syntactically, and/or spit out new Lisp programs as raw syntax.
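A toy illustration of the same idea (mine, in Python rather than Lisp): model the form `(+ 1 2 3)` as a plain list, and the very same object can be inspected as data or handed to a tiny evaluator as code.

```python
# The Lisp form (+ 1 2 3) modeled as plain data: a symbol and three numbers.
form = ['+', 1, 2, 3]

# As data: just a four-item list we can inspect and rewrite.
doubled = [form[0]] + [n * 2 for n in form[1:]]  # ['+', 2, 4, 6]

# As code: a toy evaluator that dispatches on the head symbol.
def evaluate(f):
    func = {'+': sum}[f[0]]
    return func(f[1:])

print(evaluate(form))     # 6
print(evaluate(doubled))  # 12
```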

kazinator1 year ago

Some of it is because many people's only contact with Lisp is via academia, and the academics who teach it actually don't care about developing anything with Lisp. They use it as a vehicle for concepts, and those concepts typically revolve around functional recursion.

The Scheme language and its surrounding culture are also culprits. Though Scheme isn't purely functional, it emphasizes pure programming more than its Lisp family predecessors. The basic language provides tail-recursive constructs instead of iterative ones, and requires implementations to optimize tail calls.

anthk1 year ago

>Emacs defalias

On Common Lisp too, by defining defalias as a macro:

https://stackoverflow.com/questions/24252539/defining-aliase...

eigenhombre1 year ago

I loved Hofstadter's writing on Lisp in Metamagical Themas and adapted the code in the last article of the series to Clojure for a study group at work, written up here[1].

[1] http://johnj.com/posts/oodles/

edit: clarification

silcoon1 year ago

Nice, I wonder if there was a translation on a modern Lisp.

paulddraper1 year ago

> As you might expect, the value of the atom 1729 is the integer 1729, and this is permanent. (I am distinguishing here between the atom whose print name or pname is the four-digit string 1729, and the eternal Platonic essence that happens to be the sum of two cubes in two different ways - i.e., the number 1729.)

He is? What is the distinction he is making?

This writing style is... interesting.

shrubble1 year ago

The use of 1729 would be known to people who know about Ramanujan: https://en.wikipedia.org/wiki/1729_(number)

Y_Y1 year ago

Hey, that's the code to my safe!

Y_Y1 year ago

An atom is something defined in the semantics of lisp and a part of the program, it will be represented as bits in the computer memory and as pixels on the screen. A number is a very general concept with many representations, on of which is as a lisp atom, and another could be a pile of 1729 kiwis. The kiwis and the code both represent the number, but they don't represent each other.

WolfeReader1 year ago

The lisp atom 1729 is like a "constant" in a programming language, representing a particular arrangement of bits in lisp systems. The integer 1729 is a number that, in a mathematical sense, has always existed and will always exist regardless of computer systems.

While some atoms can be assigned values, the atom 1729 cannot be assigned any value other than the number 1729.

susam1 year ago

> In a testament to the timelessness of Lisp, you can still run all the examples below in emacs if you install these aliases:

> (defalias 'plus #'+)

> (defalias 'quotient #'/)

> (defalias 'times #'*)

> (defalias 'difference #'-)

Looks like we also need a defmacro for def that is used much further in the article:

> > (def rac (lambda (lyst) (car (reverse lyst))))

I mean the above example fails in Emacs:

  ELISP> (def rac (lambda (lyst) (car (reverse lyst))))
  *** Eval error ***  Symbol’s function definition is void: def
If we want the above example to work, we need to define def like this:

  ELISP> (defmacro def (name lambda-def) `(defalias ',name ,lambda-def))
  def
Now the previous example, as presented in the article, works fine:

  ELISP> (def rac (lambda (lyst) (car (reverse lyst))))
  rac
  ELISP> (rac '(your brains)) 
  brains
AnimalMuppet1 year ago

> Every computer language has arbitrary features, and most languages are in fact overloaded with them. A few, however, such as Lisp and Algol, are built around a kernel that seems as natural as a branch of mathematics.

Algol? The kernel of Algol seems as natural as a branch of mathematics? Can anyone who has used Algol give their opinion of this statement?

gavindean901 year ago

From what I’ve studied, Algol wasn’t designed for typical software development—its main purpose was to give computer scientists a way to describe algorithms with a level of rigor that mirrors mathematical notation.

andyjohnson01 year ago

I did some Algol programming back in the late 80s - when it had mostly been obsoleted by Pascal, Modula, and even C for what we called "structured programming" back then.

I remember it as a likeable, economical, expressive language, without significant warts, and which had clearly been influential by being ahead of its time.

So my guess is that Hofstadter was just referring to its practical elegance - rather than the more theoretical elegance of Lisp.

nxobject1 year ago

Out of curiosity: which dialect on Algol, and on what platform?

andyjohnson01 year ago

I'm not sure, but possibly Algol 68. It was on an IBM mainframe running VM/CMS - possibly a 3090.

Long time ago...

aidenn01 year ago

Hard to say without knowing which version of Algol he is referring to. Algol 68 was very different from Algol 58.

Algol 60 was the first language with lexical scope, while Algol 68 was a kitchen-sink language that (positively) influenced Python and (negatively) influenced Pascal.

earthicus1 year ago

It was discovered that the procedure mechanism of Algol 60 was effectively equivalent to the lambda calulus. This insight was written out in a famous paper by Peter Landin, "Correspondence between ALGOL 60 and Church's Lambda-notation: part I"

https://dl.acm.org/doi/10.1145/363744.363749

retrac1 year ago

C is basically Algol with curly braces and pointers. The sentiment expressed there is probably equally applicable to C, or maybe Pascal. Those are often held up today as a minimal example in contrast to Lisp. There is a sort of sparse, warty elegance to the family. Blocks, arrays, if/then, assignment, while loops. What more could you need?

AnimalMuppet1 year ago

I've used both C and Pascal. The simplicity of C comes through to me (less so Pascal - the verbosity gets in the way). I never thought of it as "as natural as a branch of mathematics", though.

I mean... I guess you could think of it as having its own set of self-consistent axioms, and from them you can build things. It's a lot larger set of axioms than most branches of mathematics, though.

I guess, if Hofstadter meant the same level of naturalness, well, yes, C did feel pretty natural to me, so... maybe?

analog311 year ago

I read that article when it came out, as my parents subscribed to Scientific American. Even though I had learned BASIC and Pascal, the concepts in the article were just way over my head. Also, I had no access (that I was aware of at least) to a machine where I could try my hand at Lisp programming. Alas, I wish I had taken it more seriously.

At least Hofstadter was successful at getting me interested in math beyond high school.

TruffleLabs1 year ago

I took M490 “Problem Seminar” (a math class) in 1995 with Dr. Hofstadter - we studied triangles, and the definition of a triangle’s center.

You would think that there is a limited set of “triangle centers” but he showed us (and he had us discover and draw them out using The Geometer's Sketchpad) dozens of ways to find triangle centers and he had notes on hundreds more definitions of triangle centers.

His approach to teaching was fun and made us want to take on challenging problems. :)

lproven1 year ago

Me too. I admire the theory of Lisp, but man, all the Lisp folks going "but don't you get it, the absence of syntax IS the syntax!" don't half get tiring.

For some of us, we can just about handle the simple algebraic infix stuff, and we'll never make that leap to "my god, it's full of CARs".

https://xkcd.com/224/

anthk1 year ago

If you have a look on some Emacs code (and modules such as Mastodon.el), you'll see than the syntax is not that scary, as Lisp makes it trivial to modularize code into smaller functions.

lproven1 year ago

I have spent years writing about and studying Lisp, including buying several books.

This is categorically not the case.

Let me paraphrase my own post from Lobsters a year or two back:

I hypothesise that, genuinely, a large fraction of humanity simply lacks the mental flexibility to adapt to prefix or postfix notation.

Algebraic notation is, among ordinary people, almost a metonym for “complicated and hard to understand”. I suspect that most numerate people could not explain BODMAS precedence and don’t understand what subexpressions in brackets mean.

I have personally taught people to program who did not and could not understand the conceptual relationship between a fraction and a percentage. This abstraction was too hard for them.

Ordinary line-numbered BASIC is, I suspect, somewhere around the upper bound of cognitive complexity for billions of humans.

One reason for the success of languages with C syntax is that it’s the tersest form of algebraic notation that many people smart enough to program at all can handle.

Reorder the operators and you’ve just blown the minds of the majority of your target audience. Game over.

I admire Lisp hugely, but I am not a Lisp proponent.

I find it fascinating and the claims about it intrigue me, but to me, personally, I find it almost totally unreadable.

Those people I am talking about? I say this because I am one.

I myself am very firmly in the camp of those for whom simple algebraic infix notation is all I can follow. Personally, my favourite programming language is still BASIC.

analog311 year ago

For me, the issue wasn't cognitive, but simply lack of access. The two languages that ran on my Sanyo MBC-550 were BASIC and Turbo Pascal.

Outside of expressions, those languages are essentially prefix in that the operator comes before the list of arguments.

eska1 year ago

> After this "def wish" has been carried out, the rac function is as well understood by the genie as is car.

Sometimes I wonder what non-programmers think about us when they hear us talk..

timonoko1 year ago

Maclisp goodness:

  (compress (reverse (explode x)))
Elisp much improved:

  (defun explode (x)
    (if (symbolp x) (setq x (symbol-name x)))
    (string-to-list x))
  (defun compress (x) (concat x))
timonoko1 year ago

I was wrong: It was "implode" in Maclisp.

  (compress (reverse (explode 'ABC)))
  ;COMPRESS UNDEFINED FUNCTION OBJECT

  (implode (reverse (explode 'ABC)))
  CBA
The point being that I never learn any fancy string-processing commands. I just implement explode and compress.
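For comparison, the same pair of helpers is a couple of lines in Python (my analogues, keeping the Maclisp names):

```python
# Turn a value into the list of its printed characters, and back again.
def explode(x):
    return list(str(x))

def implode(chars):
    return "".join(chars)

print(implode(reversed(explode("ABC"))))  # CBA
```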
g192051 year ago

this is how explode behaves on a lisp machine:

    (defun explode (x)
      (mapcar (lambda (c)
                (intern (char-to-string c)))
              (string-to-list (prin1-to-string x))))
turning character into symbol seems natural, because then you are reducing your needed function space even more. I'm surprised the original operated on prin1 output, not sure what the logic behind that is. on a lisp machine (zl:explode "foo") gives me '(|"| |f| |o| |o| |"|)
UniverseHacker1 year ago

(upgrade (mail (change (trash (fix (break (use (buy it))))))))

fuzztester1 year ago

he he. good one.

but you may have misunderstood what I meant.

I wasn't criticizing you.

it was just a joke related to that scheme book.

lopatin1 year ago

Any Shen people in the house?

corinroyal1 year ago

Admirer, not user. So ambitious and gorgeous. Hosted on Common Lisp with full integration, so useful now. I hope more people check it out. The new Shen book is awesome.

jll291 year ago

links?

corinroyal1 year ago

Here's a link to their website with the book: https://shenlanguage.org/TBoS/tbos.html

rsktaker1 year ago

Interesting article, I enjoyed following along - but I do hate the parentheses lol

bonaldi1 year ago

What dialect is he using that has “plus” vs “+” and so on?

waffletower1 year ago

I find this article to be quaint -- I remember reading it decades ago and feeling more receptive to its perspective. Ironically, I prefer using Clojure (though some here challenge its status as a Lisp lol) to interface with Large Language Models rather than Python. Clojure in particular is much better suited, for some reasons that Hofstadter details, and if you can interact with an LLM over a wire, you are not beholden to Python. But what we use to interface to these massive digital minds we are building, including the Bayesian sampling mathematics we use to plumb them, may have its elegance, but it is orthogonal to the nearly ineffable chaos of these deeply interconnected neural networks -- and it is in this chaotic interconnectedness where artificial intelligence is actually engendered.

iLemming1 year ago

> Clojure in particular is much better suited

Clojure in general is far better suited for manipulating data than anything else (in my personal experience). It is so lovely to send a request, get some data, and then interactively go through that data - sorting, grouping, dicing, slicing, partitioning, tranforming, etc.

The other way around is also true - for when you need to generate a massive amount of randomized data.

lenkite1 year ago

Clojure doesn't have a standard, well-maintained dataframe library - so it is not suitable for any medium to large data science.

iLemming1 year ago

I don't do "true" data science, so my voice of "expertise" in the matter is limited. This is the extent of what I've heard. In my opinion, neither of the clauses in your statement is true.

Clojure is very well suited for data science of all shapes and sizes. There's a great meetup lead by Daniel Slutsky where they regularly discuss this topic, and there's #data-science channel in Clojurians Slack where they regularly post interesting findings. As for the libraries, anything used in Java/Javascript can be directly used. Besides, there is TMD, https://github.com/techascent/tech.ml.dataset - it's a well-regarded lib and provides solid functionality for data manipulation.

troupe1 year ago

> when you need to generate a massive amount of randomized data.

Even faster than Clojure: Open VIM for a VS Code user and ask them to exit.

iLemming1 year ago

There's no such thing as a "VS Code user", VS Code is the one that uses you, not the other way around.

btw. this isn't some kind of an FP joke, there's no 'fun' in it, only sad truth.

dunefox1 year ago

> I hope you enjoyed Hofstadter's idiosyncratic tour of Lisp. You can find more like this re-printed in his book Metamagical Themas.

This seems like an interesting book.

ceautery1 year ago

It was one of my favorites back in the 1980s. It was a followup to Gödel Escher Bach, written in much the same style.

dahart1 year ago

> Why is most AI work done in Lisp?

That’s changed, of course, but it remained true for at least another 15 or 20 years after this article was written and then changed rather quickly, perhaps cemented with deep neural networks and GPUs.

Other than running the emacs ecosystem, what else is Lisp being used for commonly these days?

iLemming1 year ago

Some purist won't consider Clojure a "true" Lisp, but it's a Lisp dialect.

> what else is Lisp being used for commonly these days?

Anything that runs on Clojure - Cisco has their cybersec platform and tooling running on it; Walmart their receipt system; Apple - their payments (or something, not sure); Nubank's entire business runs on it; CircleCI; Embraer - I know uses Clojure for pipelines, not sure about CL, in general Common Lisp I think still quite used for aircraft design and CAD modeling; Grammarly - use both Common Lisp and Clojure; Many startups use Clojure and Clojurescript.

Fennel - a Clojure-like language that compiles to Lua - can handle anything Lua-based: people build games with it, and use it to configure Hammerspoon, AwesomeWM, MPV, Wez terminal and the like, even Neovim - it's almost weird how we're circling back - decades of arguing Emacs vs. Vim, and now getting Vim to embrace Lisp.

tombert1 year ago

When I was there, Apple used Clojure for a lot of stuff involving the indexing of iTunes/Apple Music. I used it for some telemetry stuff on top of the indexer as well. Not sure what other teams used it for.

SSLy1 year ago

Google Flights was built on CL, no?

sourcepluck1 year ago

I think personally that Coalton and the stuff its built on is crazy cool. Coalton is a little library you add to your Lisp, but, to quote the third link here: "In terms of its type system, Coalton’s closest cousin is Haskell." So Lisp's dynamism with all sorts of advanced typing.

QVM, a Quantum Virtual Machine https://github.com/quil-lang/qvm

Quilc, an "advanced optimizing compiler" for Quil https://github.com/quil-lang/quilc

Coalton, "a statically typed functional programming language built with Common Lisp." https://coalton-lang.github.io/20211010-introducing-coalton/

nextos1 year ago

> Why is most AI work done in Lisp?

Yann LeCun developed Lush, which is a Lisp for neural networks, during the early days of deep architectures. See https://yann.lecun.com/ex/downloads/index.html and https://lush.sourceforge.net. Things moved to Python after a brief period when Lua was also a serious contender. LeCun is not pleased with Python. I can't find his comments now, but he thinks Python is not an ideal solution. Hard to argue with that, as it's mostly a thin wrapper over C/C++/FORTRAN that poses an obvious two-language problem.

buescher1 year ago

A friend used lush as his “secret weapon” for a while. I didn’t quite warm to it and now regret not paying attention. It’s amazing how much is packed in “batteries included.”

Apparently it didn’t make the transition to 64-bit machines well? But I haven’t really looked.

shawn_w1 year ago

It's just as easy to have thin wrappers over C/etc. number crunching libraries in Common Lisp as it is Python. And pure CL code is typically faster than pure Python (though pypy might be a different story). There's no technical reason it still couldn't be dominant in AI.

It's a shame things took the course they did with preferred languages.

mportela1 year ago

My take is that Python won by having a complete ecosystem centralizing many tools that were dispersed in different languages:

- Numpy/Scipy/Matplotlib enabled scientists to do data analysis with Pandas similar to what was available in R

- PySpark enabled big data scripts in Python instead of Scala

- PyTorch made Torch available for non-Lua users

Bit by bit, more people got used to doing data analysis and AI research in Python. Some projects were even written for Python first (e.g. Tensorflow or Keras). Eventually, Python had so many high-quality packages that it became the de facto for modern AI.

Is it the _best_ language for AI, though? I doubt. However, it is good enough for most use cases.

sourcepluck1 year ago

Hadn't seen that before, very interesting!

tmtvl1 year ago

> what [...] is Lisp being used for [...] these days?

I dunno, there's Nyxt, Google Flights, MediKanren, there's some German HPC guys doing stuff with SBCL, Kandria,... I believe there's also a HFT guy using Lisp who's here on HN. LispWorks and Franz are also still trucking, so they prolly have clientele.

There are fewer great big FLOSS Lisp projects than C or Rust, but that doesn't really tell the whole story. Unfortunately proprietary and internal projects are less visible.

tombert1 year ago

Can't speak for the entire industry obviously, but at a few jobs I've had [1] Clojure is used pretty liberally for network-heavy stuff, largely because it's JVM and core.async is pretty handy for handling concurrency.

I know a lot of people classify "Clojure" and "Lisp" in different categories, but I'm not 100% sure why.

[1] Usual disclaimer: It's not hard to find my job history, I don't hide it, but I politely ask that you don't post it here.

packetlost1 year ago

> I know a lot of people classify "Clojure" and "Lisp" in different categories, but I'm not 100% sure why

It mostly boils down to Clojure not having CONS cells. I feel like this distinction is arbitrary because the interesting aspect of Lisps is not the fact that linked-lists are the core data-structure (linked-lists mostly suck on modern hardware), but rather that the code itself is a tree of lists that enables the code to be homoiconic.

pfdietz1 year ago

I mean, you can have a tree of vectors also, so I don't see why lists are needed for homoiconicity.

packetlost1 year ago

That's mostly my point. A linked-list structure is not the interesting part. I use the "generic" reading of list above and don't mean to imply some particular implementation

ryukafalz1 year ago

Guix is a Nix-like package manager and distro that is almost entirely written in Guile Scheme: https://guix.gnu.org/

I would guess it's by far the most active Guile project.

vindarel1 year ago

Quantum computing and symbolic AI? But also web services, CAD and 3D software, trading, designing programmable chips, big data analytics…

present companies (that we know about): https://github.com/azzamsa/awesome-lisp-companies/

mepian1 year ago

>what else is Lisp being used for commonly these days?

It is being used for formal verification in the semiconductor industry by companies like AMD, Arm, Intel, and IBM: https://www.cs.utexas.edu/~moore/acl2/

casta1 year ago

The pricing engine for Google Flights (and behind many big airline websites) is written in Lisp.

knbknb1 year ago

Some computer science departments (and their MOOCs) use the Lisp dialects Racket and Scheme as teaching languages. For example, the DrRacket IDE has an innovative language-preselection feature that allows students to start out with a "Beginning Student Language".

https://www.racket-lang.org/

volltrottel1 year ago

Running hacker news

chromaton1 year ago

AutoCAD automation?

fuzztester1 year ago

Yes. AutoLisp was available from the early days of AutoCAD. I didn't use it much myself. I just helped some mechanical engineers with it in a company where I worked, in small ways, just tinkering, really. At that time I was quite junior, so I didn't really grasp the power of it, so I didn't play around with it much.

kjellsbells1 year ago

Regardless of your opinion on the utility of Lisp, this is an exemplary piece of writing. Crisp, engaging, informative.

God I miss old Scientific American. Today's SA isn't especially terrible, but old SA, like old BYTE, was reliably enlightening.

sgustard1 year ago

The title of his column and book "Metamagical Themas" is an anagram of Martin Gardner's previous column "Mathematical Games". It's clever wordplay turtles all the way down.

madcaptenor1 year ago

Other Hofstadter book titles with wordplay:

- Gödel, Escher, Bach: an Eternal Golden Braid (you have GEB/EGB, and I guarantee you he noticed those notes form a musical triad)

- Metamagical Themas (anagram of Mathematical Games)

- Le Ton beau de Marot (I don't have my copy at hand, but "ton beau" is surely a pun on "tombeau" meaning "tomb")

- The Mind's I (editor) (I = eye)

- That Mad Ache (translation of "La chamade" by Francoise Sagan; "mad ache" is an anagram of "chamade")

shrubble1 year ago

At least one of the covers of GEB specifically had artwork that shows GEB/EGB : https://en.wikipedia.org/wiki/G%C3%B6del,_Escher,_Bach

gjm111 year ago

"tombeau" literally means "tomb", but the term also sometimes means "piece written as a memorial", like Ravel's piano suite "Le Tombeau de Couperin". And yes, Hofstadter explicitly links "ton beau" with "tombeau" (he doesn't explicitly mention the "memorial" meaning, though when he mentions the literal "tombeau de Marot" he is talking specifically about the epitaph on it) and also with "tome beau", the great book of Marot's life and work.

I'd find it a cleverer bit of wordplay if "le ton beau de ..." itself didn't feel clumsy. Surely it would always be "le beau ton de ..."?

madcaptenor1 year ago

This was all somewhere in the back of my head but my copy of this book is in my parents' basement somewhere. I'll have to rescue it so I can keep it in my basement.

goldfeld1 year ago

The author of GEB is a phenomenal writer, an old-style researcher who knew his greek, and the book for me is more interesting in its commentary on literature, and psychology, approaching themes of say, Foucault.

I don't know about the work's true impact on AI or tech languages, but it's a masterpiece of criticism, analysis and penmanship.

jhbadger1 year ago

Old school SA was written assuming a basic level of scientific and mathematical background. Many people reading it were professional scientists and engineers who read it to learn about developments in other fields than their own. Current SA seems to be written at a level similar to the science coverage in newspapers, written for the hypothetical "layman" who is supposedly frightened of mathematics and anything technical. I couldn't imagine someone like Martin Gardner or Hofstadter writing in SA today.

taeric1 year ago

Agreed. It saddens me how I feel I completely slept through a golden age of magazines out there. With no real clue how I could help support that coming back.

I was happy with the section in Wireframe magazines that would show how to code some game mechanics every issue. Would love for more stuff like that.

fuzztester1 year ago

Same with the old National Geographic magazine, before it became slimmer and more ad-heavy, IIRC.

lproven1 year ago

Exactly so. I bought the final issue, because it was the last one, and I read it, and that reminded me why I didn't read National Geographic. Because it's mental chewing gum: an enjoyable flavour, without nutrition; pretty pictures, but I learned little.

fuzztester1 year ago

yes, but what I meant was that the much earlier issues were very good, with not just good pictures, but lots of interesting textual info as well, about the different geographical topics that they covered, e.g. countries, regions within countries, rivers, forests, peoples, etc.

I remember one particular issue about USA rivers which was really good, with great photos.

damn cool article.

the suwannee river was one that was covered.

https://en.m.wikipedia.org/wiki/Suwannee_River

I looked up that river in Wikipedia for the first time today.

TIL it is a blackwater river. first time I heard the term.

https://en.m.wikipedia.org/wiki/Blackwater_river

the NG issues used to come with very good maps as supplements, too, in color.

also there used to be nice color ads about good cameras, IIRC, like canon, minolta, etc, and cars like the cadillac, lincoln, etc.

gas guzzlers, of course.

a different time.

activitypea1 year ago

I remember reading GEB and being shocked that he never mentions Lisp. He _does_ wade into CompSci topics, but it's something half-hearted about how compilers are programs that read and generate programs. This really should've been integrated into a revised edition of the book.

lisper1 year ago

Huh? He mentions Lisp all over the place. Check the index.

baruchthescribe1 year ago

Nonsense.

"One of the most important and fascinating of all computer languages is LISP (standing for "List Processing"), which was invented by John McCarthy around the time Algol was invented. Subsequently, LISP has enjoyed great popularity with workers in Artificial Intelligence."

silcoon1 year ago

Give it another go! _The Anatomy of LISP_ is the first entry in the bibliography.

InDubioProRubio1 year ago

Lisp Needs Braces

paddy_m1 year ago

> Lisp needs braces

You're a troll, but I'll feed you. I adapted Peter Norvig's excellent lispy2.py [0] to read json. I call it JLisp [1].

Lispy2 is a scheme implementation, complete with macros, that executes on top of python. I made it read json, really just replacing () with [], and defining symbols as {'symbol': 'symbol_name'}. I built it because it's easier to get a webapp to emit JSON than paren lisp. I also knew that building an interpreter on top of lisp meant that I wouldn't back myself into a corner. There is incredible power in the lisp, especially the ability to transform code.

[0] https://norvig.com/lispy2.html

[1] https://github.com/paddymul/buckaroo/blob/main/tests/unit/li... #tests for JLisp
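A minimal sketch of the encoding described above (my own toy names and builtins, not the actual buckaroo implementation): () becomes [], symbols become {"symbol": name}, and evaluation walks the JSON directly.

```python
import json

def is_symbol(x):
    return isinstance(x, dict) and "symbol" in x

# A couple of toy builtins; the real JLisp carries fuller scheme semantics.
BUILTINS = {"+": lambda *args: sum(args),
            "*": lambda a, b: a * b}

def jeval(expr):
    if is_symbol(expr):
        return BUILTINS[expr["symbol"]]
    if isinstance(expr, list):
        func = jeval(expr[0])
        return func(*[jeval(e) for e in expr[1:]])
    return expr  # numbers and strings are self-evaluating

# (+ 1 (* 2 3)) encoded as JSON:
program = json.loads('[{"symbol": "+"}, 1, [{"symbol": "*"}, 2, 3]]')
print(jeval(program))  # 7
```

Because the program arrives as ordinary JSON, a web frontend can build, inspect, and rewrite it with no parser at all — the same code-as-data property the thread is about.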

NateEag1 year ago

Here, have another approach to Lisp formatting:

https://readable.sourceforge.io/

I looked into porting it to elisp a while back, but the elisp reader was missing a feature or two sweet-expressions require. I should see if that's still true...

f1shy1 year ago

Would be nice. But I think after hours/days of working with lisp, the brain starts to see it as sweet expressions. That is why all attempts to move away from s-exps don't get traction: anyone who starts doing it pretty quickly discovers it is really not needed.

lproven1 year ago

I think this is true for the small percentage of people who get through that initial stage -- but it excludes the (I suspect) majority who just bounce off it.

I just bounced off it, and I have tried quite hard, repeatedly.

Idea: for the rest of us who can't simply flip syntax around in our heads, there should be an infix Lisp that tries to preserve some of the power without the weird syntaxless syntax.

There are of course several, of which maybe the longest-lived is Dylan:

https://en.wikipedia.org/wiki/Dylan_(programming_language)

... but instead of Dylan's Algol- or Pascal-like syntax, do a Dylan 2 with C-style syntax?

pjmlp1 year ago

In what form?

We have Dylan, Julia, and a couple of other attempts at the matter.