
The only way it would have "benefited" would be that the web would only be developed by lisp programmers.

To the vast majority of programmers, syntax matters. C-style with brackets, or Python whitespace, or Ruby do/end: these fit the brains of the majority of programmers better. Perhaps not the majority of HN readers, but the majority of corporate devs.

Another example of this is Erlang and Elixir. Elixir adds a couple of features over Erlang, macros and protocols, but Erlang does everything else. What made Elixir take off where Erlang didn't, after decades, is that Elixir has a syntax that people are comfortable with. Erlang has a syntax that will summon Cthulhu.



To be fair, what made Elixir take off, other than syntax, is also the overall developer experience: the documentation, the package manager, a unit testing framework out of the box, a web framework initially inspired by Rails, etc.

Though slightly off topic, it is worth mentioning that the Erlang and Elixir communities support each other very well. For example, not only is Elixir built on top of Erlang, but Erlang also adopts some things from Elixir: Elixir's monadic `with` expression inspired `maybe` in Erlang, and starting with OTP 27, Erlang uses ExDoc, introduced by Elixir, to generate documentation.


Is it an innate property of humans that the curly-brace style is more natural? I wonder if in an alternate universe where Lisp took off as the browser language people would find it more natural instead. It seems like somewhat of a chicken-egg problem.


I think it's innate that having differentiated syntax for different types of grouping is natural. Look at mathematical papers where people will introduce new brackets with new meanings. (Indeed, look at the entirety of QM, with its bra-ket notation, for a clear, simple case.)


Some Scheme and Lisp dialects have that. For example, Racket often uses square brackets instead of parentheses for things like clauses of a cond expression, and Clojure uses square brackets for vector literals and curlies for hash map literals.


> "Look at mathematical papers where people will introduce new brackets with new meanings"

Common Lisp leaves brackets like {} and [] to the user (aka the developer). It supports "reader macros", where the user can extend/supersede the syntax of s-expressions.

So specialized tools/libraries/applications can introduce these brackets for their own use. Examples are embedded SQL expressions, notations for frames (special objects, a kind of mix of OOP and logic), grammar terms, etc.

Thus it explicitly supports the idea of "people will introduce new brackets with new meanings".
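
For illustration, a minimal sketch of such a reader macro (making [a b c] read as a list is just an example use, not anything standard):

  ;; Make [a b c] read as (list a b c).
  (set-macro-character #\[
    (lambda (stream char)
      (declare (ignore char))
      (cons 'list (read-delimited-list #\] stream t))))

  ;; Make ] a terminating character, behaving like ).
  (set-macro-character #\] (get-macro-character #\)))

After that, [1 2 3] reads as (list 1 2 3).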


Because Lisp is trivial to parse, it's easy to make an extension for VS Code that shows/edits Common Lisp or Scheme looking completely different; you can make it unrecognisable. If the () are the only thing bothering people, then this is very simple to resolve. It can hardly be the only thing, though. It's also easy to build operators similar to the ones you have in Python etc., like filter, map and so on, so you don't have to apply recursion. These are there already, but you can build them yourself in a few hours.
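
For instance, a rough sketch of rolling your own filter in Common Lisp (my-filter is a made-up name here; the standard library already has remove-if-not):

  ;; A hand-rolled filter, akin to Python's filter().
  (defun my-filter (pred lst)
    (cond ((null lst) '())
          ((funcall pred (first lst))
           (cons (first lst) (my-filter pred (rest lst))))
          (t (my-filter pred (rest lst)))))

  ;; (my-filter #'evenp '(1 2 3 4 5 6)) => (2 4 6)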

So it’s probably just what people learn first, plus a lack of ‘marketing’ or negative PR ("there are no libraries or ecosystem!" was the thing that least bothered me about CL, but people with npm leftpad experience seem bothered by it).

It’s interesting, as I've worked in almost everything in production: C/C++ (including the MS 90s flavour), Delphi, VB, Perl, PHP, Java, C#, Haskell, F#, Common Lisp, Erlang, TS/JS, Python, Ruby, asm (Z80, ARM, x86), and I simply have not had a better overall experience than CL. The others are better at some things, but as an overall experience, CL just is a pleasure.


The curly braces themselves are 100% irrelevant, as evidenced by the many, many successful and well-liked languages which don't use them, including Python, which is in the running for the most-used language these days. They're an implementation detail.

What's closer to innate is the Algorithmic Language, Algol for short, the common ancestor of the vast majority of languages in common use (but not, notably, Lisps).

Algol was designed based on observational data of how programmers, who had to somehow turn their ideas into the assembler to run on machines, would write out those ideas. Before it was code, it was pseudocode, and the origins predate electronic computers: pseudocode was used to express algorithms to computers, when that was a profession rather than an object.

That pseudocode could have been anything, because it was just a way of working out what you then had to persuade the machine to do. But it gravitated toward a common vocabulary of control structures, assignment expressions, arithmetic as expressed in PEMDAS style, subroutine calls written like functions, indexing with square brackets on both sides of an assignment, and so on. I revert to pseudocode frequently when I'm stuck on something, and get a lot of benefit from the practice.

So I do think that what's common in imperative languages captures something which is somewhat innate to the way programmers think about programs. Lisp was also a notation! And it fits the way some people think very well. But not the majority. I have some thoughts about why, which you can deduce an accurate sketch of from what I chose to highlight in the previous paragraph.


> Algol was designed based on observational data of how programmers, who had to somehow turn their ideas into the assembler to run on machines, would write out those ideas. Before it was code, it was pseudocode, and the origins predate electronic computers: pseudocode was used to express algorithms to computers, when that was a profession rather than an object.

I believe you, but do you have a source for this? I can't find papers on how they chose to develop the syntax of Algol in the beginning.


Does decades of empirical evidence not prove that people are more comfortable with imperative, curly brace programming over s-expressions? It's not a chicken and egg problem. The egg has hatched and nested parentheses lost.


You may be right, idk, but I want to point out that you’re conflating two orthogonal concepts: S-expressions and imperative vs. functional programming.

There are lisp dialects that are very imperative, for example elisp, but they still use S-expressions. Historically they might have been considered “functional” because they have first-class functions and higher-order functions like mapcar, but nowadays practically every modern programming language (except go!) has these.

The thing all lisp dialects have in common is not where they land on the imperative vs. functional spectrum, but rather the fact that the syntax is trivial and so it’s easy to write powerful macros.
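
A tiny sketch of what that looks like in Common Lisp (the WHILE macro here is just an illustration, not part of the standard):

  ;; Code is just lists, so a new control structure is a small
  ;; transformation over those lists.
  (defmacro while (test &body body)
    `(loop (unless ,test (return))
           ,@body))

  ;; (let ((n 0)) (while (< n 3) (print n) (incf n))) prints 0, 1, 2.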


I think the simple uniform syntax is the main reason why Lisp never became popular.

Code is communication, and communication needs redundancy for error correction. You can see it in natural languages, and it makes sense to have it in programming languages as well. Using different kinds of syntax for expressing different ideas is an easy way to increase redundancy without making the code more verbose.


Lisp did become popular.

Then the AI Winter killed it and people avoided it like the plague.


The Lisp machine companies killed Lisp. They were the ones who blew Lisp sky high, but they also created expensive, monstrous workstations running Lisp images requiring tens of megabytes of RAM. Developers used them for prototyping and then had to cobble something together to ship to the users, who had hardware like IBM PC machines with less than a meg of RAM.

Today's cruft, like your Python, JS, whatever, would not stand a chance in the world of the 1980s on that hardware.

It's amazing how far they were able to bloat up Lisp while continuing to peddle it commercially.

Leaner Lisps running on small systems existed all along, but they couldn't rescue Lisp from the associations brought about by big Lisp.


>would not stand a chance in the world of the 1980s on that hardware

This is what fascinates me about Unix: they created an OS which works with text and processes, as opposed to binary structures and function calls, when computers were hundreds of times slower. Even today the overhead of process creation and serialization for pipes is not negligible; how the hell did they manage to do it in the 1970s?


Clojure has all those other braces also. They’re just used for data structure literals rather than blocks of code.


Or as I tell my colleagues who try to push for more abstract syntax: do you want your brain to do compilation each time you read something, or just have verbose text giving hints at each line?

It's weird that people prefer reading implicit text.


Isn’t this a big reason we have syntax highlighting? You can use color and styling to give you those hints that are otherwise implicit in text.


Decades of empirical evidence prove that people are more comfortable with functional, reactive, begin/end delimited programming, i.e. Excel.


Millennia of empirical evidence, through to today, show that most people are more comfortable not coding at all.


True, the majority of humans never even saw a computer.


No, it doesn’t.

What has happened in reality is that C became really popular, and then all the people designing languages that they wanted to be popular (rather than experimental, boundary-pushing, etc.) obviously chose a syntax familiar to most programmers, i.e. a syntax like C's.

Further, one can disprove that the syntax is particularly important by simply pointing to Python, which became immensely popular, despite a lack of curly braces and, even worse, significant whitespace, simply because colleges and bootcamps decided it would be a good language to teach programming to beginners.


Arguably Python and C are much more similar to each other than either is to a Lisp.

I would argue the important part in the former two is the blocks, which sort of get lost in the homogeneity of Lisps. Whether a block is marked with curly braces or indents doesn't matter much; what matters is that blocks look different from ordinary expressions. Of course well-formatted Lisp code tries to indent as well, but there is still a lot of visual noise there, making it harder to visually inspect the code, I would guess.

Of course familiarity with a given way is significantly more important. We pretty much learnt the non-intuitive notation of math; to Chinese people their writing system is the intuitive one; etc.


Hmm, can't find the paper (mostly clutter from language bootcamp results) but around a decade or so back there was an education research project that concluded that teaching SQL first, rather than any imperative language (regardless of punctuation), was better for getting students to develop reasonable mental models for computing. (Unfortunately without the reference I can't address what the criteria for "better" were - but "what people get paid to do" isn't really proof of comfort at any level...)


I think it's quite telling that almost all of the innovations in lisp (garbage collection, first class functions, repl etc) have been absorbed into more popular languages except for s-expression syntax, which remains a small niche despite many great implementations of s-expression based languages.


Because as soon as you adopt the s-expressions, what you get is no longer <language>, but Lisp itself. Something like this:

  static char _getch() {
    char buf;

    if (read(0, &buf, 1)) return buf;

    return '\0';
  }
would become:

  (define _getch ()
    (declare static)
    (return-type 'char)
    (let ((buf (char)))
      (if (read 0 (& buf) 1)
        buf
        "\0")))


No. There is a good GitHub gist rant I can't find anymore, but if we call every AST in the form of s-expressions Lisp, then is anything Lisp? A programming language has to have an associated evaluation strategy; otherwise it's just data. What you wrote only makes sense to execute as C code, which, sure, you can write a compiler for in your given Lisp as well (but you can write a C compiler taking a C AST in any other language, so it's not special at all).


This one? https://gist.github.com/no-defun-allowed/4f0a06e17b3ce74c6ae...

It also responds to a few parents up "almost all of the innovations in lisp [...] have been absorbed into more popular languages" - pervasive interactivity hasn't even been taken up by some "Lisps", let alone has it been absorbed outside Lisp.


Yes! Thanks for digging it up for me! I can’t find anything on google for the life of me since they switched to vector search, even though I used to be able to find some obscure blog post..


Right. From which we can infer people like many things about lisp except for the syntax.


Does over a century of empirical evidence not prove that people are more comfortable with keyboards whose top row is laid out "QWERTYUIOP"?


Has there ever been research on this? Perhaps this situation has come about because the schools people must go to to get the programming jobs only teach the Javascript way? It seems circular logic to say that the current paradigm must be superior for the fact that it is the current paradigm. Is it possible that there are other reasons it reached that status?


does n=1 count? :)

some time ago I tried Racket, and just no. recently I tried Scala ZIO HTTP, and yes.

Maybe it's the types? Maybe it's the parens. But probably both. I cannot really recall my experience, just that manipulating code was ridiculously clunky. My assumption was that the IDE would manage the parens for me, and that when I'm moving something somewhere it would figure out if I messed up the parens... and... no, nothing. I had to balance them by hand.


One reason Emacs is popular for Lisp programming is that the paredit package (or its newer competitor smartparens) does basically exactly what you describe: structural editing of sexp-based languages.


There's also parinfer, which does pretty much everything for you. I've heard some people don't like it because they dislike auto-insertion of text. Maybe that's more like what OP expected.


No, because people who start in programming do not go to a syntax comfort clinic, where they are tested, and then assigned to a programming language.


It only proves that those languages are the most learned because they are the most popular in industry.

It says nothing about what makes a language easy to learn.


I would argue that imperative programming is most natural - it's what everyone gravitates to in the beginning. Then, at a sufficient level of complexity, a programmer gravitates to solutions like OOP or FP, but there's an obvious trade off in readability there. 99 Bottles of Beer implemented with a loop is intrinsically going to be easier to read than an implementation with tail recursion, even though the latter is generally better. Lisp's inside-out parentheses style adds yet more cognitive load on top of that.

Many things are socially constructed, but not everything.


> I would argue that imperative programming is most natural - it's what everyone gravitates to in the beginning.

When 6.001 (the introductory class for which SICP was written) was launched, most of the students who took it had never used a computer before. Yes, MIT students. This was around ~1980. And in the first hour of their first class they were already doing symbolic differentiation in scheme.

I think your view of what’s “natural” is a just so story.


> And in the first hour of their first class they were already doing symbolic differentiation in scheme.

People heavily trained in maths can take quickly to languages designed to make programming look like maths, that's hardly a surprise.

I wouldn't base my assumptions about what most people find natural on the experience of MIT students taking 6.001 in 1980.

(Not to mention, 'doing' is doing a lot of heavy lifting in that sentence. I could show you an intricate sentence in French in the first hour of your first French class, but unless you came up with it yourself, are you demonstrating much learning just yet?)


But you are basing your assumptions on absolutely nothing.


I'm basing my assumptions on my own experience, both learning to code and teaching others to code, which isn't nothing to me, but may well be nothing to you. (No shade intended, that's totally valid.)

I would certainly be interested in the results of a study that put a simpler interpreter / compiler and a language reference in front of motivated non-programmers, but I strongly suspect that the amount of elegant tail recursion we'll see will be limited (and I'd very much expect there to be a correlation between that and a training in mathematics).

Imho, data comes from experiments, but experiments come from hypotheses, and hypotheses come from experience.


Yeah, mid-1980s the thing incoming students had to unlearn was BASIC, not anything with curly braces. (Source: I was a 6.001 Lab TA at the time.) Of course, the next class on the rotation used Clu, where you had to unlearn "recursion is free".


Imperative programming is probably the most intuitive, but I'm doubtful curly braces and C-like syntax are anything more than coincidence. The first programming language was Fortran, and it didn't look anything like C. This is a really old Fortran program copied from a book:

     WRITE(6,28)
     READ(5,31) LIMIT
     ALIM = LIMIT
   5 SUM=0.0
     DO 35 ICNT=1,LIMIT
     READ(5,32) X
  35 SUM = SUM + X
     AMEAN = SUM/ALIM
     WRITE(6,33) AMEAN
     GO TO 5
  28 FORMAT(1H1)
  31 FORMAT(I3)
  32 FORMAT(F5.2)
  33 FORMAT(8H MEAN = ,F8.2)
     END
Most modern programming languages seem to take inspiration from C, which took inspiration from BCPL, and that from Algol. Others took inspiration from Algol directly, like Ada, or Lua. And Python has indentation-based block structure, rather than having blocks of statements delimited by braces or an "end" keyword.


I always liked Pascal's BEGIN and END statements instead of curly braces. There is also Basic, where the blocks are built into control flow statements, like FOR I=1 TO 5 [code here] NEXT I.

I'd argue a lot of programming language evolution is influenced by the capabilities of our IDEs. When you code in a text editor, the terse syntax of C is great and brings advantages over the verbosity of Pascal, Basic or god forbid Cobol. Once your editor does auto-indentation the braces seem redundant and you get Python. Smart completions from IntelliSense are essential to efficiently writing C#, and now that LSP has brought that to every IDE or smart text editor we have the explosion of popularity of more explicit and more powerful type systems (Typescript, typed Python, Rust). Programming languages are shaped by their environment, but the successful ones far outlive the environment that shaped them.


It really depends on your mindset. I grew up with math (composable operators, no side effects) and a lot of immutable + virtual operations software (Maya, Samplitude, Shake, Combustion)... so to me imperative programming, with all the control flow, subtly changing state, time dependencies, and coupling of concerns, was almost instantaneously a fatal issue.

Backus also shifted away from imperative-inspired languages to design the FP/FL languages (I thought they were contemporaries of BCPL, but they came 10 years later, later than APL), even though he contributed to FORTRAN directly.


You know, you're probably right. It's just been so long since I was introduced to programming languages that I had almost forgotten (though I'm probably younger than you).

I remember learning JavaScript as a kid (for some class) and trying to get used to the mutable variables, having to mutter to myself "Okay, here, let x be 4. After this line, x is x + 1, which is 5, a new value." From there, eventually thinking things like: "After every loop, x changes to be itself plus 1. So after the loop, x will be its value before the loop plus however many times the loop ran." Things like that. Basically informal Hoare logic without realizing it.

I had almost forgotten, because I then went years before I programmed again, and the language I learned was C, which was probably easier because I was already familiar with while loops and mutable variables.

Maybe it would have been equally intuitive to learn a functional language first. It's probably no more intuitive to mutter that under your breath versus stuff about the type system and equational reasoning.

On the other hand, it seems easier to get beginners interested in programming with an imperative approach. In our assignments in that class using JavaScript, we used libraries to make little games, which imperative programming seems better-suited for.


> I would argue that imperative programming is most natural - it's what everyone gravitates to in the beginning.

Why do you believe this is anything more than an historical accident?

For example, it wasn't what Alonzo Church gravitated to when he invented the lambda calculus in the 1930s, before any programming languages or indeed general-purpose computers existed.

> 99 Bottles of Beer implemented with a loop is intrinsically going to be easier to read than an implementation with tail recursion

First, you don't need to use explicit tail recursion. See e.g. https://99-bottles-of-beer.net/language-haskell-1070.html

Second, this sounds like unfamiliarity, not anything inherent. Why is it "intrinsically easier to read"? For a tail recursive version, the main tail recursive function would look like this in Haskell:

    _99bottles 0 = printVerse 0
    _99bottles n = do
        printVerse n
        _99bottles (n - 1)
In fact, with a bit of experience you might write this as:

    _99bottles 0 = printVerse 0
    _99bottles n = printVerse n >> _99bottles (n - 1)
It's only less easy to read if you're completely unfamiliar with the concepts of pattern matching and recursion. But the same is true of any programming language.

Given the above, what's a "for loop" and why would you need one? Sounds complicated and unnatural.


CPUs are imperative.


So what? That doesn’t say anything about what’s “most natural” for a human to express algorithms in. No-one writes machine code directly.


OOP is imperative programming. It's just function calls where the first parameter is to the left of the function name, after all.

A better name for "non-OOP" programming is procedural programming, where you organize code in long blocks that go straight down, code duplication is accepted vs jumping all over the place, etc. Honestly underrated. It can be quite easy to understand.

Strictly-evaluated FP is also imperative. The only really different languages are the ones with different evaluation systems or that can do things besides evaluate - people like to say Haskell is the best here but I think it's actually unification languages like Mercury. Maybe even SQL with transactions.


I'd argue an FP implementation with map (something like `[99..1].map(|n| f'{n} bottles of beer ... {n-1} bottles of beer on the wall').join('\n\n')`) is inherently as readable as the for loop, and not really more complex.

There are lots of great parts in FP, and for the last ~10-15 years imperative programming languages have made a lot of effort to add them to their syntax. You just need to leave out the more dogmatic parts that make FP popular in academia.


Hehe, it's easy if you ignore half the song, the singular for n=1 and the whole n=0 case! (Not that we're talking about rocket science if you don't, but c'mon, oranges to oranges!)

I agree with you otherwise though.


Assembly is imperative, so there's a lot to be said for a language that mimics how the computer actually works. Lisps always leave me saying, "oh, that's clever."


> even though the latter is generally better

Why is tail recursion better generally? I'm not familiar with FP very much, but it feels like loops more closely resemble the way computers execute them than tail recursion.


It's more concise and illustrates the common subproblem. Loops make you model a state machine in your head, which I'd rather leave to the computer.


Very fair question. It may seem surprising, but loops (especially `for` loops) don't really reflect the underlying machine code very well. There is no distinct loop concept in machine code; instead, there are ordinary instructions followed by conditional jumps (if {predicate} go to {memory address} and keep on executing from there), which may or may not return to an earlier point, and will thereby conditionally repeat. Tail recursion, provided it's done in a compiler that understands tail recursion optimisation, will in some ways mirror this better than a `for` loop. (Though an old school, C-style 'do {code} while {predicate}' - note the order, and lack of any loop state variables being created or modified - is closest to the machine code).

Loops, while not bad per se, do have a lot of foot-guns. Loops tend to be used to make all sorts of non-trivial changes to outside state (it's all still in scope), and it can be nightmarish to debug errors that this may produce. Let's say you're looping over chickens in your upcoming Hen Simulator 2024, and you call a function from inside your chicken loop to update the henhouse temperature, which has a check to see if the temperature has gotten too high, which might result in a chicken overheating and passing on into the great farm in the sky, which changes the amount of chickens remaining, but wait, isn't that what you're looping over? Uh oh, your innocuous temperature update has caused a buffer overflow and hard crash. In a rare and possibly hard to reproduce case. Have fun debugging!

Generally, functional programming prefers encapsulated solutions - arguments go in, results come out, nothing else happens - which makes it easier to reason about your code. The most common replacement for loops is something like map, which just applies a lambda to each member of a list. This should make it somewhat harder to achieve the mess above (the other chickens shouldn't be in scope at all, so your temperature update function should complain at compile time).

With tail recursion, you could make a function that takes a list of chickens to update. You pop the first chicken, update it, and recur on a list of the remainder of the chickens. Because this needs to be a function (so you can recur), you have control of the arguments, and can determine what exactly is passed to the next iteration. You can't overflow the buffer, because you're passing a new 'remaining' list every time. This is also where you can get a little clever - you can safely change the list at will. You can remove upcoming chickens, you can reorder them, you can push a new chicken into the list, etc. If a hen lays an egg mid-loop, it can be updated as part of the same loop. Plus you have the same scope safety as you do with map - you can't do anything too messy to the outside state, unless you specifically bring it in as an argument to the function (which is a red flag and your warning that you're doing something messy with state).
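
A rough sketch of that last part in Common Lisp (update-chicken is a hypothetical helper, and tail-call elimination is an implementation feature, e.g. in SBCL, rather than something the standard guarantees):

  ;; Tail-recursive update over a flock; ACC accumulates results.
  (defun update-flock (chickens &optional acc)
    (if (null chickens)
        (nreverse acc)
        (let ((updated (update-chicken (first chickens)))) ; hypothetical helper
          ;; We decide exactly what the next call sees: drop, reorder,
          ;; or cons a freshly laid egg onto (rest chickens) here.
          (update-flock (rest chickens) (cons updated acc)))))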


This is completely socially constructed.

Lisp was once a very popular introductory programming language and students learned it just as easily or easier than any other language.


Then why does this web page use indentation to clarify who's replying to whom, instead of {}s?


I don't know if it's innate but it's what we have. Lisp has been around about as long as programming, it's had plenty of time to catch on, it hasn't.

Maybe innate, maybe it's an offshoot of teaching math in an infix style, 1 + 2 vs. + 1 2.


I don't think it's been tested at all. For people who took and finished a course in Lisp as their first programming language, how many "hate parens"?

I have no trouble with Lisp's parens, I like them. What I never liked, though, is that the first item in the list was an operator, a verb let's say, and the rest were the nouns; whereas you could also have nested lists, say of numbers, where there were no operators. Never felt right (not that I can think of a better way, and not worth adding more parens).


But good college math departments teach reverse Polish notation; i.e., Hewlett-Packard over Texas Instruments. It’s demonstrably more advanced / efficient.


Lisp became very popular, then died off rapidly due to association with the AI Winter.


C-like syntax is brutally hostile to programming beginners. There is not a shred of anything natural about it.


There's nothing natural about programming, because no one likes to be that formalized in their thinking process, especially with things that should be common sense (although ambiguity is still an issue). It's the recursive definition that gets people. Especially when pointing out that the things the computer can do form a very small set. It's just that they can do it very fast.

You can see that when observing novices programming (without Stack Overflow or similar help). They often assume that it will get done (magically) as soon as they call that function. And their code organization reflects ad hoc thinking instead of a planned endeavor.


The formality of programming is unnatural to many (though the impulse to formality and robustness, as displayed in logic & math, is clearly millennia old, however niche), but a fair bit of it is as natural as language itself, or magic.


That alternative universe was the early 1980s, when Lisp was very popular to learn due to being considered the best language for AI.


"The only way it would have "benefited" would be that the web would only be developed by lisp programmers."

Considering the state of the web I do not think this is making the argument you intend.


I agree with most of what you said here but I want to emphasize that this is not necessarily a good outcome. Fitting the brains of corporate devs is not a metric to measure if your goal is to make the best tool for the job - the majority of corporate devs are extremely mediocre at their job even with a language that they’re not scared of.

All that to say, I completely emphatically agree with the original comment. The world would have been so much better off with Scheme as the language of the web.


If the "job" is to make lots and lots of software, even if most of it is mediocre, then the best tool is what will enable millions of mediocre developers to develop, not just thousands of elite developers.


> To the vast majority of programmers, syntax matters. C-style with brackets, or python whitespace, or Ruby do/end, these fit better the brains of the majority of programmers.

You have no idea whether this is actually true, or whether people have just fit their brains to what is out there.

The idea that programming language syntax fits people's brains rings untrue for anyone who has watched beginners struggle with it, or remembers being one.


Plus, add a heaping tablespoon of survivorship bias.

1. Many people try programming.

2. The vast majority of the people who try programming are subject to external forces that guide them to whatever they learn and use.

3. Out of these, a certain fractions stick with it and are found programming in the long run, even working in it.

We could easily conclude (probably quite wrongly) that the popular languages turn people away from programming, except for a few weirdos for whom they click.


When Lisp was popular, developers learned it as easily or easier than any other language.


>The only way it would have "benefited" would be that the web would only be developed by lisp programmers.

Millions have learned javascript because it is the technology of the web. Are they better off?

So many people have been introduced to programming, computer science, and programs through the abstractions provided by javascript, and that sucks


> To the vast majority of programmers, syntax matters.

And yet, when you tell them the reasons why some other syntax than their Java/PHP/Python syntax would be better, they usually counter with "It's just syntax." or "Every language can achieve the same result, why care so much about syntax?" or similar.

> C-style with brackets, or python whitespace, or Ruby do/end, these fit better the brains of the majority of programmers. Perhaps not the majority of HN readers but the majority of corporate devs.

I would need a source for that.

I think most programmers' brains have simply not been exposed to other syntaxes much, or only much too late, when their brains had already calcified around the C-style syntax. Or they don't actually care.


Significant white space would have been a disaster on the web.

As much as everyone poops on JS, it is a very forgiving language for embedding.


JS is fine for scripting, sprinkling a bit of interactivity on a page. The issue is when you want to build whole applications out of it, and then the last thing you want is forgiveness. You want the compiler and linter complaining loudly.


There is so much begging the question in this thread that it’s absolutely mind-boggling.


People aren't born familiar with C-style syntax. Quite the opposite: many people struggle with it for a long time! Back in the day it was super common to get frustrated because you missed off a semicolon or something. Nowadays IDEs probably help, but what's the point of syntax that the computer could write for you?


Very much a personal anecdote, but I spent about a month earlier this year seriously learning various Lisps (CL, Racket Scheme, Clojure). I stopped when it clicked for me - Lisps are a mess. Everything I wanted from Lisp I found in Haskell.

I'm reasonably confident that all the anecdotes you hear about 10x improvements from switching to Lisp are just programmers learning about functional programming and good design patterns for the first time. But those aren't contingent on using a Lisp, and I'd argue using Lisp brings an enormous amount of cruft and baggage that makes FP seem far more alien and difficult than it needs to be.


The 10x seems to mostly have been from programmers switching from extremely low level languages like C++ to something far higher level. Paul Graham also talks about macros in his well known essay, but I honestly think a lot of the value one gets from Lisp can be found with Python. There are a lot of things you don't have to worry about like manual memory management and so on. Python isn't as fast as lisp or as beautiful (opinion), but the ecosystem is very impressive and the community isn't as fractured as the lisp community (for example see the bipolar lisp programmer essay).

I don't think FP by itself is that massive of a win despite what some dubious studies or zealots say, but it's certainly better than enterprise Java. I've read my fair share of horror stories of Haskell in production too.


> Paul Graham also talks about macros in his well known essay

With all due respect, he is ironically probably the biggest blub programmer.


How so?


Ultimately, programming languages are tools, and different tools are appropriate for different jobs.

There's nothing stopping you from writing a massively-scaling e-commerce site in Verilog and running it on an FPGA, but it - uh - probably isn't the soundest course of action.


Yep. I'd agree with that statement. Although there's a lot of tools I'd struggle to use practically anywhere.


I think the idea that Lisp was so much more productive than other languages originates from a much earlier time. But now the most important features of Lisp - like garbage collection - are commonly available in most languages.



