> Python ... the language is nice for ... lacking surprises
Really?! The number one thing I dislike about Python is that it is chock-full of surprises, especially if you know more regular languages like the MLs (e.g., SML, F#) or the Schemes (e.g., Scheme itself, Racket). The scoping rules alone are a minefield.
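To make the scoping complaint concrete, here are two classic surprises (a minimal sketch of my own, not from the original comment):

# Closures capture the loop variable itself, not its value at each step.
fs = [lambda: i for i in range(3)]
print([f() for f in fs])  # prints [2, 2, 2], not [0, 1, 2]

# Assigning to a name anywhere in a function makes it local *everywhere*
# in that function, so the print below fails instead of reading the global.
x = 1
def bump():
    print(x)  # UnboundLocalError: local 'x' referenced before assignment
    x = 2
try:
    bump()
except UnboundLocalError as e:
    print(e)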
If one knows F#, for example, going to Python is a step back in basically every way unless you need a package that is only in Python (somewhat rare).
> Academic functional languages don't have most of those, and academia has no right to complain given that they've done little to rectify this.
It's an interesting stance, given Guido van Rossum seemingly did everything he could to avoid what came out of academia.
For F# vs Python, I cannot think of anything that Python does better than F#, and F# has several features that Python doesn't. Things that come to mind in F# are unhampered lambdas, clear and consistent scoping rules, cleanly separating binding from assignment, pattern matching built into the language, discriminated unions, built-in records, immutability by default, actual concurrency, piping, embracing the imperative, OOP, and functional paradigms all at once, performance, modules, type inference, units of measure, etc.
They share many things, like whitespace sensitivity, notebook implementations, and relatively clean syntax, but even on the things they share, I'd give F# the edge.
Isn't this like comparing apples and oranges? The two were designed and built decades apart (Python in the late 80s at CWI in the Netherlands, F# in 2005 at Microsoft Research in Cambridge).
F# seems to be a language where things that are third-party add-ons in other languages are built into the standard off-the-shelf offering. My biggest takeaway when I looked at F# is that it seemed better suited to DB work than C#.
I think the major thing is that F# did not ignore the things that came before it, whereas Python did. While Python did come about in the early 90s, it had already ignored things like Lisp, Scheme, Smalltalk, Erlang, and Standard ML (SML). As it evolved into the 2000s and 2010s, it continued to ignore developments. Compare that to Clojure, Elixir, and F#, and there's a stark difference. All three of those languages were designed to be highly pragmatic rather than academic, yet all three stood on the shoulders of what came before them. If you read interviews with van Rossum, he shows a complete unwillingness to accept the functional programming paradigm as pragmatically useful and basically rejects it wholesale. In his Slashdot interview, he explicitly described functools as "a dumping ground for stuff I don't really care about :-)".
It's frustrating to me, because if Python had adopted at least some of the things that SML, then OCaml, and then F# were doing, it would have dramatically elevated the language. The interview, in which he goes on to state "Admittedly I don't know much about the field besides Haskell", was in 2013; by that point F#, Clojure, and OCaml had already been around for a while, Erlang had been released under a permissive license, and Elixir had actually been released the year before. Further, even before that, Haskell was hardly the only functional language: SML, Scheme, and even Racket (then known as PLT Scheme) had been around for a while.
Python is a wolf in sheep's clothing to me. It has a deceptively simple look on the surface, but it is actually quite a complicated language once you get into it. I've done concurrent programming my entire programming life in F#, Elixir, and LabVIEW, and I simply do not understand how to do concurrency in Python in a reliable and composable way (for one, there are several different implementations of "concurrent" programming in Python with various impedance mismatches and incompatibilities). I.e., it's complicated.
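For a flavor of the impedance mismatch: blocking code and asyncio code don't compose directly, so you end up hand-bridging the two worlds. A minimal sketch (blocking_fetch is a made-up stand-in for any synchronous library call):

import asyncio
import time

def blocking_fetch():
    # Stand-in for a synchronous library call with no async variant.
    time.sleep(0.1)
    return "data"

async def main():
    # Calling blocking_fetch() directly here would stall the event loop,
    # so it has to be shipped off to a thread pool and awaited instead.
    loop = asyncio.get_running_loop()
    result = await loop.run_in_executor(None, blocking_fetch)
    print(result)

asyncio.run(main())

And that's before mixing in threading, multiprocessing, or the GIL.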
There wasn't much before Python either. I know some people who think OS/2 was better at multitasking than Windows, but the popular market didn't seem to think so.
> (for one, there are several different implementations of "concurrent" programming in Python with various impedance mismatches and incompatibilities). I.e., it's complicated.
I don't have a problem with any of the sync objects, because they each have strengths and weaknesses which are relevant to whatever the code is doing.
Speed of sync objects is one factor, but there are others, like the handling of shared data between two or more apps.
So are you saying F# removes the ability to choose sync objects?
Half the problem I find is just keeping up with different terminology used in various parts of the world.
Isn't this a CPU optimisation thing, i.e., code and languages working better for some instruction sets than others?
One of the languages I've used in the past always performed best on certain AMD CPUs, but that was back when we could choose which CPU the compiler should optimise for, i.e., 286, 386, etc.
I'm also aware of backroom shenanigans going on, with some languages and tools working better on some CPUs.
Tail call optimization has nothing to do with optimizing for different CPUs; it's about dropping a function's stack frame once it is only evaluating its return expression and the frame isn't needed anymore.
Say main calls f, and f's last action is to call g (a tail call). In a language implementation that doesn't optimize tail calls, the stack would look like the following after the call to g:
g
f
main
In a language implementation that does optimize tail calls, the stack would look like this, because the result of f is whatever the result of g is, so f's frame is no longer needed:
g
main
If a language implementation doesn't optimize recursive tail calls, the following code will quickly overflow the stack and the program will crash:
def loop():
    # do something...
    loop()
In a language implementation that does optimize recursive tail calls, this code can run forever because loop's stack frame gets replaced with the stack frame of the new call to loop.
The reason people want recursive tail calls optimized is at a much higher level than anything to do with the actual CPU instructions being used: they just want a way to write recursive functions without worrying about the stack overflowing.
I have never seen this referred to as anything other than tail-call recursion, tail-call optimization, etc.
Languages like Python make writing a simple loop like this impossible:

def loop():
    ...  # whatever
    loop()
Python will hit its maximum recursion depth and raise a RecursionError.
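Concretely, the usual workaround for that limit is a hand-written trampoline, which is exactly the ceremony tail calls would eliminate. A sketch (nothing the language provides; trampoline and countdown are names I made up):

def trampoline(step):
    # Keep calling until a step stops returning another callable.
    while callable(step):
        step = step()
    return step

def countdown(n):
    if n == 0:
        return "done"
    return lambda: countdown(n - 1)  # hand back the next step instead of recursing

print(trampoline(countdown(1_000_000)))  # finishes; no RecursionError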
Why is this important? Like I said, it makes looping very easy. For example, actors can be almost trivially implemented in languages with tail-call recursion.
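In Python, without tail calls, the equivalent receive loop has to be written imperatively. A minimal thread-and-queue sketch of the idea (the actor and its messages are made up for illustration):

import queue
import threading

def actor(inbox):
    # With tail calls this would be a recursive receive loop threading its
    # state through the recursive call; here it has to be a while loop.
    state = 0
    while True:
        msg = inbox.get()
        if msg == "stop":
            break
        state += msg
        print("state:", state)

inbox = queue.Queue()
worker = threading.Thread(target=actor, args=(inbox,))
worker.start()
for n in (1, 2, 3):
    inbox.put(n)
inbox.put("stop")
worker.join()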
It’s not in Python because like most things in Python, van Rossum doesn’t like it because <reasons>.
There's little point in having full stack traces when the data going in and out of a tail-call loop is immutable, so you only really care about the current call of the function.
Yes, the different terminology is a reflection of so many new entrants going for what is easy in the short term instead of learning the theory of their industry and thereby learning better approaches that are not 'immediately' obvious.
This preference for what is 'easy to get started with' over learning theory explains the popularity of Python and JavaScript, and at the same time why Python and JavaScript are littered with problems that were already solved elsewhere, and why the field of knowledge gets cluttered with reinvented terminology coined by people who never learned the original terms.
And yet, people are using Python for much more than that, including plenty of things that never touch NumPy, Pandas, or machine learning libraries.
Also, Python isn't the only option for scientific computing. Although I'm a bit frustrated Microsoft didn't see F# as a Python competitor early on and pour resources into it like they did with Visual Studio Code or TypeScript.