Re "it is a lisp" and "everything is an expression", I would like to add a bit of clarification. Or, given that you use Mathematica regularly while I was just reading surface docs (for purposes of doing some stuff with Wolfram Alpha), rather a question about whether my perspective is well-founded.
Based on my understanding of how expression evaluation works, the slightly more revealing statements would be "it is a lot of lisp macros" and "everything is an s-expression". Which means, a big mess. Let me expand:
As a functional programmer, "everything is an expression" sounds comforting, and I would expect there to be clear transformation rules for how expressions are evaluated (and, maybe, type signatures).
Instead, what you get is: "you can throw some arbitrary form of expression into this function, and it will do something with it". That is, it takes an AST as input and transforms it in some loosely specified way. There doesn't seem to be a type system, so you don't have types to guide you; instead you likely need to figure out which kinds of expressions work with which functions.
Now, if I'm wrong about this, and the functions behave consistently in what they take and how they transform it, then I'm more than open to being corrected. It is just that my high expectations (based on the marketing of the language) and the subsequent realization left me a bit bitter.
It's really, really difficult to come up with a type system for mathematics. Let's just talk about Plus, the symbol behind the plus sign. What's its type? You might say it takes a few numbers and returns a new number. But what kind of number does it return? It is capable of returning machine-precision numbers or Mathematica's custom high-precision numbers. It can return integers, rational numbers, real numbers, or complex numbers, as the case may be. It is capable of working on lists of numbers and matrices of numbers, and it returns lists of numbers or matrices of numbers. But wait: Mathematica doesn't require a list's elements to be of homogeneous type, so it can return a different type of number for each element of the returned list. It is capable of working on completely undefined symbols, much like in real mathematics you expect a teenager to be able to reason about the expression `x+x+x` and simplify it to `3x` without knowing what `x` might be. It could very well leave everything the same, for example when you add two undefined symbols `x+y` and get back `x+y`.
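A quick sketch of that polymorphism in action (the comments show what Mathematica would return):

```mathematica
Plus[1, 2]            (* 3, an integer *)
Plus[1/2, 1/3]        (* 5/6, a rational *)
1.5 + 2               (* 3.5, a machine-precision real *)
{1, 2} + {10, 20}     (* {11, 22}; Plus threads over lists *)
x + x + x             (* 3 x, simplified symbolically *)
x + y                 (* x + y, left as a symbolic expression *)
```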
So I personally think it is perhaps not productive to think about type systems and type signatures when working with Mathematica. But you can definitively think in terms of transformation rules. And Mathematica either documents these rules or makes these rules intuitive.
It is possible to put filters on function arguments, e.g., the definition
f[x_Integer] := ...
will define a rule for f[] that only matches expressions where the argument to f[] has the head "Integer". It is even possible to use arbitrary predicates:
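For example, using `PatternTest` (`?`) and `Condition` (`/;`), with hypothetical definitions:

```mathematica
g[x_?EvenQ] := x/2             (* matches only when EvenQ[x] returns True *)
h[x_ /; x > 0] := Sqrt[x]      (* matches only when the condition x > 0 holds *)
```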
This lets you sort of have type checking. It is entirely opt-in, so you have to be somewhat rigorous about its use or it does not do any good. Also, in practice, if an invocation of f[] does not have arguments matching the patterns for which you have defined it, the expression simply remains unevaluated, which can create a mess (though perhaps less of a mess than evaluating the function on input of the wrong form). The performance impact (particularly of the predicate version) is also non-zero, but in my experience the biggest performance limitations come from trying to keep your machine from grinding to a halt when a runaway expression, applied to the wrong thing, explodes in complexity and eats all of your RAM... and this helps avoid that.
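To make the "remains unevaluated" behavior concrete (a sketch with a hypothetical f):

```mathematica
f[x_Integer] := x^2

f[3]       (* 9 *)
f["abc"]   (* f["abc"]: no rule matches, so the expression is returned unchanged *)
```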
While I have found this to be very helpful for writing and debugging hairy expressions, I used Mathematica for years before I even knew this was a thing. In reality almost no one does this, certainly not with any consistency, and the situation is as bad as you fear it would be.
Lisp-style macros are actually difficult to write because the language keeps evaluating expressions until they reach a fixed point. I was able to write a quasiquote package for myself to help with that, though.
These are all valid criticisms. There is no type system, although some safeguards can be implemented through pattern matching and conditions (see the answer by @derf_ above). For quick and dirty transforms on symbolic math expressions, these are often good enough, but it is indeed a mess to use as a full-fledged programming language.
I do like that the lispy language itself closely mirrors math expressions, and that it is consistently accessible throughout the user interface. For example, the Mathematica notebook front end (the IDE) is essentially just `MakeBoxes[]` applied to the expressions, which are themselves valid Mathematica code. I tried SymPy a while ago, which I believe takes an object-oriented approach, and it was very clumsy compared to Mathematica.
Still, I would not recommend Mathematica for general programming, precisely because of the shortcomings mentioned above. By default it is also impure and not lazy (eager evaluation, although laziness can be forced on a case-by-case basis using `Hold` or `Unevaluated`).
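A small sketch of forcing laziness with `Hold`:

```mathematica
expr = Hold[1 + 1]     (* Hold[1 + 1]: evaluation is suspended *)
ReleaseHold[expr]      (* 2: evaluation proceeds once the Hold is stripped *)
```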
`docker run --entrypoint /bin/sh -it --rm the-image` IIRC