I can strongly relate to that. You can write a custom "fill" method for your array and just do "array.fill" - bam, it just works. You want numpy's "zeros"?
Scala has a strong type system, and when working with it on a daily basis I was not programming, I was thinking about types and fighting the compiler. And that is a language with a pretty decent IDE plugin. When I moved to D, it felt like a breath of fresh air. So the whole talk above about auto and types looks like subjective nitpicking to me.
In practice though, D codes fast and runs fast, as promised on the official site.
It's the Java 1.4/Java 6/Java 8 problem all over again. Big companies refuse to keep up with the base languages, and important parts of the ecosystem (e.g. Spark) fall behind the language.
Scala is a beautiful and elegant language. I really thought that it would be one of my favourite languages when I first started working with it. However, after some time I got disenchanted.
First, Scala is hard. Projects written by one person quickly turn into a deep labyrinth of functional constructs and custom architectural patterns, because Scala is so expressive and allows all kinds of twists.
Second, the barrier to entry remains high. Your more experienced colleagues will not allow you to write anything that even remotely smells of OOP. This delays the point at which you can be productive.
Third, on paper Scala tries to serve both the object-oriented and functional worlds, but in practice the latter is the unconditional standard.
Although I like the positive tone of the article, I find it difficult to agree with Scala ever becoming mainstream.
Yeah, I was also excited about Scala. Went to a conference in 2010; eventually took the Odersky course and tried it for some personal projects.
I ended up abandoning it, and for me Bruce Eckels captured a big part of why, talking about it as a "landscape of cliffs". [1] Every time I thought I knew what was going on, I tripped over something and ended up in a chasm. With much study and effort I could dig myself out of that hole, but soon enough I'd be in another one, trying to understand some abstruse bit of type magic when I just wanted to render a web page or something.
It also reminds me of the famous Kernighan quote: "Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it." [2] Scala might be optimal for some group of people, but definitely not for me, and I think not for a mainstream developer audience either.
I must say this comment reminded me of LISP and the many criticisms leveled against it: perceived as overly flexible ("custom architectural patterns"), with snobbish practitioners ("colleagues will not allow to"). But at the same time, many people find LISP (or Scala) a "superpower" that allows them to perform really well.
No one flavor of Scala will ever be a standard. You have to adopt coding styles and stick to them. Once you get to the point where you have a pattern and you're just coding by numbers, Scala isn't the problem anymore.
I was asked to join a team of Scala developers. All but one of them had started learning it about six months earlier, and I was asked to learn it while making small contributions to the project. I quickly found that there were statements that could be interpreted in multiple ways (I can't remember now what they were). Then I got stuck on something that even our seasoned, very enthusiastic Scala developer could not solve. I decided then and there not to pursue the language and quickly had myself transferred to another team. It's a difficult language, and one should spend time thinking about the problem itself, not about the language. Perhaps I'm not doing justice to Scala, since I was just a newbie, but I did not have these problems with Clojure, which is another not-that-mainstream language.
And that's the whole problem with Scala. When you work with it, you are battling the types and the compiler more often than you should. Funnily enough, it is not the first typed language I have had to write code in, yet never before did I have to do so much unnecessary dancing around and asking senior colleagues for help. I guess not all of us are capable of grasping such expressiveness at the cost of such complexity. Most of us just need simpler tools to be productive.
Going full FP in Scala with Scalaz and "tagless final" (I still have no clear idea what that is) and whatever else is hip leads to madness. Sure, you can write maximally polymorphic code. Great. But you end up doing the extractions/refactors that the type system and the compiler allow, not the ones that would help the code the most, because those are too slow (too much boxing and GC churn), or the compiler (or the library) has a bug, so you end up putting a TODO comment in the code with a link to a GitHub issue.
Or it's a feature that's not supported by Scala, so you end up with a FP spaghetti. :(
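For readers wondering what "tagless final" actually means: in a sentence, you write your program against an abstract effect type `F[_]` and let an interpreter pick the concrete type later. A minimal sketch in plain Scala, with all names invented for illustration (real codebases layer Cats/Scalaz type classes on top of this):

```scala
// The "algebra": the program only knows some abstract F[_].
trait Console[F[_]] {
  def putLine(s: String): F[Unit]
}

// A trivial effect type so the example runs without any FP library.
final case class Id[A](value: A)

// One possible interpreter, which fixes F to Id and records output.
object IdConsole extends Console[Id] {
  private val buffer = scala.collection.mutable.ListBuffer.empty[String]
  def putLine(s: String): Id[Unit] = { buffer += s; Id(()) }
  def output: List[String] = buffer.toList
}

object Demo {
  // Polymorphic program: works for any interpreter of Console.
  def greet[F[_]](c: Console[F]): F[Unit] = c.putLine("hello")

  def main(args: Array[String]): Unit = {
    greet(IdConsole)
    assert(IdConsole.output == List("hello"))
  }
}
```

The complaint above is that once everything is polymorphic in `F[_]`, the compiler and the libraries, rather than the domain, start dictating which refactorings are possible.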
IMO, it really depends on the team you join, as with most programming projects. Scala can be used in a classic mutable OOP manner, as if you were writing Java in 2005, or, if you heavily buy into pure FP libraries, it can be used basically as if it were Haskell, but IMO neither of these approaches embraces the strengths of the language. It's meant to be a true mix of OOP and functional paradigms, not an extreme in either direction. The object-oriented aspects let you embrace great OOP software architecture patterns like Domain-Driven Design, so that you can model the business domain clearly and faithfully. The functional aspects then let you implement a "functional core, imperative shell" style that's really easy to reason about and easy to make (safely) concurrent, while also having excellent, safe, descriptive types like Future/Option/Try and ADTs.
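The "functional core, imperative shell" style mentioned above can be sketched in a few lines of plain Scala (the domain and all names here are invented for illustration): pure decision-making functions in the core, side effects only at the edge.

```scala
final case class Order(id: Int, amountCents: Long)

// Functional core: pure functions, trivially unit-testable.
object Core {
  // Orders over 100.00 get a 10% discount.
  def applyDiscount(o: Order): Order =
    if (o.amountCents > 10000) o.copy(amountCents = o.amountCents * 90 / 100)
    else o

  def validate(o: Order): Either[String, Order] =
    if (o.amountCents <= 0) Left(s"order ${o.id}: non-positive amount")
    else Right(o)
}

// Imperative shell: reads input, calls the core, performs effects.
object Shell {
  def main(args: Array[String]): Unit = {
    val raw = Order(1, 20000)
    Core.validate(raw).map(Core.applyDiscount) match {
      case Right(o)  => println(s"charging ${o.amountCents} cents") // side effect here only
      case Left(err) => println(s"rejected: $err")
    }
  }
}
```

All the business logic lives in `Core`, which can be tested without mocks; concurrency and I/O concerns stay in the shell.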
It sounds like you've worked with a group of people who want to treat Scala like it's Haskell, but that's a minority community that IMO isn't really embracing the strengths of the language. I've worked on projects that go all-in on libs like Cats and Scalaz, and I agree that they're mostly unnecessary complexity that also obscure the modelling of the business domain. You're going to massively confuse newcomers with all the Applicative/Effect/Monad/Monoid/Functor/etc. talk, for next to no benefit. Actors are a bit of a different story - excellent for the 1% of times where you really need them (lots of mutable state and lots of concurrency), but 99% of the time you don't need them. Some devs are enamoured with the mathematical purity of pure functional programming, without properly considering how hard it is to understand for most other devs. If you let these types take over your company, that’s a cultural problem, not really a language problem.
> ... programmers are increasingly facing new challenges for high-level domain modeling, rapid development, and, more recently, parallelism and concurrency. It seemed to me that a unification of the traditional object-oriented model with functional programming concepts was feasible and highly desirable to address those challenges.
I completely agree, and have yet to find a better language than Scala for the above demands, assuming you really embrace the OOP/functional mix. I've been writing lots of Scala at my day job for the past ~5 years, and it's my favourite language. I've worked on backends in Scala, Java, Go, Python, PHP and Node, and prefer Scala to all of them.
I also agree with Li Haoyi - it's becoming a really solid, stable, reliable language, with improvements focused on the build tools, the compiler, and the odd language wart, without significant changes to syntax/style, which is great. It does take a while to learn, and you do have to be careful about what style you program in, but if you embrace the language's OOP/functional mix and, for your "core style", mostly stick with the standard library (Future/Try/Option/etc.) instead of going crazy with Cats/Scalaz/Akka/etc. (unless you REALLY need Akka specifically), it's an outstanding language.
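As a small illustration of the "stick with the standard library" point: `Option`, `Try`, and `Future` from the stdlib already compose cleanly without any third-party effect library (the names below are invented for the example):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._
import scala.util.Try

object StdlibDemo {
  // Option models absence, Try models failure; they chain naturally.
  def parsePort(s: String): Option[Int] =
    Try(s.toInt).toOption.filter(p => p > 0 && p < 65536)

  def main(args: Array[String]): Unit = {
    // Future models asynchrony; the same combinators (map/filter/
    // for-comprehensions) work across all three types.
    val port: Future[Int] = Future { parsePort("8080").getOrElse(80) }
    assert(Await.result(port, 5.seconds) == 8080)
  }
}
```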
It is, and C/C++ are used in so many areas that are not systems programming, right? Being the backbone of scientific Python is one example. My claim was that this is nothing out of the ordinary. In fact, I really dislike the article's somewhat evangelist attempt to target newbies. These so-called benefits existed, and still exist, in so many other languages. The only thing Rust brings is memory safety, which on closer inspection is actually an illusion, and a pretty evil one.
I would praise Rust for speed, yes. For ecosystem, yes. But that's it.
"My claim was that it is nothing out-of-ordinary". Well I guess will agree to disagree there are pretty prominent cases of very capable people trying to do a project in C++ and failing and then leveraging Rust and actually delivering.
Mozilla namely: "By 2017, Mozilla had made two previous attempts to parallelize the style system using C++. Both had failed." but they were able to complete it in Rust.
Unfortunately, in scientific computing the only place I see for D is writing custom performance-critical algorithms, but then again, for university folk it is more straightforward to do that in C++ or C, since there are plenty of code snippets lying around.
There is the excellent Mir library, but its documentation is subpar, with no tutorials or examples either. There is Netflix's Vectorflow, a small deep-learning library, but it is CPU-only and supports only feed-forward networks, so it covers one specific narrow case. There is TSV-utilities, the fastest-on-earth CSV parsing toolkit, from one of the eBay engineers, but I only learnt about it when I started looking through the resources on the D website; again, no tutorials. The tools are there, but using them requires more time investment than the alternatives.
I actually have quite a lot of tools available that I've built up over the years. I wish I had more time to work on it. I have a matrix algebra library that in my opinion is extremely convenient to use that I'm preparing to release complete with documentation in the next couple months.
I have everything in R at my disposal, because it's easy to embed an R interpreter inside a D program. Note that this does not necessarily give poor performance, because the R code calls into compiled code, or you can call the compiled code directly through the R interface.
For numerical optimization, I call the R optimization routines directly (i.e., the C library functions).
For basic model estimation, I call the Gretl library.
For statistical functions (evaluating distributions and such) I can call into the R API or Gretl.
For random number generation, I can call into Gretl, R or GSL. I have parallel random number generation code that I ported from Java.
For machine learning (I do a limited amount like lasso) I call R functions. The overhead with that is so low that there's no point in not just calling the R functions directly.
So things are there. It's just a matter of finding the time to turn it into something others can use. Right now I'm focused on doing that with my linear algebra library.
Oh hello friend, you've piqued my interest as someone interested in D, numerical optimization, and linear algebra. I have some questions though.
How do you do numerical optimization in D? Do you somehow wrap Coin-OR's CBC C++ library or lp_solve? What does it mean to call the R functions directly? Do you have an example? I'm going to guess it won't be able to handle the massive, time-critical models I use, but I'm still curious.
How do you do linear algebra? Are you binding to BLAS, LAPACK, Armadillo? Or did you write some routines from scratch?
A couple of points to make before I give my answer. You want to check out the Mir project linked in the other comment. It's a well-designed library (though maybe lacking documentation). The other thing is that you can #include C headers directly in a D file using https://github.com/atilaneves/dpp so you can always costlessly add a C library to your project.
For optimization, I was referring to calling into the R API, which exposes the optimization routines in base R (Nelder-Mead, BFGS, conjugate gradient, and bounds-constrained L-BFGS-B). In terms of what it can handle, I guess that's entirely up to what R can handle. Here's the project page, though it looks like I haven't committed to that repo in three years: https://bitbucket.org/bachmeil/dmdoptim/src/master/
If you do try it and have problems with anything, please create an issue so I can fix it or add proper documentation.
There were two reasons for that. First, it offered a really simple LAPACK interface when I was starting out with D, and second, it offers a lot more than just linear algebra.
Is this something I'd recommend to others? I don't know. I built my infrastructure over a period of several years while waiting for my son at his many practices and activities. I also optimize for programmer convenience rather than performance at all costs. The time it takes to write correct, performant code is worth far more to me than code that runs 15% faster.
This has a lot of potential, but ultimately I'm paid to do other things, meaning those things become the priority...
This is great, please do! It would be nice to share it beyond the dlang forum too. How does it compare to scid, btw?
I like R, but I generally use it for basic stat tasks and plotting instead of Python. It would be awesome if you could share your experience of setting it up with D, in a blog post or whatever form you find useful.
I've been writing up a summary of how I use D to put on my website. Maybe this is the push I need to finish it.
About scid[0], I looked at it when I started, but it seemed to be largely inactive by that time, it didn't do what I needed, and the documentation wasn't really good enough. I was also turned off by the excessively generic nature of everything - there were just too many templates. At least that's what I recall.
Just do a template:

import std.algorithm.iteration : each;

void zeros(T)(ref T arr) { arr.each!((ref a) => a = 0); }

someArr.zeros;