> What's an example of a language that's as bad as Perl, that's used as widely as Perl was, that's still in use today?
PHP? I don't know how widely it's still used, but I'd guess it's more widely used than Perl. Also, PHP is not "as bad" as Perl. It's much, much, much worse. It's Perl without the charm.
I was writing a comment asking if it was really easier. Then I took a look at Cython. Yes, this looks easier than Perl's XS, which I have some experience with! There are ways to do something similar in Perl these days, notably https://metacpan.org/pod/FFI::Platypus. But these are relatively new (starting in the 2010s) compared to the history of Perl, and Cython goes back to the early 2000s.
Somewhere in the continuum from SWIG through XS and on to Platypus there are also the Inline modules these days. They allow one to put inline sections of other languages into Perl code the way many language tools used to allow one to inline assembly into C or Pascal code.
There are Inline modules for languages other than those listed here, a lot of them as high level as Perl (including Raku, and even another Perl system for some reason).
True. For whatever reason, these never displaced XS. For wrapping C libraries in particular, it's not clear to me how much Inline::C helps. You're still stuck using a lot of Perl C API calls, AFAICT, which I think is the biggest challenge of using XS (I still have nightmares from trying to figure out where and when to add `sv_2mortal` in my XS code).
One or two calls into a library with a simple C interface isn’t that bad with Inline. You just use Inline to handle the Perl to C to Perl part, and actually do the interfacing in the inline C. It’s a lot messier if you’re doing complex things and bringing complex data structures back to Perl, or having Perl make a lot of intermediate decisions and doing a lot of round trips. So if you use Perl to get the data ready to pass in, pass it in, do all the work in C that you need the library for, then pass the results back to Perl once, it’s not terrible.
I’ve tried not to get into XS, so I couldn’t really compare.
Using Inline::C with your own C code in places where Perl is too much overhead is certainly easier than wrapping a complex existing library with a lot of interactions across the boundary.
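For anyone who hasn't seen it, the pattern described above looks roughly like this (a made-up sum_squares() function, purely illustrative, assuming Inline::C and a C compiler are installed):

```perl
use strict;
use warnings;

# Inline::C compiles this block on first run and binds sum_squares()
# as an ordinary Perl sub; basic types (int, double, char*) are
# converted automatically, so there's no XS glue to write.
use Inline C => <<'END_C';
double sum_squares(int n) {
    int i;
    double total = 0.0;
    for (i = 1; i <= n; i++) {
        total += (double)i * i;
    }
    return total;
}
END_C

# Prepare the input in Perl, make one call across the boundary,
# and get a plain scalar back.
print sum_squares(1000), "\n";
```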
FFI::Platypus or something like it really is the way of the future, though, for existing libraries in other languages that use the C calling convention.
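And for comparison, here's roughly what the no-compiler route looks like with FFI::Platypus (the classic libc puts() binding; a sketch, not production code):

```perl
use strict;
use warnings;
use FFI::Platypus 2.00;

# No XS and no C compiler: declare the signature and attach the
# symbol from libc / the current process as a Perl sub.
my $ffi = FFI::Platypus->new( api => 2 );
$ffi->lib(undef);                              # search libc / current process
$ffi->attach( puts => ['string'] => 'int' );

puts("hello from C, via Platypus");
```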
As a very long-time Perl developer and FOSS contributor, I think this blog post is incorrect about whether Perl 6/Raku was a factor in Perl's decline. I think Perl 6/Raku did a few things that hurt Perl 5:
1. It pulled away folks who would otherwise have spent time improving Perl 5 (either the core or via modules).
2. It discouraged significant changes to the Perl 5 language, since many people figured that it wasn't worth it with Perl 6 just around the corner.
3. It confused CTO/VP Eng types, some of whom thought that they shouldn't invest in Perl 5, since Perl 6 was coming soon. I've heard multiple people in the Perl community discuss hearing this directly from execs.
Of course, hindsight is 20/20 and all that.
Also, even if Perl 6 had never happened the way it did and instead we'd just had smaller evolutions of the language in major versions, I think usage would still have shrunk over time.
A lot of people just dislike Perl's weird syntax and behavior. Many of those people were in a position to teach undergrads, and they chose to use Python and Java.
And other languages have improved a lot or been created in the past 20+ years. Java has gotten way better, as has Python. JavaScript went from "terrible browser-only language" to "much less terrible run-anywhere language" with a huge ecosystem. And Go came along and provided an aggressively mediocre but very usable strongly typed language with super-fast builds and easy deploys.
Edit: Also PHP was a huge factor in displacing Perl for the quick and dirty web app on hosted services. It was super easy to deploy and ran way faster than Perl without mod_perl. Using mod_perl generally wasn't possible on shared hosting, which was very common back in the days before everyone got their own VM.
All of those things would still have eaten some of Perl's lunch.
I think this is mostly the correct take. Perl's strength was that it was really good at quick and dirty one-offs, especially with text manipulation. This made it particularly popular with UNIX sysadmins and sometimes network admins. This was helped by the fact that CPAN made it easy to share a lot of these, which added to its popularity (it can't be overstated how revolutionary CPAN was).
The 1980s/1990s were full of many different data formats in a time before XML/JSON, often from long-dead companies. Many a tech person was in a situation of "Oh fuck, how do I get this data out of some obscure database from some dead company from Boston that only ran on SCO UNIX into SAP/Oracle/etc?", only to find somebody else had already done it and made a CPAN module.
But stories like that became less common as DBs converged into a few players.
Yeah, I'll say: even to this day, when I need a quick script and bash just isn't doing it for me, I'll just write it in the same Perl I used regularly as far back as 1998... but it never goes further than that. If I have to take that script and build it into anything that will ever leave my laptop, it's going to get rewritten in something else (probably golang, since it is universal on my team).
I would also say -- in the late 90s, Perl's claim to fame was that it had CPAN. At the time, CPAN was revolutionary: a big, centralized repo of open-source libraries, which you could install with a single command.
Now, of course, that's a common and maybe even expected thing for a language to have: Python has PyPI, JavaScript has npm, etc.
And the whole culture around CPAN, too, with the likes of Module::Build and Test::Harness and the strong expectations around POD documents. Nothing like that existed for the other scripting languages of the time.
There was a well-trodden path from writing a hacky one-off script to deal with a specific task, to realising "hey! this might be useful for others too!" and trying to make it a bit more generic, to checking in with your local Perl Mongers for advice, to turning it into a well-tested, well-documented CPAN module.
That was the route I followed as an early-career sysadmin in the dying days of the dotcom boom - it helped me take on much more of an "engineering" mindset, and was an important foundation for my later career.
I can't have written more than a few dozen lines of Perl in the last 15 years, but I do owe that community and culture a lot.
I love tcl. My absolute favorite thing about it is that `man tcl` [1] gives a dozen paragraphs that completely describe the language itself. Its simplicity always astounded me: it seems really simplistic, but at some level it's just as malleable as lisp. I wish it had caught on more (outside the hardware community, which seems to have fully embraced it).
I was a Perl programmer from the early 1990s until into the 2000s and I mostly agree with you. It was a variety of factors.
The point where I disagree is I think Perl 6/Raku played a significant role in Perl's decline. It really gave me the perception that they were rudderless and that Perl probably had no future.
Other than that, I absolutely loved Perl. I love the language. It's super expressive. I never took a liking to CPAN. And I wonder if it could make a comeback given better dependency management.
I think Perl with tooling similar to uv would cause me to switch back today.
> The point where I disagree is I think Perl 6/Raku played a significant role in Perl's decline. It really gave me the perception that they were rudderless and that Perl probably had no future.
I assume you disagree with the blog post, not with my comment, since this is exactly what my comment says too!
Between perlbrew and cpanm it's pretty easy to have multiple perl installs with different versions and quite simple to manage your dependencies.
Carton (manage and bundle your perl modules based on lock files) and Pinto (easily run your own private CPAN) provided the icing on the cake that made things really powerful.
I miss working in Perl, but the job market has pulled me in other directions.
This, combined with a cpanfile, is how I rescued someone else's workshop from being an "install these missing dependencies" session: we were back on track in 3 minutes with "here's this file, run 'cpanm --installdeps --notest .'"
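For anyone who hasn't seen one, a cpanfile is just a tiny Perl DSL that declares your dependencies; `cpanm --installdeps .` reads it and installs whatever is missing (module names below are only examples):

```perl
# cpanfile -- lives in the project root next to your code
requires 'Plack';                  # any version
requires 'DBI', '>= 1.640';        # minimum version

on 'test' => sub {
    requires 'Test::More', '0.98'; # only needed for the test suite
};
```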
You mention it last, but I think PHP was most of it. PHP was the first and best-integrated technology for this world, which was huge and impactful. The "center of thought" in the text processing problem area moved hard to web development. And so the ideas about what needs to be improved or changed rapidly centered themselves around "Things PHP Did Badly". And that begat Ruby and Node, not a fixed-up Perl.
Perl remained (and remains!) eminently useful in its original domain of Unix system automation glue and ad-hoc text analysis. But it was denied a path to the future by PHP, and by the time PHP was itself replaced it was too late.
Finally everyone else (python in particular) sorta caught up to the "clever systems glue" feature set, and the world moved on entirely. Perl is mostly forgotten now except by those of us who lived it.
PERL tripped over its own feet (too clever, line-noise, unmaintainable).
Java(TM) being "guaranteed to be business-like" sucked the serious use cases away.
PHP was easier to grok, had "editable man pages" (ie: comment forum attached to each built-in), and didn't have "slow CGI overhead" or "FastCGI complexity".
Python was waaaay easier to read/write/maintain, and was a serious alternative (except for trickier process-control integration, you couldn't just "$XYZ = `ls -al`" like you could in perl).
...and then "PERL6 will be gloriously filled with rainbows, butterflies, will be backwards incompatible, and will be released Any Day Now(tm)" sucked all the oxygen out of investing in perl as a developer.
By the time nodejs became another contender for server-side languages, there was no place for PERL, as it had effectively become kind of a COBOL for unix systems: don't touch it if you can avoid it, and it requires expensive, difficult-to-find specialists to maintain it if you need to.
I started programming in the late 90s using Perl as my first "real" language. When I first saw some Python code (2.3 if memory serves), I found it much easier to understand. Not necessarily easier to write, but diving into a Python codebase 6 months later was a lot easier than doing the same with Perl. Purely subjective, but I was far from the only one, even if quite a few people at the time preferred Perl.
Later, when the Web really took off, getting PHP 4 running was much easier than getting going with either Python or Perl, so a lot of us went with that.
And yeah afterwards JS with node.js in particular. The millions pumped into it by Google certainly helped.
There was also a time a bit later with Ruby getting really popular. Not just for new projects, I heard of some codebases migrating from Perl to Ruby, when the writing was on the wall for Perl 5.
Go, I think when it became popular not many people were still using Perl. Also not really something you would use to bang out a quick script with. Probably more of a competitor to Java.
As for me, back to Python as my bread and butter. Probably haven't touched Perl in 20 years at this point. Looks like gibberish to me now...
> Also, even if Perl 6 had never happened the way it did and instead we'd just had smaller evolutions of the language in major versions, I think usage would still have shrunk over time.
Maybe. I mean the whole point of 6 was to modernize perl.
Perl needed efforts like 6 to happen, but it needed them delivered in smaller chunks over the years rather than as a big decade long "And now 6 is here".
Java learned this lesson after Java 8 and 9, which took multi-year efforts to deliver 1 or 2 big changes to the language and the JVM. Now Java has multiple efforts in flight which have trickled in over the years (tickling me as a dev). Every 6 month release is a little better, which makes the multi-year efforts seem all that much more worth it when they land.
My impression was experienced Perl programmers took pride in making the smallest code possible, all in one line.
At my company, they really locked in the project being dead once the original contributors left.
Perl popularized regex (JavaScript regex is based off of it), so I get the impression Perl practitioners tried to make all their code as regex-y as possible, as a cultural thing.
There was a regular feature in the Perl community for 'golf' or 'crazy one-liners', but almost no one used that shit in any actual code that left their user directory.
I remember the Perl art competitions, where you would write Perl code that would run and actually do something, and would be ASCII art at the same time. Obviously, lots of camels ;-)
Not to mention that trying to understand existing Perl by asking the community 'what does this do' or 'how does this work' often (in my experience) resulted in 'RTFM' or 'man perldoc' rather than someone taking any time to actually answer the question, whereas other communities were more welcoming and helpful to each other. That made it difficult to learn Perl through other people's code compared to other languages.
I think that depended on where you were looking and how you were asking.
My main source of support back when I did much Perl (late 90s, early 00s) was Usenet, and while some groups were very snobby and elitist, others were very helpful and encouraging for a young budding programmer.
True. Adding to these: Perl 6/Raku suffered from the second-system effect described in The Mythical Man-Month. Larry and company were overconfident after their previous success.
> And Go came along and provided an aggressively mediocre but very usable
See, that's one of the things lots of people who enjoy Perl and/or Ruby in the comments in this thread don't quite grasp: some languages require programmers possessing a somewhat special state of mind to read and write productively, and some languages allow pretty much every mediocre programmer to read and write, and still produce a manageable program.
The other thing is the information density. In my experience, most people after graduating high school have experience with reading mainstream fiction and school textbooks, and those things you can half-skim/half-read and still get most of the meaning. Then those people hit the college-/university-level textbooks and screech and crash, because those books you have to read sentence by sentence; not everyone can get used to it (or even realize they have to do that in the first place). And similar observations hold for programming languages: Perl and APL are just way too dense compared to Go and Python; if you're used to reading code line-by-line (skimming over half of them), then it's really bloody annoying to switch to reading sigil-by-sigil (as for writing, well, we all know that typing speed was never really a bottleneck for programmers).
My main complaints about Go are not that it needs more obscure syntax. The biggest problem with Go is basically that the core of the language's syntax is special, and only accessible to the compiler. This goes hand in hand with the language not offering generics out of the gate. This means that things like slices, maps, channels, etc. are all special. You cannot implement anything similar that uses the same syntax (even now that generics exist).
This lack of flexibility means that it's impossible to experiment with replacements for built-ins, and the lack of generics out of the gate meant so many things were simply impossible (like useful iterators).
Compare this to Rust, where almost everything like this is just a trait. If you want to offer a map replacement, you just implement the Index and IndexMut traits.
Overall, I don't think Perl is the best language design. It has some interesting ideas. Go is _also_ not the best language design. Is Rust the _best_? No, but it's better than both Perl and Go, IMO.
Many of us see that as an important feature, and a smaller set of people aren't too happy with the generics introduction for example. Or the recent iterators stuff they have added.
It makes codebases touched by a lot of people an absolute breeze to understand. There's no clever generic data structure/soup of traits/clever conditional types I have to understand from scratch. Everything is the same old boring maps and slices and for loops, nested sometimes. And functions mostly always return 2 values. There is no map/filter/anything. The name of a function usually betrays the types. "getUserPreferencesBatch" is most likely a `func(userIDs []UserID) (map[UserID]Preferences, error)`. There's <1% chance it is anything else. People also tend to avoid relatively complicated data structures like trees unless they are really, really necessary. So you get boring, completely predictable, 70% optimal code.
Even when discussing implementation, people think in simple terms which leads to really simple implementations in the end across all team members. It basically makes drumming KISS into everyone really easy.
Now some people go all clean code or make functional wrappers and such, and that destroys all that's good in go.
But I don't want to read your cute custom hashmap code. Go really shot for the stars with its primary design goal of uniformity. The hashmap you're using is the same one I'm using, is the same one any random Go repo on Github I open uses. The value of this is immense—I instantly feel right at home, I know the exact semantics you're using—we're speaking the same language.
To wit, I would argue that Go didn't go far enough in restricting the user, and certainly did not pick the right features to include. It's not clear at all to me that, had Go shipped with sum types (enabling better error handling), iterators, more built-in generic data structures, and higher-level abstractions around its concurrency, but no generics at all, we wouldn't have ended up with a far better language. A more restricted one, with even less room for anything custom, but a better one.
Perl was pretty much first in the wave of interpreted languages from the late '80s and '90s. It set the bar for what to expect from such ecosystems.
But being the first meant it got some oddities and the abstractions are not quite right imho.
It's a bit too shell-esque, especially for argument passing, and the abstractions are a bit too leaky regarding memory management (reference management feels too C-esque for an interpreted language, and the whole $ % @ & dance is really confusing for an occasional and bad Perl dev like me). The "10 ways to do it" also hurts it: it led to a lack of consistency and almost per-developer coding styles. The meme was that Perl is a "write-only language".
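To make that "dance" concrete for anyone who has forgotten (just an illustration, not anyone's real code):

```perl
use strict;
use warnings;

my @hits  = (3, 1, 4);       # an array uses @ ...
print $hits[0];              # ... but a single element uses $
my %count = (perl => 1);     # a hash uses % ...
print $count{perl};          # ... and a single value uses $ again

my $ref = \@hits;            # references are scalars, whatever they point at
print ${$ref}[1];            # old-style dereference
print $ref->[1];             # arrow-style dereference of the same element
print scalar @hits;          # context decides: here @hits means "length 3"
```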
But I would still be grateful for what it brought and how influential it was (I joke from time to time that Ruby is kind of the "true" Perl 6; it even has flip-flops!).
In truth, these days, I feel the whole "interpreted languages" class is on the decline, at least on the server. There are a lot of really great native languages that have come up within the last few years, enabled in large part by LLVM. And this trend doesn't seem over yet.
Languages like Rust, Swift, Go, Zig or Odin are making the value proposition of interpreted languages (lower perf but faster iterations) less compelling by being convenient enough while retaining performance. In short, we can now "have the cake and eat it too".
But the millions of lines in production are also not going anywhere anytime soon. I bet even Perl will still be around somewhere (distro tooling, glue scripts, build infra, etc...) when I retire.
Anyway, thank you Perl, thank you Larry Wall, love your quotes.
I wouldn't call making the right decision confused. It clarified and justified their desire not to use Perl. Only a confused CTO/VP Eng type would choose to use Perl in spite of all its entrenched disadvantages and much better more popular alternatives.
I was just saying that they were confused about what was going on with Perl because of the Perl 6 effort. I don't think that they were confused or wrong in their decision making.
FWIW, I don't think he wanted to hide his identity. He talked about just not wanting patients to google his full name and find his blog, as opposed to preventing people who read his blog from finding out his name.
I can think of two instances in the last 10 years where I've stayed at a hotel and had my room become disgustingly full of pot smell from someone else in the hotel. So yes, people smoke in hotels.
That said, I haven't smelled _cigarette_ smoke in a hotel in recent memory.
Removing a body part is also an effective way to get a pass code or pattern. If someone is willing to do you great harm to unlock your phone, I don't think it matters much what locking method you use.
The jurisdictions where a cop will forcibly put your finger to the fingerprint reader, but won't beat the keys out of you, cover most of the first world and parts of the second.
And if you are not in one of those countries, just politely ask for a blank white sheet of paper to sign and let them fill in the rest. You will save everyone some time.
I would not necessarily trust that. Had an army coworker who regularly beat information out with rubber hoses where he was stationed. Sort of the evil version of don’t ask don’t tell.
People without safety and security awareness often underestimate a) time, b) the necessary criminal energy, c) opportunity, and d) logistics (e.g. seclusion, tools).
But when one has technology that works for the attacker by conveniently eliminating the mentioned problems almost completely, then your set of security features is just a pathetic lie... as well as a self-delusion.
Which, to be honest, a lot of safety and security measures and technology are to most people. ;)
I'm working at a mid-size company, c. 5,500 employees. We have unlimited vacation (for US employees). I have taken about 6 weeks every year with no issues. I made sure to check that this was okay before accepting the offer, because unlimited can also mean "no one ever takes vacation".
The problem with unlimited vacation is that the majority of engineers never have the guts to ask for it, just like they don't know how to negotiate. Unlimited vacation is a benefit for the company not the employee. It avoids taxes and complications due to carryover constraints.
Unlimited vacation is in no way the same thing as 6 weeks of actual vacation. Look into it a bit more. You'll see that unlimited vacation is really just a scam. In your case it's worked out. Actual vacation days are tracked and part of your salary. For example, if you leave, you are entitled to the wages of your remaining untaken vacation days. Unlimited vacation is a favor that your manager is granting. The company is not obligated to grant your request. You have no legal right to take the days off as you do with real vacation.
I do understand the difference, thanks. I think the degree to which it's a scam depends on the employer. Some say they have unlimited vacation and then the work culture discourages using it. Fortunately, that's not the case where I work.
The other benefit of not-unlimited is having it paid out when you leave. But if I had 6 weeks a year PTO, I'd just use it all anyway, so the part where it gets paid out isn't a big deal to me.