> But perhaps most importantly, debuggers are an intricate piece of the puzzle of the design of a development platform—a future I become more interested in every day, given the undeniable decay infecting modern computing devices and their software ecosystems.
I agree with this sentiment, yet I still wonder if it’s fully justified. There has never been more bad software than right now, but there has also never been more good software, no?
It’s not super relevant to the main contents of the article; just a bit that caught my attention and got me thinking.
I think there are two 'poles' here--'good' in terms of feature-rich? Sure, there's loads of it.
'good' as in performant--an area that game dev types (rightly, IMO) criticize and harp on? There's far less of it, video games aside.
Think of the perceptible slowness of many web applications you use daily, Windows 11's, well, everything UI-related, etc.
Hell, my 3-year-old iPhone can't scroll Uber Eats at 60fps consistently. Is Uber Eats 'good'? From a functionality standpoint, yeah, of course. But is displaying a list of images and text and expecting it to scroll smoothly too much to ask?
Software can be 'good' in terms of functionality offered and 'bad' at the same time, depending on your perspective.
IIRC, Mr. Fleury has a background in game dev, so his perspective is totally understandable. Modern games are remarkable feats of software.
Thinking selfishly, the absolute quantity of good/bad software isn't as important as the software you have to interact with on a day-to-day basis. Good software is invisible and under-appreciated: you use it for its purpose and move on. Bad software really sticks out.
Good point. The amount of bad software I’m forced to interact with regularly has gone up, mostly because there are so many systems cobbled together in workflows now.
It’s a good question: what exactly is this decay, and why is it called undeniable? Is that even true? If I think back on what programming was like thirty years ago up through today, everything about computing has steadily improved, gotten easier, more reliable, and higher quality, on average. All operating systems crash a lot less often than they used to. Computing devices, from desktops & laptops, to phones, to routers & NASes, to household appliances, have all become faster, better, and cheaper, with more features and higher utility.
There are some ways I could see the author being somewhat justified, especially when it comes to the need for debuggers. Software is getting more layers. The amount of it and the complexity of it is going up. Debuggers are super useful for helping me understand the libraries I use that I didn’t write, and how my own code interacts with them. There are also a lot more people writing code than there used to be, and because that number keeps growing, the distribution skews toward beginners. I feel like the number of languages in popular use is also going up, and the diversity of coding environments is increasing. I don’t know that I would frame all this as ‘decay’, but it does mean that we’re exposed to higher volumes of iffy code and abstractions over time.
Right. It's incredible that something like Linux is free. For a more recent example, look at VS Code. For an even more recent example, look at how many open-weight LLMs there are out there.
I definitely would not call VS Code good software, at least not overall. It's good in that it's not buggy, but it uses an absurdly high amount of system resources without any actual benefit. It is not OK that, just to open a handful of small text files, it uses 1-2 GB of memory.
Yeah, VS Code was one of the first examples I thought of as well. It has its own set of issues for sure, but even as a former vim fanatic I find it amazing, from both a default-experience perspective and that of a power user.
HA! Comparing VSCode to Linux is like comparing an overweight, acne-ridden drug addict that lives in his mom's basement to an astronaut with 3 PhDs. They're barely even the same species.
Regarding 3: Shouldn’t the medical system be optimizing for patient outcomes rather than the business they’re in?
Regarding the first two: I think the anecdote being from 1995 suggests there would have been time to put together said mountain of research.
I’m not agreeing that this is shameful for the original doctor, but I do think it’s shameful if avenues for potential research are not taken because it’s inconvenient for the hospitals.
They gave morning infusions because it was convenient. To get my father the evening infusion we had to hire private duty nurses to come to his apartment.
Is it even up for debate that this is definitely not their primary mission? Their market cap sits at 3.5 trillion, ranking them third behind Microsoft and Nvidia. Unlike those other two, Apple makes most of that by selling iPhones and the like to consumers.
That’s not really at odds with the goal of empowering creatives.
A significant chunk of every iPhone and iPad release consists of features specifically for creatives.
This specific site doesn’t cater to creatives and will often be full of developers’ comments bemoaning those things, but I really challenge anyone to look at any of their Mac/iOS product releases in the last decade and point out how creatives aren’t still a big component of their DNA.
Like the Nietzschean philosophy? So in the case of Trump, the idea is that his voters like him because he’s different from the “evil” aristocratic class that Trump claimed to oppose (e.g., “drain the swamp”)?
I just think that most people (on both political sides) are not really better. If they were given a position of power, they would be corrupted and incompetent too.
So in a sense you got what you deserve, and your democracy is working.
They have definitely replaced jobs! There are tons and tons of sites and applications now that have been built with these tools and otherwise would have been built by software developers using WordPress or Drupal or Rails or whatever.
But that doesn't mean the developers who would have otherwise done that work were just disemployed by the success of these tools. No, they just worked on different things.
And I think that is a valuable lesson that can be applied (though I think not perfectly) to this LLM era.
I think the more optimistic interpretation would be that companies eliminating bullshit jobs would provide signal on which jobs aren’t bullshit, and then individuals and the job prep/education systems could align to this.
That’s very optimistic! I don’t fully agree with it, but I certainly know some very intelligent people who I wish were contributing more to the world than they do as pawns in a game of corporate chess.