Even before LLMs became big I started hoarding solid technical books, as there was so much misinformation on Google/SO that any non-trivial technical question could not be answered without a high probability that the answer was fundamentally wrong.
LLMs are super helpful for learning, but without the foundation of a true textbook at your side they will very easily go off the rails into a world of imagination.
As an avid reader (and sometimes writer) of technical books, it's sad to see the, perhaps inevitable, decline of the space. I still remember in the early 2000s Barnes and Noble would still have massive shelf space devoted to every technical topic you could imagine. I could spend hours just exploring languages and topics I didn't even know existed. Powell's Technical Books used to be an entire separate store filled with books on every technical topic imaginable.
The publishing industry veterans I've worked with told me it was even more incredible during the height of the dotcom boom: book sales in the 100,000-copy range were not that rare.
Today I can only think of two truly technical book stores that still exist: The MIT Press Bookstore in Cambridge, MA and Ada Books in Seattle, WA. The latter, while a delightful store, has relegated the true technical book section to the backroom, which unfortunately doesn't seem to get refreshed too often (though, part of the beauty of this is it still has many of the weird old technical books that used to be everywhere).
Same with the Stanford university bookstore. It was one of the better bookstores in the Bay Area, and used to have a whole room of technical, science, and math books. It too is a shadow of its former self. So sad.
> told me it was even more incredible during the height of the dotcom boom
I was a developer in the 90s before Netscape even came out. I didn't have a computer at home and dialup barely existed. If you wanted to do computer stuff you had to read. If you wanted to try a library you had to buy a CD from a bookstore or mail in an order which would get posted to you.
I too am an avid reader and was visiting five local bookstores on a weekly basis. Several of them had huge areas stocked with tech books. I had tried Amazon maybe six months after it launched and bought there sporadically. But almost any book I sought was available locally, and the savings weren't worth giving up the convenience of purchasing locally.
Then in a three-month period in late spring 2000 all the programming books disappeared. My choice became Amazon with quick delivery versus the local store with slower delivery and a higher price. So I've been buying from Amazon ever since, and I can't remember the last time I visited a bookstore.
The UW bookstore in Seattle, like those at many big science schools, had a wondrous technical book section. Aisles of Springer. The bookstore itself is a shadow of its former shelf.
My own college experience heavily soured me on both book stores and especially school run book stores. The markup was obscene and their buy back rates were worse.
Half Price Books and a few other bookstores lulled me back a few times, but nonfiction books are kept around mostly as eye candy at this point.
All the US universities outsourced their bookstores.
Now I can't even walk in and browse what books the various departments are using for classes anymore. Everything is behind bars and completely inaccessible.
Weird, I have honestly never walked into a Barnes and Noble and been satisfied with any of the technical content on the shelf. That pleasure died when we lost Borders.
Yeah, peak experience for me was when our town had both a Borders and B&N offering huge tech book sections. Then Borders closed. Then B&N became a toy store.
But a lot of it is also in blogs and (video) tutorials. As well as Stack Overflow.
And all very searchable.
The old brick-of-paper approach to tech manuals just isn't a thing any more. I don't particularly miss it.
It was, if you think about it, usually a slow and inefficient way to present information - often better at presenting what was possible than how to make it happen.
> often better at presenting what was possible than how to make it happen.
That, I feel, is the chilling aspect of this situation. Does the lack of new books explaining what's possible imply that our society's opportunities for growth are dwindling?
> Unfortunately, I cannot read technical books fast and definitely not fast enough to make the subscription be worth $500 per year.
For me the $500 is a pretty clear win as far as value goes. My shelves are already overflowing with technical books that, while not "timeless", age much more slowly. But quite often, throughout a year, I'll want a deeper dive into a current topic than I can get from online resources + Claude. Quite often that dive involves looking through multiple books (even if only using a few chapters).
I know I'm a dying breed, but, while I love AI for interactive exploration and learning, I find books more valuable in the era of endless YouTube tutorials and AI slop blog posts. Technical topics benefit from "big picture" thinking that basically doesn't exist in modern short-form web content.
Books still activate a different part of the brain than reading on a screen, including e-ink, so it's not just you or a dying breed; people may turn out not to learn as deeply or as quickly from screens.
> who are actually working with it only become more bullish
I have a feeling the word "actually" is doing a lot of work here. I shipped AI-facing user products a few years ago, then worked in more research-focused AI work for a while (spending a lot of time with the internals of these models). Then, seeing where this was all headed (hype was more important than real work), I decided to go back to good ol' statistical modeling.
Needless to say, while I think AI is absolutely useful, I'm bearish on the industry because current promises and expectations are completely out of touch with reality.
But I have a feeling because I'm not currently deploying a fleet of what people are calling "agents" (real agents are still quite cool imho), you would describe me as not "actually" using AI.
China is focusing heavily on AI applications. They have basically decided already to deal with their coming demographic bust with robots/AI rather than immigration. It's not even about military applications; the US is just afraid that China will shoot so far ahead economically that the US won't have any leverage over it in the future at all.
There's a lot of nonsense that comes out on both sides of the aisle. I wish there was a solid single source of truth to figure out what's really going on in China and what's really going on behind the scenes in the U.S.
Some talk about how China has some strategic issues, such as do they have a reliable supply of food and energy? (Zeihan etc.)
I guess the energy portion is being solved with renewables. And I guess if they solve the issue of demographic collapse with robots and AI, that's something.
But really, if there are fewer people and they're getting older, what's the point? What are they really working towards?
This question is also becoming a problem post-Trump immigration ban in the U.S.
Who knows what the U.S.'s demographics are going to look like now?
Trump inherited a U.S. with some of the best demographics of all nations on the planet, especially in the West. And he managed to throw that in the garbage.
> I wish there was a solid single source of truth to figure out what's really going on in China
What kind of sources are you looking for? The Five Year Plans are the best source of truth for what they are planning on doing nationwide. The annual Statistical Communiqué on National Economic and Social Development and China Statistical Yearbook from the NBS contain statistics on how that implementation is going. Then every year the NDRC delivers the Report on the Implementation of the Plan for National Economic and Social Development and on the Draft Plan to the National People’s Congress which packages up the statistics on how the plan is progressing.
They’re the most reliable source we’re going to get without being party insiders. There’s still Soviet-style inflation of figures to meet quotas but China has been cracking down on that for the last few decades because they want accurate data for the five year plans. I think it’s more of a problem with outer provinces, less so for the major manufacturing hubs.
Alternative sources to verify against are a bit harder to find without knowing the language (lots of the NDRC and NBS stats are available in English).
Yes, people also compare some of these statistics with export/import data and with data from other countries on the other side of these transactions, and the numbers match.
You could just go over there and live for a few years; you can be your own source. But yes, they have energy; no, they don't have oil; yes, they have lots of agricultural land; no, they messed up some of their environment and that will take time to heal; yes, they are working on it.
> But really, if there's less people and they're getting older, what's the point? What are they really working towards?
China wants to be a rich country even if their population stabilizes at only 900 million people or so. Mostly they want to avoid the middle-income trap, which would have been a problem regardless of their demographics falling off a cliff. Automation is the best way around it, and they have enough tech, production know-how and capacity, and smart people to pull that off.
China is going to continue doing what is best for it, and they haven't gone stupid like the USA has. Embracing AI for productive uses rather than just fixating on the slop produced is one place where they are racing past the west.
> There's a lot of nonsense that comes out on both sides of the aisle. I wish there was a solid single source of truth to figure out what's really going on in China and what's really going on behind the scenes in the U.S.
I've always assumed that there is such a source of truth, but that I had never heard of it, wouldn't have access to it, and couldn't afford it if I did.
Reading a few tweets from Musk was all it took to correct that misapprehension. It's increasingly clear that nobody at any level of play knows jack shit about anything.
> There's a lot of nonsense that comes out on both sides of the aisle. I wish there was a solid single source of truth to figure out what's really going on in China and what's really going on behind the scenes in the U.S.
Isn't this simply the answer?
That what's going on is gaslighting of the public, and that there are people behind the scenes who don't want hoi polloi to know what they're up to?
This geopolitics (or politics) talk is 'intellectual' men's astrology.
When a woman asks me my astrological sign, I know she's a deeply unserious person. When a man says 'do they have a reliable supply of food and energy'...
I know this is mostly paranoid thinking on my part, but it almost feels like there is a conscious effort to destroy "personal" computing.
I've been a huge advocate for local, open, generative AI as the best resistance to massive take-over by large corporations controlling all of this content creation. But even as it is (or "was" I should say), running decent models at home is prohibitively expensive for most people.
Micron has already decided to just eliminate the Crucial brand (as mentioned in the post). It feels like if this continues, once our nice home PCs start to break, we won't be able to repair them.
The extreme version of this is that even dumb terminals (which still require some RAM) will be as expensive as laptops are today. In this world, our entire computing experience is connecting a dumb terminal to a ChatGPT interface where the only way we can interact with anything is through "agents" and prompts.
In this world, OpenAI is not overvalued, and there is no bubble because the large LLM companies become computing.
But again, I think this is mostly dystopian sci-fi... though it does sit a bit too close to the realm of the possible for my tastes.
My kids use personal computing devices for school, but their primary platform (just like their friends) is locked-down phones. Combining that usage pattern with business incentives to lock users into walled gardens, I kind of worry we are backing into the destruction of personal computing.
Wouldn't the easy answer to this be increased efficiency of RAM usage?
RAM being plentiful and cheap led to a lot of software development being very RAM-unaware, with the inefficiencies of programs mostly hidden from the user.
If RAM prices continue rising, the semi-apocalyptic consumer fiction you've spun here would require that developers not change their behavior in the software they write.
There will be an equilibrium in the market that still allows the entry of consumer PCs; it will just mean the devices people buy will have less RAM than is typical. Demand will eventually match the change in supply, as is typical of supply/demand issues, rather than rising continuously toward an infinite horizon.
I believe that while centralized computing excels at specific tasks like consumer storage, it cannot compete with the unmatched diversity and unique intrinsic benefits of personal computing. Kindle cannot replace all e-readers. Even Apple's closed ecosystem does not permit replacing macOS with iPadOS. These are not preferences but constraints of reality.
The goal shouldn’t be to eliminate one side or the other, but to bridge the gap separating them. Let vscode.dev handle the most common cases, but preserve vscode.exe for the uncommon yet critical ones.
That "dumb terminals" still need to run a modern web browser (likely Chrome) on a modern OS (likely Windows), these aren't exactly efficient with the available computing sources, so you could give up a lot of resources before you would actually trade it of for computing ability. Also resources have been risen exponentially for the last decades.
This is exactly the kind of thing you would expect to happen and to feel in an insane unsustainable bubble.
I am much more worried, looking at these ridiculous prices on Newegg, that memory will be dirt cheap 3 years from now because the economy has imploded from this mass stupidity.
I was blown away by Gemini 3 at first, but now, from using it, I have run into all the dumb things it does because it is a large language model.
What I notice getting shorter is the time between a frontier model making me feel I will have no job prospects in the future and the model reminding me that LLMs are fundamentally flawed.
It is because I want to believe in AGI. I love the holy shit moment of a new model, it is so exciting. I don't want to face the reality that we have made an enormous mistake. I want to believe OpenAI will take over computing because the alternative of some kind of Great AI winter bubble burst would be such a horrible experience to go through.
That's not disproving OP's comment; OpenAI is, in my opinion, making it untenable for a regular Joe to build a PC capable of running a local LLM. It's an attack on all our wallets.
Why do you need an LLM running locally so badly that the inflated RAM prices are an attack on your wallet? One can always opt not to play this losing game.
I remember when the crypto miners rented a plane to deliver their precious GPUs.
Some models are useful; whisper.cpp comes to mind, to create subtitles for, say, family videos or a lecture you attended, without sending your data to an untrusted or unreliable company.
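For anyone curious, a minimal sketch of that workflow in Python, using the reference openai-whisper package rather than whisper.cpp itself (whisper.cpp's own CLI can write SRT directly); the file names here are hypothetical:

    # Local subtitle generation: nothing leaves the machine.
    # Assumes `pip install openai-whisper` and ffmpeg on PATH;
    # "family_video.mp4" is a hypothetical input file.
    import whisper

    def srt_time(t: float) -> str:
        # SRT timestamps look like 00:01:02,500
        ms = int(t * 1000)
        h, ms = divmod(ms, 3_600_000)
        m, ms = divmod(ms, 60_000)
        s, ms = divmod(ms, 1_000)
        return f"{h:02}:{m:02}:{s:02},{ms:03}"

    model = whisper.load_model("base")  # small enough for most home PCs
    result = model.transcribe("family_video.mp4")

    with open("family_video.srt", "w") as f:
        for i, seg in enumerate(result["segments"], start=1):
            f.write(f"{i}\n"
                    f"{srt_time(seg['start'])} --> {srt_time(seg['end'])}\n"
                    f"{seg['text'].strip()}\n\n")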
My first one- a Gateway 486/66- started with 4MB RAM (in 1993). It could run linux, X windows, emacs, and G++ in a terminal all at the same time, but paged so badly the machine was unusable during a compile. I spent $200 to double it to 8, then another $200 to double it to 16, then another $200 to double it to 32MB (over a couple years), at which point, the machine absolutely flew (no paging during compiles). It seemed like an obscene amount of money for a college student to spend, but the lesson taught me a lot about computer performance and what to upgrade.
I don’t think you need a conspiracy theory to explain this. This is simply capitalism, a system that seems less and less like the way forward. I’m not against markets, but I believe most countries need more regulation targeted at the biggest companies and richest people. We need stronger welfare states, smaller income gaps, and more democracy. But most countries seem to vote in the absolute opposite direction.
It’s not a conspiracy, it’s just typical dumb short term business decisions amplified and enabled by a cartel supply market.
If Crucial screws up by closing their consumer business, they won't feel any pain from it, because new competitors entering the space is basically impossible.
> You make an LLM decision tree, one LLM call per policy section, and aggregate the results.
I can never understand why people jump to these weird direct calls to the LLM rather than working with embeddings for classification tasks.
I have a hard time believing that
- the context text embedding
- the image vector representation
- the policy text embedding(s)
cannot be combined into a classification model that is likely several orders of magnitude faster than chaining calls to an LLM, and I wouldn't be remotely surprised to see it perform notably better on the task described.
I have used LLMs as classifiers, and it does make sense in cases of extremely limited data (though they rarely work well enough), but if you're going to be calling the LLM in such complex ways, it's better to stop thinking of this as a classic ML problem and instead think of it as an agentic content moderator.
In this case you can ignore the train/test split in favor of evals which you would create as you would for any other LLM agent workflow.
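To make the embeddings route concrete, here is a minimal sketch under stated assumptions: embed_text is a hypothetical stand-in for whatever text encoder you use (a sentence-transformer, say), image_vec comes from your image model (CLIP, etc.), and labeled_examples is your moderation data. This is a sketch, not a definitive implementation:

    # Embeddings-as-features classification: one cheap model per policy
    # section instead of one LLM call per section.
    # embed_text / labeled_examples are hypothetical stand-ins.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    def featurize(context: str, image_vec: np.ndarray, policy: str) -> np.ndarray:
        # Concatenate the three representations into one feature vector.
        return np.concatenate([embed_text(context), image_vec, embed_text(policy)])

    X = np.stack([featurize(c, iv, p) for (c, iv, p), _ in labeled_examples])
    y = np.array([label for _, label in labeled_examples])  # 1 = violation

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))

Inference is then one embedding pass plus a cheap linear model per policy section, versus a chained LLM call per section.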
> Just thinking maybe we're not seeing the end of software engineering for those of us already in it—but the door might be closing for anyone trying to come up behind us.
It's worth considering how aggressively open the door has been for the last decade. Each new generation of engineers increasingly disappointed me with how much more motivated they were by a big paycheck than by anything remotely related to engineering. There's nothing wrong with choosing a career for money, but there's also nothing wrong with missing a time when most people chose it because they were interested in it.
However I have noticed a shift: while half the juniors I work with are just churning out AI slop, the other half are really interested in the craft of software engineering and understanding computer science better.
We'll need new senior engineers in a few years, and I suspect they will come from a smaller pool of truly engaged juniors today.
This is what I see. Less the door slamming completely shut, more that the door was enormous and maybe a little too open. We forget that the six-month-coding-bootcamp-to-six-figure-salary pipeline was a real thing for a while at the ZIRP apex.
There are still junior engineers out there who have experiments on their GitHubs, who build weird little things because they can. Those people were the best engineers anyway. The last decade of "money falls from the sky and anyone can learn to code" brought in a bunch of people who were interested in it for the money, and those people were hard to work with anyway. I'd lump the side-hustle "ship 30 projects in 30 days" crowd in here too. I think AI will effectively eliminate junior engineers in the second camp, but absolutely will not eliminate those in the first. It will certainly make things harder for the junior engineers at the margins between those two extremes.
There's nothing more discouraging than trying to guide a junior engineer who is just typing what you say into Cursor. Like clearly you don't want to absorb this, and I can also type stuff into an AI, so why are you here?
The best engineers I've worked with build things because they are truly interested in them, not because they're trying to get rich. This is true of literally all creative pursuits.
I love building software because it's extremely gratifying to a) solve puzzles and b) see things actually working when I've built them from literally nothing. I've never been great at coming up with projects to work on, but I love working on solving problems that other people are passionate about.
If software were "just" a job without any of the gratifying aspects, I wouldn't do nearly as good a job.
Heh. I have been making software for 40 years, more or less.

My last re-engineering project was mostly done when they fired me, just as the probationary period was almost over. It seems they did not want to keep me. Too expensive? And anyone can finish it, right? Well...

So I am finishing it for them, one more month, without a contract, for my own sake. Maybe they pay, maybe they don't; this is reality. But I want to see this thing working live. I have been through maybe 20-30 projects/products of that size and bigger, and only 3-4 ever flew. The rest did not, and never for technical reasons.

Then/now I'll be back to the job search. Ah. Long lists of crypto-or-adtech-or-AI dreams, mostly...

Mentoring? Juniors? I have not seen anything even faintly smelling of that for a decade...
The principle applies to a world where people work in offices doing serious long-term R&D work. The quote is entirely irrelevant to people working in open offices on projects that change direction quarterly, building features designed to make PMs look busy.