qingcharles's comments

One of the PC games that worked great on the sorta-PC 186 RM Nimbus which a lot of British schools had in the 80s and 90s.

I'm thinking about how much money Anthropic etc are making from intelligence services who are running Opus 4.6 on ultra high settings 24 hours a day to find these kinds of exploits and take advantage of them before others do.

Expensive for me and you, but peanuts for a nation state.


NASA confirmed it is Venus, yeah.

I know some MAGAs. I promise you they believe it 100%. They often talk of ice walls, and one asked me if the Artemis mission would "break through the firmament".

There is a huge side of TikTok and Reels, dedicated to insane conspiracy theories, that most of us here would never find on our feeds; it constitutes a large share of the media that MAGAs etc. consume.


If you think MAGA followers believe in flat-earth, you've deluded yourself badly.

I remember watching grainy B+W ones in the 80s via a dish. It might have been a slow scan signal back then? It blew my mind watching the Earth live as a disk, seeing the weather in realtime.

Trying to figure out where you can read it.

Vol I here: https://babel.hathitrust.org/cgi/pt?id=uc1.31822039258330&se...

Vol II here: https://babel.hathitrust.org/cgi/pt?id=uc1.31822039258355&se...

This is monetization of the commons: charging $1000 for a printout of something that's public domain.


This was a really neat set of views, thank you.

Just switched from 3.1 Flash Lite to Gemma-4 31B on the AI Studio API, since there is a generous 1500/day quota on non-billed projects. It's doing fantastic.

I was asked by someone recently to try to set up an OpenClaw that would search for ordinances and other land registry information for all 3000+ counties/parishes in the USA to obtain and distill specific details on their support for building tiny homes.

What is OpenClaw doing here that Claude Desktop or Claude Code couldn't do?

Claude Desktop and Code are built for synchronous, human-in-the-loop interactions. For scraping 3000 janky municipal websites, you need a "fire-and-forget" background worker. Claw lets you kick off a massive job and just get a ping when it's done.

I'd also instantly hit Claude Desktop's rate limits with this, I reckon. Since Claw uses APIs, you bypass those limits and can route the messy scraping to cheap models, saving the expensive ones for the actual analysis. It also handles Playwright integration and state persistence out of the box, so a crash doesn't wipe your progress.
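I don't know how Claw implements state persistence internally, but the idea itself is simple. A minimal Python sketch (filename and helper names are made up for illustration) of a checkpoint file that makes a long scraping job resumable:

```python
import json
from pathlib import Path

CHECKPOINT = Path("checkpoint.json")  # hypothetical location

def load_done() -> set:
    """Return the set of county IDs already scraped, if any."""
    if CHECKPOINT.exists():
        return set(json.loads(CHECKPOINT.read_text()))
    return set()

def mark_done(done: set, county_id: str) -> None:
    """Record progress after every county, so a crash loses at most one."""
    done.add(county_id)
    CHECKPOINT.write_text(json.dumps(sorted(done)))

def run(counties, scrape):
    """Iterate the work list, skipping anything finished before a restart."""
    done = load_done()
    for county in counties:
        if county in done:
            continue  # already scraped on a previous run
        scrape(county)
        mark_done(done, county)
```

On restart, `run` picks up exactly where the crash left it, which is the whole point of the fire-and-forget pattern for a 3000-site job.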

If I'm wrong, I'm open to learning. I'm as new to this as everyone :)


I would first automate everything with scripts, and only use an agent for the parts that require it.

For example, you mentioned Playwright? That can be automated; it doesn't need to be a free-form tool that the agent uses at will.
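To illustrate: once the agent has figured out a site's structure one time, the extraction step can be frozen into a plain deterministic parser. A stdlib-only sketch (the `<h1>` target is a made-up assumption; real county sites will each need their own selector):

```python
from html.parser import HTMLParser

class TitleGrabber(HTMLParser):
    """Deterministically pull <h1> text out of a fetched page -- no model call."""
    def __init__(self):
        super().__init__()
        self._in_h1 = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self._in_h1 = True

    def handle_endtag(self, tag):
        if tag == "h1":
            self._in_h1 = False

    def handle_data(self, data):
        if self._in_h1:
            self.titles.append(data.strip())

def extract_titles(html: str) -> list:
    """Run the parser over one page and return the headings it found."""
    parser = TitleGrabber()
    parser.feed(html)
    return parser.titles
```

A script like this runs for free on every page, and the agent only gets involved again when the site changes and the script stops matching.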

If that means the scripts need to be adapted to changes, then that's a separate, controlled workflow.

This approach can save you a ton of tokens, increase reliability and observability, and save compute as well.

Sometimes it's useful to let the agent run fully agentically, so you can then iteratively extract the deterministic parts.


I can't even get into my own damned account with the username, password and recovery email.
