Hacker News | 0x000xca0xfe's comments

If you don't keep and cross-reference documents it is really easy to circumvent, e.g. by kids asking their older siblings to sign them up.

I don't think a bulletproof age verification system can be implemented on the server side without serious privacy implications. It would be quite easy to build it on the client side (child mode) but the ones pushing for these systems (usually politicians) don't seem to care about that.


Yep, it is easy to circumvent, and the silver lining of all of this is that regulators don't care. They care that these companies made an effort in guessing.

RISC-V chip designers at least seem to be more bullish on vectors. There is seriously cool stuff coming like the SpacemiT K3 with 1024-bit vectors :)

The 1024-bit RVV cores in the K3 are mostly that size to feed a matmul engine. While the vector registers are 1024-bit, the two execution units are only 256-bit wide.

The main cores in the K3 have 256-bit vectors with two 128-bit wide execution units, and two separate 128-bit wide vector load/store units.

See also: https://forum.spacemit.com/uploads/short-url/60aJ8cYNmrFWqHn...

But yes, RVV already has more diverse vector width hardware than SVE.


It's a low-clocked (2.1 GHz) dual-issue in-order core, so it's obviously nowhere near the real-world performance of e.g. Zen 5, which can retire multiple 256-bit or even 512-bit vector instructions per cycle at 5+ GHz.

But I find the RVV ISA just really fascinating. Grouping 8 1024-bit registers together gives us 8192-bit or 1-kilobyte registers! That's a tremendous amount of work that can be done using a single instruction.

Feels like the Lanz Bulldog of CPUs. Not sure how practical it will be after all, but it's certainly interesting.


I've read many, many science fiction books of that era, and they were brimming with an optimism that has vanished from contemporary books.

There used to be an underlying theme of "humanity will figure it out, even if we make mistakes".


It's not that easy to beat evolution; some will still have kids, while those who only care about the fun will die out.


No need to wait: they've already fried themselves out of the evolution game with STDs. Any child they have will likely be retarded or diseased in some way.

Don't forget to include alcohol as a drug - "fetal alcohol spectrum disorders", FASDs, are a real thing.


> We do not work in "trust me bro" territory when it comes to signing software, anymore. I am sorry/not-sorry to say. It is very important to have a chain of trust that goes up somewhere above "goldenarm @ HN".

If you so deeply believe in giving up user freedom and delegating control to authority, maybe you are in the wrong place here. Check the title of this website: "Hacker News"....


Luckily this board runs with old DDR4 sticks. If you still have some lying around, good for you.


It's part of their secret strategy to turn oldschool Windows dinosaurs into enthusiastic Linux power users. Next they'll introduce middle click pasting.


Now that GNOME wants to abandon it.


GNOME devs really are special. I wonder why.


It is not just GNOME devs. Try to interact with systemd-poettering or I-pwn-glibc-Drepper. For some reason the Red Hat centric guys are troublemakers.

More recently KDE devs also became troublemakers - first David "all must use systemd", then Nate "I-can-ask-for-donations-at-will-by-placing-a-trojan-daemon-onto-people-whose-sole-job-is-to-ask-for-donations" (more about this guy here: https://jriddell.org/2025/09/14/adios-chicos-25-years-of-kde...), and of course the "there are no xorg-server users left on KDE, so all must use Wayland" crowd.

Developers became a LOT more like dictators in the last 10 years specifically. This was a change indeed. I am not sure what happened, but things changed. GTK is now also a purely GNOMEy dev-kit. Good luck trying to convince the GTK devs of anything that used to be possible in gtk2 or gtk3 - it is now GNOME-only.


I'm pretty scared about which userland piece of software will be rewritten next, ditching backwards compatibility and making the current body of support knowledge worthless. After all, we've replaced the display server (sort of), audio, init and service management, and network commands (netplan), if not much more.

My bet would be on a rewrite of CUPS in Rust. Oh, your printer that worked for 20 years is now a useless brick? What a shame, at least now the printing subsystem is secure and blazing fast.


Not even Rust zealots want to touch printing ;)


Well, as Apple has basically abandoned CUPS, and not everyone uses the OpenPrinting fork, things in that space are getting "fun".


Please, printers have never worked, it's a side effect when they do.


> My bet would be on a rewrite of CUPS in Rust.

Please, don't give them any ideas.


For the sake of completeness, here's Nate Graham's version of events: https://pointieststick.com/2025/03/10/personal-and-professio...


*turn it from default-on to default-off


It's still a change. GNOME dictates onto users what the developers think the users should use or have. I find that not acceptable.


I once watched a co-worker completely bork a customer system by accidentally middle-clicking while moving his mouse after copying an ls -l of /usr/bin (where pretty much everything was a symlink to the real executables in /bin).

Yeah, he shouldn't have been logged in as root, but the point remains that middle-mouse paste can be extremely dangerous and fat-finger-prone.


I love Linux, but the cut and paste situation is really terrible. The middle mouse paste isn't a problem for me--it's that there are two separate "clipboard" buffers, which just causes all sorts of problems.


Having two separate clipboard buffers is a feature I intentionally use.


Yup, both have their uses. If you use a clipboard manager or have the clipboard synchronized between devices/remote desktops/VMs, the primary selection comes in handy for stuff you don't exactly want saved to disk, crossing VM boundaries, or transmitted over the network. I use middle-click pasting primarily for its separate buffer.


You and I both.



Except it's not a bug that found use. It's intentional behavior. From https://specifications.freedesktop.org/clipboard/latest/:

> The rationale for this behavior is mostly that [having a unified clipboard] has a lot of problems, namely:

> - inconsistent with Mac/Windows

> - confusingly, selecting anything overwrites the clipboard

> - not efficient with a tool such as xclipboard [(tool that maintains a history of specifically CLIPBOARD; it would be messy to keep a history of all selections)]

> - you should be able to select text, then paste the clipboard over it, but that doesn’t work if the selection and clipboard are the same

> - the Copy menu item is useless and does nothing, which is confusing

> - if you think of PRIMARY as the current selection, Cut doesn’t make any sense since the selection simultaneously disappears and becomes the current selection


The selection buffer is easier to understand if thought about more simply. Middle click to “put my selection here”.

The actual clipboard is a separate feature in my mind.


You can unify the middle mouse selection and the regular clipboard in KDE if you wish. Personally I find keeping them separate very convenient.


There are a number of DE-independent clipboard managers that can do that as well as other features, like keeping a clipboard history so you can copy in series then paste in series, or having keyboard shortcuts transform the clipboard contents by way of a command, so you can e.g. copy some multi-line text then paste it as a single line joined by spaces.


I use "autocutsel" to synchronize the cut buffer and clipboard in X. Not sure what Wayland might need to do this or if it even has a similar concept.

I love select to copy and middle-click to paste.

https://www.nongnu.org/autocutsel/
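For anyone curious, a typical setup is just a couple of lines in ~/.xsession or ~/.xinitrc (flags as I remember them from the autocutsel man page, so double-check before relying on them):

```
# keep CLIPBOARD and the cut buffer in sync, then do the same for PRIMARY
autocutsel -fork
autocutsel -selection PRIMARY -fork
```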


That problem has been solved by terminals and readline setups that await actual user input (an actual Enter from the keyboard) even when you paste a command with a line break or a multiline command. Most Linux terminals do that nowadays, and it's also great for giving you a chance to review that one-liner you copied from the browser, which could contain something different from what was shown.
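For bash users this is readline's bracketed-paste support; if your setup predates bash 5.1 (where it became the default), a one-line ~/.inputrc entry should enable it:

```
# ~/.inputrc: insert pasted text as a literal block instead of executing embedded newlines
set enable-bracketed-paste on
```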


It’s a right pain reverting that on modern desktops.


Shift+Insert has always been my preferred method of pasting into a terminal after too many mishaps with right-click or middle-click paste.


> GNOME dictates onto users what the developers think the users should use or have. I find that not acceptable.

Every operating system (or DE) does that. Hell, every piece of software does that. They're all just a bunch of opinions wrapped in a user interface.

Some may provide more opportunities to change the defaults, but those defaults still remain.


They're probably referring to GNOME's history of controversial opinions that many users don't like, such as:

- "simplifying the UI" by removing many useful features (like systray icons)

- "what makes you think sharpness is a metric?"

- claiming fractional scaling is dumb because "monitors don't have fractional pixels"

- "we know what users want" while ignoring most user feedback

- "we're not copying macOS" while blatantly doing so

- "consistency is key" while changing the entire UI paradigm every release

- "what's the use case for <insert well-known feature>?"

- intentionally obscuring how to access / in the file picker

And in general just being incredibly tone-deaf and abusive to their own users on the forums. Torvalds has been calling out their "users are idiots and are confused by functionality" stance for over 20 years now.


Yes, but the problem is the GNOME organization is headed by opinionated morons with zero clue how to design a user interface.


I rather like GNOME, which presumably also makes me a moron.

Or perhaps we're all just people with differing opinions on what constitutes a "good" user interface.


There are people who like Windows too. I also consider them morons.


This can be said about literally any software? And as GP points out, it's not "dictating what you can use or have" - you can turn it back on.


This is like, the least bad thing GNOME have ever done. Middle-click pasting makes no logical sense and only exists as a holdover from before copy-paste conventions were established. Nobody would design it this way today.


[flagged]


I would say the issue itself is almost irrelevant... I think it's mainly your dogmatism: speaking in absolutes as if you always know everything, as if there can only be one right answer and no other valid perspectives or opinions. To me it just screams low emotional intelligence and a lack of critical thinking, humility and empathy.

https://en.wikipedia.org/wiki/Splitting_(psychology)


GNOME is doing something right for a change and fixing a common source of security issues.

If you like it, just keep the behavior enabled.


Never in my life have I heard of this security issue.


defaults matter a lot!


Developers change defaults all the time and make things far worse.

Vim 9.0 default changes required a 6 line vimrc to undo the damage.


Yes, that's the primary reason I switched to Neovim instead.


They already did that by forcing "AI" into the OS.


PC mice haven't had three buttons for decades!


The third button has been "hidden" under the mouse wheel for well more than a decade; just press the wheel down and you'll hear a mouse button click.


And most Linuxes have an option for a dual click (right and left mouse buttons together) to simulate the middle mouse button.

Useful, as the wheel button is usually the first to die in cheap mice.

Not useful, because it made it impossible to play Death Stranding on Linux :(


You'll be surprised to know that there are still some mice that don't support that. Admittedly, I've only had that happen once in the last 15 years, in a budget "gamer" mouse I instantly returned and replaced with a Logitech G903 at the time (though I've switched mice twice since, and both supported it).


Ironically, Microsoft pioneered the scroll wheel.


popularized, not pioneered.



Remember Xerox PARC, the people that developed the first computer GUI?

https://archive.is/sKLL

> The three button Alto mouse enabled the first bitmapped and overlapping windows display, known as a graphical user interface (GUI). The Alto dates to March of 1973


My dude, my mouse has 5 buttons. No idea what you're talking about here.


I'm down to one. Less is more.


Is that one of those innovative designs with the charging port on the bottom of the mouse?

Sometimes more is more.


It sounds dumb but the battery lasts long and charges quickly, so I think they made the right decision.


We sometimes have those maggots (BSFL, black soldier fly larvae) naturally in our compost, and I would never eat anything made with them.

The problem is not even the animal/maggot itself but the fact that it consumes ANYTHING. Old apples, coffee grounds, house plants, dead rats, everything.

The incentive to produce them more cheaply by feeding them trash (actual trash, not mango peelings) is obvious, and just too risky. When cost is the only reason they matter anyway, why waste money on quality ingredients or good QA?


That number isn't very useful either; it really depends on the hardware. Most virtualized server CPUs that e.g. Django will actually end up running on are nowhere near the author's M4 Pro.

Last time I benchmarked a VPS it was about the performance of an Ivy Bridge generation laptop.


> Last time I benchmarked a VPS it was about the performance of an Ivy Bridge generation laptop.

I have a number of Intel N95 systems around the house for various things. I've found them to be a pretty accurate analog for small VPS instances. The N95 uses Intel E-cores, which are effectively Sandy Bridge/Ivy Bridge-class cores.

Stuff can fly on my MacBook but then drag on a small VPS instance, so validating against an N95 (which I already have) is helpful. YMMV.


After spending many hours optimizing some routines, I now think performance optimization is a great benchmark for identifying how generally smart an AI is at helping with a specific piece of code.

Solutions are quite easy to verify with differential testing and produce a number for direct comparison.

Less code is usually better and you generally can't "cheat" by adding more cruft so it nullifies the additive bias. Good optimization requires significant understanding of the underlying structures. Everything has performance tradeoffs so it requires systemic thinking and not just stringing independent pieces together.

So far I've found that Gemini Pro 3 was the best at reasoning about tricky SIMD code but the results with most models were pretty underwhelming.

