Hacker News | karmakurtisaani's comments

I once read a comment here or on Reddit explaining that the X11 developers moved to Wayland because the X11 code had turned into an unmaintainable mess that couldn't be worked with anymore. So the reasons are not drama, but just plain old tech debt.

OpenBSD has brought X11 into its own codebase: https://xenocara.org/

This is why OpenBSD is great.

I don't care about the drama that happens in Linux land at all.


This pre-packaged talking point is often repeated without evidence. The vast majority of X.org developers, including all of the original ones, simply moved to other venues at one point or another. Only a few, like Daniel Stone, have made contributions to both. And it shows in how many lessons had to be re-learned.

The drama was mostly over whether or not Wayland should have been the replacement. AFAIU, everyone agreed X11 development was effectively unsustainable or at least at a dead end.

Wayland is not a solution, just a name for some protocols... It's either KDE or GNOME (with its weird quirks) or some alternative.

So is X11, though the reference implementation of X11 is also widely agreed to have some serious problems going forward on top of problems with the protocol itself.

Ah, one of these battles that are very hard to fight to gain essentially nothing.

Edit: or, when you can't do actual math, you complain about notation.


The gain is pedagogical: giving kids a good intuition about angles is so much easier when the constant you're working with represents an entire turn around the circle (360°) rather than a half-turn of 180°. The advantage of using tau instead of pi is much smaller in other situations, but when it comes to measuring angles in radians, it's huge. And kids who have a better understanding of angles and trigonometry are just a little bit more likely to become good engineers. So persuading math teachers that there's a better way to teach trig is an investment in the future whose potential payoff is 20-30 years (or more) down the road.

I'd really be curious to see any substantial proof for that claim.

The first time pupils encounter pi isn't when measuring angles. At least over here, that's still done in degrees, which is much easier to explain, and also latches onto common cultural practice (e.g. a turn of 180 degrees). So I suppose that already makes them good engineers.

But the first time pupils encounter pi is when computing the circumference and area of a circle. While the former would look easier with the radius (tau * r), it looks just as weird when using the diameter or when computing the area.


I don't know of any studies yet comparing the two approaches, but https://www.tauday.com/a-tau-testimonial is the story of one student who finally "got it" when using tau instead of pi. I strongly suspect she's not unique.

If there's more data available, I don't yet know where to find it.

P.S. Yes, angles are first presented in degrees in most contexts, and understanding sines and cosines is easier when given the degree units you're familiar with. But radians do need to get introduced at some point during trig, and it's exactly the study of radians which should be done using tau (the equivalent of 360°) rather than pi (180°). Because a right angle, 90°, is a quarter of the way around the circle, and that's tau/4. A 45° angle is tau/8, one-eighth of the way around the circle. There's no need to memorize formulas when you do it this way, it's just straight-up intuitive (whereas 45° = pi/4 is not intuitive the same way).
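As a quick sanity check on those fractions, here is a minimal Python sketch (the helper name `turns_to_radians` is my own; `math.tau` itself has been in the standard library since 3.6) making the turn-based view concrete:

```python
import math

# With tau (= 2*pi), the radian measure of an angle is simply
# its fraction of a full turn, multiplied by tau.
def turns_to_radians(turns):
    """Convert a fraction of a full circle into radians."""
    return turns * math.tau

# A quarter turn (90 degrees) is tau/4; an eighth turn (45 degrees) is tau/8.
assert math.isclose(turns_to_radians(1/4), math.tau / 4)
assert math.isclose(turns_to_radians(1/4), math.pi / 2)        # same angle in pi notation
assert math.isclose(turns_to_radians(1/8), math.radians(45))   # matches 45 degrees
```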


Pi is too clever by half.

Probably fired a lot of their best people in the past few years and replaced them with AI. They have a de-facto monopoly, so we'll just accept it and wait patiently until they fix the problem. You know, business as usual in the grift economy.

>They have a de-facto monopoly

On what? There are lots of CDN providers out there.


They do far more than just CDN. It's the combination of service, features, reach, price, and the integration of it all.

There's only one that lets everyone sign up for free.

The "AI agents" are on holiday when an outage like this happens.

This didn't happen at all. You're just completely making shit up.

Yeah I am a bit.

A lot of people here are suggesting they'd be great mathematicians if only it weren't for the pesky notation. What they're missing is that the notation is the easy part.

Indeed, confused people say things that don't make sense.

Not at all. Over and over I find really intimidating math notation actually represents pretty simple concepts. Sigma notation is a good example of this. Hmm, giant sigma or sum()?
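For what it's worth, the correspondence really is that direct. As a toy Python example (my own illustration, not from the thread), the sum of the first five squares, written with a giant sigma as the sum from k = 1 to 5 of k^2, is literally:

```python
# Sigma notation sum_{k=1}^{5} k^2 written out with Python's sum():
total = sum(k**2 for k in range(1, 6))
assert total == 55  # 1 + 4 + 9 + 16 + 25
```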

You think changing sigma to sum() would make it easier to understand the five-paper, 1000-page proof of the geometric Langlands conjecture?

Imagine how much unnecessary time would be added to a course about series if the lecturer had to write sum() instead of ∑ every time. If you find it hard to remember that ∑ means sum, math might not be for you, and that’s fine.

It's not so much remembering what ∑ means as that it's completely impossible to google the first time you run across it. It'll be in some PDF that doesn't allow you to copy-paste the symbol, and you won't know what it's called. Rinse and repeat for any of the million symbols mathematicians use, never mind that loads of symbols are context-dependent even if you could google them.

I hope mathematicians have a better reason than "it's tradition" for making the entire field completely opaque to anyone who hasn't studied math extensively.


Basic notation like sums is covered in every undergraduate math course. Any non-standard notation will be introduced by the author using it. Nobody is trying to obscure anything from you.

Yes, mathematical notation is not very discoverable at all.

Wait until you learn about integration. Measures, limits and the quirks of uncountable spaces don't become simpler once you call the operation integrate().

It's like saying that learning Arabic is the easy part of writing a great Saudi novel. True, but you have to understand that being literate is the price of admission. Clearly you consider yourself very facile with mathematical notation, but you might have some empathy for the innumerate. Not everyone had the good fortune of great math teachers or even the luxury of attending a good school. I believe there is valid frustration borne out of poor mathematical education.

Well yeah, but this empathy and frustration is simply misplaced. I have empathy for people who didn't get good education, and they should be frustrated towards their bad schooling. Math notation is simply the wrong target.

If they can't see that, it's hard to think they have much chance with the actual math. "A mathematician is a person who knows how to separate the relevant from the irrelevant", a saying I was told in school.


That's funny that you would bring up something you learned in school.

> What they are missing is that the notation is the easy part.

This is so wrong it can only come from a place of inexperience and ignorance.

Mathematics is flush with inconsistent, abbreviated, and overloaded notation.

Show a child a matrix numerically and they can understand it, show them Ax+s=b, and watch the confusion.


The fact that there is a precise analogy between how Ax + s = b works when A is a matrix and the other quantities are vectors, and how this works when everything is scalars or what have you, is a fundamental insight which is useful to notationally encode. It's good to be able to readily reason that in either case, x = A^(-1) (b - s) if A is invertible, and so on.

It's good to be able to think and talk in terms of abstractions that do not force viewing analogous situations in very different terms. This is much of what math is about.
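As an illustrative sketch of that point (using NumPy, not anything from the thread, with made-up numbers), the reasoning x = A^(-1)(b - s) carries over verbatim from the scalar case:

```python
import numpy as np

# Ax + s = b  =>  x = A^(-1) (b - s), provided A is invertible.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
s = np.array([1.0, 0.0])
b = np.array([5.0, 10.0])

# Solve without forming A^(-1) explicitly (better numerically).
x = np.linalg.solve(A, b - s)

# The scalar analogue a*x + s = b solves identically: x = (b - s) / a.
assert np.allclose(A @ x + s, b)
```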


Well, obviously they will be confused, because you jumped from a square of numbers to a bunch of operations. They'd be equally confused if you presented those operations numerically. I am not sure what it is you want to prove with that example. I am also not sure that a child can actually understand what a matrix is if you just show them some numbers (i.e., will they actually understand that a matrix is a linear transformation of vectors, and the properties it has, just by seeing some numbers?)

> a bunch of operations.

Sorry, the notation is a bit confusing. The 'A' here is a matrix.


I know it is a matrix; the notation is not confusing at all. I am saying that the concept of a matrix as a set of numbers arranged in a rectangle and the concept of operations on a matrix are very different things; the confusion will not come from notation.

You must be correct, because this interaction is completely devoid of any confusion between the two people attempting to communicate clearly.

I do not have any confusion with the notation; I am confused about the argument you're trying to convey with English words.

Ceci n'est pas une pipe.

This is funny. “Mathematics notation is confusing to me because I refuse to learn it. I refuse to learn it because mathematics notation is confusing to me.” Okay sure, be happy with yourself.

> This is so wrong it can only come from a place of inexperience and ignorance.

Thanks for the laughs :D

> Show a child a matrix numerically and they can understand it, show them Ax+s=b, and watch the confusion.

Show an HN misunderstood genius the Riemann zeta function as Zeta() and they think they can figure out its zeros. Show it as a Greek letter and they'll lament how impossible it is to understand.


The Louvre gets sort of boring, since the time period it covers stops right when art gets more and more interesting (mid-1800s). Before that, every painting is basically Jesus or boobs.

Still well worth a visit definitely.


This year they made a brilliant thing: they put haute couture one-off fashion items on display throughout the royal wing.

Who knew Louboutin and Alexander McQueen shoes or Dior and Gucci handbags would feel so absolutely natural among the dresses and tapestries and jewellery :)


But then you have all the Egyptian wing no?

Yep, lots of stuff from different periods until the 1800s. Interesting, but surprisingly kind of repetitive.

Also, I'm pretty sure cats were domesticated way before 3.5 years ago. I think it was even as far back as, hmm, over 80 years ago.

Edit: you are correct to downvote me, it was not a good joke.


Have cats been domesticated as of 2025? Last time I had a cat at home 10 years ago, it felt like he domesticated me!

Truthfully, I was thinking about this while writing my comment. I settled on a very loose definition of 'domesticated'.

By any definition cats are “barely domesticated” and watching wild big cats makes that clear.

We negotiated an uneasy truce that involved us changing more than the cat has.


Fair. Once cats reached the internet we had no chance left.

There is some evidence to support this idea https://en.wikipedia.org/wiki/Felix_the_Cat .

Some ancient non-touchscreen art depicting felines over 100 years ago? Revolutionary!

If you're not a total doofus, you should be able to look around you and see how things still today are worse off because of the nazis than they would have been without them. But we have doofi among us, so you might just be one.

Juniors are juniors because they haven't yet struggled with mistakes of their own creation. In a few years we should see some pretty strong senior engineers emerging.

One cool thing about aging is you can tell old jokes from ancient shows like Futurama, and the kids think it's original and hilarious.

I steal lines from Cheers on the daily.

Nah, just starting to see the results of vibe coding.
