Hacker News | new | past | comments | ask | show | jobs | submit | Nevermark's comments

That might be an absurd comparison, but we can fix that.

If you were being charged per character, or running down character limits, and printing on printers that were shared and had economic costs for stalled and started print runs, then:

You wouldn’t “need” to understand. The prints would complete regardless. But you might want to. Personal preference.

Which is true of this issue too.


>If you were being charged per character, or running down character limits, and printing on printers that were shared and had economic costs for stalled and started print runs,

and the system was being run by some of the planet’s brightest people whose famous creation is well known to disseminate complex information succinctly,

>then:

You would expect to be led to understand, like… a 1997 Prius.

“This feature showed the vehicle operation regarding the interplay between gasoline engine, battery pack, and electric motors and could also show a bar-graph of fuel economy results.” https://en.wikipedia.org/wiki/Toyota_Prius_(XW10)


Are you implying in-implementation means shipped?

Maybe you are saying something that makes sense. But your comment doesn’t read that way.

A “concrete solution by Apple” would be a solution that shipped.

Cook got accused of acting illegally by a judge. Which is very unusual. It is pretty clear that Apple is dragging their feet as hard as they can get away with.


"A thing is being implemented."

Liars only: Apple is doing nothing.


They need to give the non-code service a different name.

And a slightly lower price.

If it succeeds they can adjust pricing later.

Otherwise they are messing with their new and old customers' heads, regarding a service whose name ought to be reliably interpretable. And seriously messing with their own credibility. Wrong kind of A/B test.

This is the kind of incompetence I would normally discount. But Anthropic seems to be falling all over themselves to irritate customers.


> You can build a snappy app today by using boring technology and following some sensible best practices.

If you are building something with similar practical constraints for the Nth time this is definitely true.

You are inheriting “architecture” from your own memory and/or tools/dependencies that are already well fit to the problem area. The architectural performance/model problem already got a lot of thought.

Lots of problems are like that.

But if you are solving a problem where existing tools do a poor job, you better be thinking about performance with any new architecture.


Counterfactuals are weak opinion, at best.

Given that Apple is doing well, the onus is on anyone claiming that Apple would have done better to make a strong argument.

Not "could" have done better, because things could obviously have gone better, worse, or anything else, given any substantive or random difference. Could means nothing.

(And I say this as someone very disappointed with how Cook handled that.)


> Counterfactuals are weak opinion, at best.

Ah, "If you can't definitively and completely prove a negative then you're wrong (but also I'm like, totally not carrying water for those people)" is definitely not a weak opinion, though.

That said, maybe you should read the discussion a bit more carefully before jumping in with "OMG PROOOOOOF" or whatever the fuck this was supposed to be? The entire, plain English discussion, revolved around one thing not being the only possible "fact" just because it happened. None of the posts were particularly long, and none used challenging words.


My point isn’t that anyone’s view is wrong. I can’t make that claim either.

I hate what Cook did.

I would be happy and open to anyone who can point out how Apple was supposed to handle the actual threat of major tariffs in their components and systems better than he did.

But simply asserting a counterfactual, a plausible way it might have been better, isn't that. What would Cook be expected to do with that?

But what?

Not dismissing that there was a better way. There must be. It's very worthwhile figuring out, even as a counterfactual. That's how we all learn.

Not judging anyone. My answer is just as weak, or weaker! I have really thought about this too, and come up with nothing so far.

(I appreciate and take note that my comment didn’t communicate my point well enough. It’s important to recognize weak reasoning. But that wasn’t meant to discourage, or show a lack of respect for another person’s efforts. I want a better answer too.)


I’d rather hear from someone suggesting, counterfactually, that they would have done worse had they not capitulated. What’s that argument like?

You want motivated reasoning?

It’s not clear what you are saying, other than what you want to hear.


Apple’s software has a kind of reliable predictability that many appreciate.

But “best” is far too strong a word.

For starters, most, if not all, of their software can be described as simpler also-rans.

And in line with that approach, for a company that innovates in hardware, it does not apply that effort to software.

With two exceptions in the last two decades. The iPhone and Apple Watch operating systems & interfaces were very creative efforts. Which genuinely matched the hardware innovation.

Vision’s OS, on the other hand, basically iOS-ified hardware that deserved to be treated as the first device positioned above and beyond the Mac. Its natural interface doesn’t fall below the Mac’s, the way a touch screen does. It far exceeds it, given a keyboard and trackpad.

Instead, software wise, we get another media and toy kiosk.

I am stunned that Tim Cook didn’t see the opportunity to leave his mark with a device that took the capability crown further than the Mac, instead of falling for the 3D-as-cute-feature un-vision.

Pro hardware. Toy software.

He has been a great CEO. But if he let Steve and his own legacy down anywhere, that is where.

That, and the predictable but mostly stalled vision of software apps. And all the odd software glitches that keep cropping up across their devices, which suggest poor underlying models to me.

Their underlying systems software is a high point. The hardware integration is a standout.


The huge strike-out they made with the Vision Pro still blows my mind. I'm in the camp of people who would have possibly shifted my entire working setup to that thing if they'd made just a few less dumb choices with it, and it might have been worth it even at the high price. I still occasionally waste my time checking out the latest to see if they've made any headway towards making it useful, because I'm still recovering from the shock that they haven't. The only way I can see the current state making any sense is if they just wanted to squeeze as much field usage data as possible from early adopters of an overpriced prototype, but that seems so far outside of how Apple normally positions its products that it's hard to believe.

> I'm in the camp of people who would have possibly shifted my entire working setup to that thing if they'd made just a few less dumb choices

That describes me too. I even did for a while. But it just made the incomprehensible lack of any software ambition more painful.

The software is the only reason the Vision isn't worth the price. A real Pro OS, paired with a Studio M5-Ultra, or with its own M5-Ultra, would be an amazing work environment.

(The only hardware they would need to upgrade for the latter, i.e. its own Ultra, would be making live-battery swapping convenient. Which they should have already done.)


“Natural” is a word often used in opposition to science.

It really has 1000 meanings. Usually whatever the speaker wants it to mean.


> If not for protecting media giants - [ … ]

As soon as we start conditioning ethics, we give up and undermine the principles behind those ethics.


It is really easy to way over think, or over feel, AI.

Sometimes it's just a really good interface that matches the task well.

Think of all the people that still avoided getting a computer a decade or two ago, because "online" was so unnatural and creepy to them. Obviously, the internet had and has those places. And frankly a lot of social media still is.

But it can also just be wikipedia, making flight reservations, etc. When that is all it is doing, what you want it to do, that is all it is.

An automated language interface can just be a really good note collector/collator.

Personally, I look forward to the wise, well dressed, well spoken, waist-up robot bartenders we have been promised by movies for decades. Not creepy at all!


But if we choose some random body part X with mean X_m, then people i, whose X_i < X_m, won't be very happy.
