
I admit I started reading with some skepticism. It didn't read like PR, so I assumed I was reading fanfic. By the midpoint, she managed to convince me otherwise.

I think the author is walking a tightrope between convincing the reader that she wrote this herself and that there's more depth to her than what we see on stage or in pop media. Writing this blog is definitely a tougher assignment than doing podcast interviews or behind-the-scenes videos.

You're right, of course, that a good editor could make this better, but I think she's deliberately avoiding that here. A pop star is unwise to fire a good producer without a better replacement, but sometimes they have to bring out the piano and do an acoustic performance live.


There isn’t just one way to be a dedicated gamer.

Inevitably everyone has finite time and access to games and has to make choices about what to play.

As a Mac guy, I always found the game platform wars weird because even on the weakest gaming platform there are still more good games than anyone can individually play. And even on Windows, probably the strongest gaming platform, you’re still missing out on many significant games.

I totally understand buying a system because it has some game that you absolutely must play. I bought an OG Xbox back in the day because I thought I desperately needed to play Deus Ex: Invisible War when it didn’t come to Mac. Got burned on that one, but at least I had Halo before it came to Mac (and in the end it was much better there than on the Xbox, thanks to expanded online multiplayer).

What I actually don’t get is folks who have to play the hot game of the week every week. Just seems expensive in terms of money, time, and space for different systems, and you only scratch the surface of the games.


This is a big miss for me. I can’t use my TV’s 120Hz VRR mode without HDMI 2.1.

I realize the Xbox Series X is beleaguered at this point, but apart from playing games that are on Steam but not Xbox, I can’t see why I would prefer the Steam Machine.


After commenting I looked up the actual capabilities of the port, and it turns out that while the port is officially only HDMI 2.0, it still supports 120Hz, HDR, and VRR anyway. So basically the only thing it doesn't support is Display Stream Compression for 144Hz and beyond.

I quickly tested this by connecting my PC running Linux with an RX 6800 to my TV (LG C4). 120Hz, VRR, and HDR were all available.
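If anyone wants to repeat the check on their own setup, something like this is enough under an X11 session. This is only a rough sketch: connector names vary by driver, a Wayland or gamescope session needs different tooling, and plain xrandr only shows resolutions and refresh rates, not VRR or HDR state.

    # Rough sketch: list the modes an HDMI-connected display advertises under X11.
    # Assumes `xrandr` is installed and an X session is running.
    import subprocess

    out = subprocess.run(["xrandr"], capture_output=True, text=True).stdout

    on_hdmi = False
    for line in out.splitlines():
        if not line.startswith(" "):             # connector header line
            on_hdmi = line.startswith("HDMI") and " connected" in line
            if on_hdmi:
                print(line)                      # e.g. "HDMI-1 connected 3840x2160+0+0 ..."
        elif on_hdmi:
            print("   ", line.strip())           # mode lines, e.g. "3840x2160  120.00*+  60.00 ..."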


At 4K? Or are you limited to a lower resolution due to bandwidth constraints?


Yes, 4K120Hz! My TV could do 144Hz, but I couldn't select it, so 4K120 seems to be the limit.


Try for yourself. I get 4k120Hz when connecting my laptop directly via HDMI.


Yeah I have tried it for myself. I am limited to 4K60 when using the HDMI 2.0 port on either my M1 Mac mini or M1 Pro MacBook Pro and LG B2 TV. I do get 4K120 with VRR with newer Macs with HDMI 2.1 as well as my Xbox Series X. It has been my understanding that 4K120 with HDR and VRR requires HDMI 2.1, which is why those HDMI 2.0 limited systems don’t work. Not having a Steam Machine myself, I would assume its HDMI 2.0 port would be similarly limited.

Edit: I should add, I do get 4K120 VRR and HDR on the M1 Macs when connected to a monitor via Thunderbolt or Thunderbolt to DisplayPort adapter, and I would expect a Steam Machine to be similar using DisplayPort, but my TV only has HDMI input and so can’t work in this mode (and a Thunderbolt to HDMI adapter doesn’t work either).


I get 4k120Hz when connecting the output from a HDMI 2.0 port on my laptop to the HDMI 2.1 port on my TV (Sony TV, it has 4 HDMI ports, but for some reason only ports 3 and 4 support HDMI 2.1, 120Hz, VRR).


As a kid, I was marginally decent at competitive math. Not good in the sense of kids who dominate those types of competitions at a high level, but good enough to qualify for the state competition.

What I was actually good at, or at least fast at, was TI-Basic, which was allowed in a lot of cases (though not all). Usually the problems were set up so you couldn’t find the solution using just the calculator, but if you had a couple of ideas and needed to choose between them you could sometimes cross off the wrong ones with a program.
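To make the trick concrete, here's roughly what those programs did, sketched in Python rather than TI-Basic, with a made-up problem just to illustrate the "cross off the wrong candidates" idea:

    # A made-up example of the "use the calculator to rule out candidates" trick,
    # sketched in Python instead of TI-Basic. Suppose two lines of scratch work
    # gave you two different answers to "how many n in 1..100 make n^2 + n + 41
    # prime?" and you need to know which (if either) to keep.

    def is_prime(k: int) -> bool:
        if k < 2:
            return False
        d = 2
        while d * d <= k:
            if k % d == 0:
                return False
            d += 1
        return True

    count = sum(1 for n in range(1, 101) if is_prime(n * n + n + 41))

    candidates = [86, 100]          # the two answers your scratch work produced
    for c in candidates:
        print(c, "matches" if c == count else "ruled out")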

The script the author gives isn’t a proof itself, unless the proposition is false, in which case a counterexample always makes a great proof :p


I used to do the same thing. I'd scan the test for problems amenable to computational approaches and either pull up one of my custom-made programs or write one on the spot and let it churn in the background for a bit while I worked on other stuff without the calculator.


But games are full-fledged GUI apps. At a minimum they have a window.

It’s really unclear what it means to support old games but not old apps in general.

I would think the set of APIs used by the set of all existing Intel Mac games probably comes close to everything. Certainly nearly all of AppKit, OpenGL, and Metal 1 and 2, but also media stuff (audio, video), networking stuff, input stuff (IOHID etc).

So then why say only games when the minimum to support the games probably covers a lot of non games too?

I wonder if their plan is to artificially limit who can use the Intel slices of the system frameworks? Like hardcode a list of blessed and tested games? Or (horror) maybe their plan is to only support Rosetta for games that use Win32 — so they’re actually going to be closing the door on old native Mac games and only supporting Wine / Game Porting Toolkit?


Games use a very small portion of the native frameworks. Most would be covered by Foundation, which they have to keep working for Swift anyway (Foundation is being rewritten in Swift) and just enough to present a window + handle inputs. D3DMetal and the other translation layers remove the need to keep Metal around.

That’s a much smaller target of things to keep running on Intel than the whole shebang that they need to right now to support Rosetta.


I don’t agree. My point is their collective footprint in terms of the macOS API surface (at least as of 2019 or so) is pretty big. I’m not just speculating here, I work in this area so I have a pretty good idea of what is used.


Could you give examples at least of what you think that big collective footprint might include?

Bear in mind that a large chunk of Mac gaming right now that needs translation is Windows games translated via CrossOver.


As I said in my first comment, it's at least Cocoa (Foundation + AppKit), AVFoundation, Metal, OpenGL, and then all of the lower level frameworks and libraries those depend on (which may or may not be used directly by individual games). If you want a concrete example from something open source, go look at what SDL depends on, it's everything I listed and then some. It's also not uncommon for games to have launchers or startup windows that contain additional native UI, so assume you really do need all of AppKit, you couldn't get away with cutting out something like NSTableView or whatever.
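If you want to see the footprint for yourself, dumping a binary's linked libraries is a one-liner with otool; a rough sketch below (the game path is just a placeholder — point it at any Intel game binary, or at libSDL2.dylib):

    # Rough sketch: list the frameworks and dylibs a Mac binary links against,
    # to get a feel for how much OS surface an "old game" actually touches.
    # Assumes the Xcode command line tools (otool) are installed; the binary
    # path below is hypothetical.
    import subprocess

    binary = "/Applications/SomeGame.app/Contents/MacOS/SomeGame"  # placeholder path

    out = subprocess.run(["otool", "-L", binary], capture_output=True, text=True).stdout
    for line in out.splitlines()[1:]:                  # first line is the binary itself
        dep = line.strip().split(" (compatibility")[0]
        print(dep)                                     # e.g. .../Frameworks/Cocoa.framework/...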

So my point remains, if Apple has to continue providing Intel builds of all of these frameworks, that means a lot of other apps could also continue to run. But ... Apple says they won't, so how are they going to accomplish this? That's the mystery to me.


With the exception of AVFoundation, I’d covered all of those in my comments. That’s not a lot of surface area. Games are typically not using a significant portion of AppKit beyond what I already mentioned, and AVFoundation is likely also a very thin wrapper that is maintainable.

I’m assuming Apple isn’t going to arbitrarily restrict what runs but will remove things to just the subset that they believe are needed for games such that other stuff just implicitly won’t work.


Is it practical for Apple to produce a set of frameworks for Intel that run some useful set of old games but that do not run any useful set of non game software?

I grant it’s probably possible to do, but I think that is a lot more work and more error prone than just continuing to ship the major frameworks as they were.

From Apple’s perspective I’m sure they have a few big goals here:

1. Encourage anyone who wants to continue offering software on Mac to update their builds to include arm64.

2. Reduce download size, on disk size, and memory use of macOS.

3. Reduce QA burden of testing ancient 3rd party software.

These are also the same motivations Apple had when they eliminated 32-bit Intel support and when they eliminated Rosetta 1, but they were especially criticized for leaving game libraries behind.

Arguably, arbitrarily restricting what runs gets them the biggest slice of their goals with the minimum work. Devs are given the stick. People typically only play 1 game at a time and then quit it, so there isn’t a bunch of Intel code in RAM all the time because of a few small apps hanging out, and they have less to test because it’s a finite set of games. It just will chafe because if they do that then you know that some unblessed software could run but Apple is just preventing it to make their lives easier.


> Is it practical for Apple to produce a set of frameworks for Intel that run some useful set of old games but that do not run any useful set of non game software?

They already have the frameworks supporting Intel. They can just start pruning away.

Some teams will draw the short straw of what needs to continue being supported, but it’s likely a very small subset of what they already maintain today.


If you'd like to see an interesting parallel, go look at how Microsoft announced supporting DirectX 12 on Windows 7 for a blessed apps list - basically because Blizzard whined hard enough and was a big enough gorilla to demand it.


That's one implementation, yeah, just have a list somewhere of approved software and make an artificial limitation. But their announcement is so vague, it's hard to say.

And then the next question is why? It's not like they've ever promised much compatibility for old software on new macOS. Why not let it be just best effort, if it runs it runs?


Ditto. Happily using 7, but if they ever break it I’m switching to Apple Passwords.


I did that last year and haven’t looked back. And I can share passwords with my family without paying an arm and a leg.

Pity. I used 1P for many, many years and recommended it to everyone I knew. I feel like it’s completely lost the plot, though.


The author doesn’t attempt to address the issue that Apple and iOS are the only remaining effective bulwark against Google’s complete control of the web.

Think about what Google’s end game looks like if they are able to convince lawmakers and regulators around the world to force Apple to allow Chrome on iOS. Google will continue to spam standards proposals and implementations that Apple or Mozilla will be unwilling to adopt for various reasons. Google will continue to advertise Chrome heavily, and push users of its other services to install and use Chrome exclusively. Google search will prioritize sites that use Chrome specific technologies. Google Gemini will generate code that uses Chrome specific APIs.

When Chrome reaches sufficient market share, Google will start to use Chrome to disadvantage computing platforms that they do not control completely. New features will come to Android and ChromeOS first. Bugs may go unaddressed on other platforms.

I realize it’s frustrating as a web developer to have to deal with browser specific issues, or to be unable to take advantage of necessary APIs or platform features from your web app. It is also frustrating to be blocked from the App Store because Apple wants to avoid competition in some area. The current situation is unfair and far from ideal. Apple are not the good guys here.

But, framing the issues as being entirely about Apple and not addressing the situation with Google doesn’t work. And, unfortunately, many commentators (including the author of this article) as well as regulators (including the EU), don’t seem to get this. If these folks get what they are asking for, we aren’t going to enter a golden age with a single web platform that is feature rich and open to all equally, we are all just going to get crushed by Google.


Your argument against Google's monopoly would make a lot of sense if it was impossible to uninstall Chrome from Android; but that's not the case; to the contrary, it's Apple's Safari that's impossible to uninstall from Apple's iOS.

Your argument against Google's monopoly would make a lot of sense if it was impossible to install Apple's Safari on Google's Android because of Google, but that's not the case either; it's actually Apple that discontinued Safari outside of its own ecosystem some 10+ years ago. Which, BTW, was a few years after Steve Jobs predicted that Safari would be the only browser on the planet, on both Macs and PCs, and that it'd be good.

Your argument against Google's monopoly would make a lot of sense if it was impossible to use Google's Android without a Google Account tracking your every move; but this is not the case either, because you can easily sideload F-Droid and Aurora Store, sideload any of the free Play Store apps as published and signed by Google without any Google account, and uninstall Chrome, YouTube, and most of the other pesky apps, yet still have access to your banking apps, to YouTube through free clients like NewPipe or PipePipe, and to lots of other stuff, all without any sign of a Google Account. Can you even install a third-party YouTube client on iOS? Ironically, you can on Android. In fact, you don't even lose any major functionality by forgoing Chrome and a Google Account on Android; even the experience of watching YouTube is actually superior with PipePipe. I have several extra phones without a Google Account, and they're fully usable without any unexpected limitations; sync is the only thing that's missing.

Yet to the contrary, NONE of these things are possible on iOS.

On iOS, you can't even use the "premium" pre-installed apps like Pages, Numbers, Keynote, GarageBand or iMovie without assigning them to an Apple Account first. You can't install any apps or stores, either. You can't do anything without at least an Apple Account. Yet it's Apple that's the last bastion of our privacy?! How?!


> Your argument against Google's monopoly would make a lot of sense if it was impossible to uninstall Chrome from Android…

Wrong. "Chrome is already installed on most Android devices, and can't be removed." https://support.google.com/chrome/answer/95319


Google's support article is wrong/misleading. You can uninstall all app updates for Chrome. You can disable Chrome. Once disabled, it cannot run again, unless you expressly enable it. It's basically equivalent to an uninstall for most purposes.
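For what it's worth, disabling doesn't even need root. The adb commands look roughly like this (sketched as a Python wrapper here, assuming USB debugging is enabled and adb is installed):

    # Sketch: disable Chrome for the current user via adb and verify the result.
    # "disable-user" keeps the APK on the system partition but prevents it from
    # running or appearing in the launcher until it is explicitly re-enabled.
    import subprocess

    def adb(*args: str) -> str:
        return subprocess.run(["adb", "shell", *args], capture_output=True, text=True).stdout

    print(adb("pm", "disable-user", "--user", "0", "com.android.chrome"))
    # Expected output is something like: "Package com.android.chrome new state: disabled-user"

    print(adb("pm", "list", "packages", "-d"))   # lists disabled packages; Chrome should appear
    # To undo: adb shell pm enable com.android.chrome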

The latest trend in OS design is an immutable system partition, so, obviously, you cannot modify the underlying system image on macOS, iOS, or Android, but what evidence do you have that doing an overlay disable isn't enough?

I've been using Android for years, and have not seen any funny business after I disable Chrome. You can use Brave or Vivaldi or Yandex Browser or Opera in place of Chrome at all times, or Firefox in many cases. I routinely have fully functioning test devices with stock Android without any Google Accounts or any Chrome. Everything just works the way it should, including banking apps installed through Aurora Store (itself obtained via F-Droid), as well as streaming apps like Amazon Prime Video. Again, all of this works without a Google Account in any way on my side as an end user, and it's expected to continue working in 2027 even if the trial they've announced goes through worldwide. It works on any Pixel device, it works on any Motorola device, it even works on Samsung, too.


> I've been using Android for years, and have not seen funny business after I disable Chrome.

You have not seen any funny business because Chrome WebView, which many applications depend on, is a separate application. Developer settings let you change it to another application, but only from a hardcoded list of package names and only if they're installed to /system. There are also no non-Chromium WebView implementations available to my knowledge.
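You can see which implementation is actually backing WebView on a given device with dumpsys; roughly like this (the exact output format varies between Android versions, so this just filters for the relevant line rather than parsing it strictly):

    # Sketch: show the currently selected WebView implementation via adb.
    # Assumes USB debugging is enabled and adb is installed.
    import subprocess

    out = subprocess.run(
        ["adb", "shell", "dumpsys", "webviewupdate"],
        capture_output=True, text=True,
    ).stdout

    for line in out.splitlines():
        if "Current WebView package" in line:
            print(line.strip())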

So no, unless you're OK with breaking applications that use WebView, you can't remove Chrome from an Android smartphone.


> Which, BTW, was a few years after Steve Jobs predicted that Safari will be the only browser on the planet, on both Macs and PCs, and that it'd be good.

Safari is good. As a user, I could not care even an iota less about how "annoying" it is to develop over-built, shitty websites to work right on Safari. Web developers as a general rule don't really seem like they give a shit about delivering good work that respects the user's wishes and devices and privacy, so if it makes it harder for them to have to write garbage like fucking Confluence or whatever for two platforms claiming to comply with "standards", sounds good to me, I don't care. Works great for reading documents and watching videos. Works great checking a restaurant menu from a QR code. I don't want it or need it to be my entire operating system, accessing my camera, my microphone, my location, my goddamn serial ports, running gobs of terrible quality, remote, slow code ensuring my brand new computer feels the same as my brand new computer from 10 years ago, to do what a barebones platform API app could do talking to the exact same JSON-RPC APIs their dogshit React app is talking to.

> Your argument against Google's monopoly would make a lot of sense if it was impossible to use Google's Android without a Google Account tracking your every move

And this argument would hold water if it was solely about being forced to do something horrific to your privacy instead of led to or being tricked into doing it. It holds as well as "well you can just not buy an iPhone". Give me a break! Google is not out to empower anyone. They are out to own general computing and the mountain of data it produces, and turning the browser, the one platform they have control over, into the operating system, is how they are going to do that. And in a stroke of brilliance, for the last 15 years they've "allowed" the "choice" to sidestep their overreach, which leads to braindead arguments like "well, at least you can sidestep it, therefore its really not that bad" from libertarian brained bozos who can't see the forest for the trees.

Apple is by no means free of sin. There are a million things I would change about the App Store monopoly. But that isn't the world we live in. We live in a world where one company controls and inspects the conduit to the internet for a vast majority of the population, and one controls it for the vast majority of the remainder. Whatever their reasons, the latter are holding back the Kraken ready to envelop and consume everything, and I'm not going to poo poo their efforts because it doesn't immediately comply with whatever half-assed, hostile "standard" the former pushes out of its rectum.


I feel the same. I agree that the web has gone downhill, with endless JavaScript wasting all the available CPU cycles (and the rest of the CPU cycles wasted by swapping caused by the memory bloat of web browsers, again). This is why these days I ALWAYS enable Low Power Mode in any browser or system that provides such a functionality; macOS finally added this a few years ago — better late than never.

But I feel like ALL browser vendors are not doing enough to combat this bloat. There have to be resource limits, warning messages/icons, and stop-gap measures to avoid pointless JavaScript wasting our electricity; but NONE of the browsers do this to the extent I wish they would. In fact, Chrome has actually been ahead of Firefox and Safari in reining in these sites, probably because it has to run in production on 4GB ChromeOS machines costing $99, whereas all the Firefox and Safari devs are probably using 48GB machines costing $2399 as their benchmarks. So the reality is that, ironically, Chrome is again the leader even in this area, because Chrome on a $99 4GB ChromeOS machine feels snappier than Firefox on a $999 MacBook, given enough open tabs.

Your point about feature bloat sounds good in principle, but is not practical in reality. In reality, if things don't work in Safari, you're simply asked to install an app from the App Store. Or if you have to configure a keyboard on a Mac, you have to use a Windows machine with the native keyboard configuration tool, instead of VIA in Chrome via WebHID or WebUSB. Why, in your opinion, are these alternatives not worse than having these sorts of things as web standards written by Chrome?


The author here. It wasn't addressed in this post because it was treated separately several years ago in the same series (linked at the top):

https://infrequently.org/2022/06/apple-is-not-defending-brow...

TL;DR is that the premise of the argument is false, or at least almost entirely so, and deprives Apple of agency, when in fact it has all the power in the equation.


That article also does not appear to have anything to say about the validity of "standards" that are nothing more than Google's feature creep for web browsers. At some point, you need to actually defend the idea that a web browser should be able to enumerate what Bluetooth and USB devices you have connected. Dancing around such issues is what's making your "Assault on Standards" claim sound so hollow. You need to justify how your position doesn't simply boil down to "Apple should follow Google's lead".


That's simply a misunderstanding of how features come to the web. There is no immaculate conception for web APIs. No magical room in which they are dreamt up, or spring fully-formed from the head of Zeus.

Instead, they come from open, honest, iterative design (when done well), and shipping ahead of others is risky, but that's why we designed the Blink Launch Process to demand so much pre-work (specs, tests, origin trials, good faith attempts to include other vendors in design, etc.) in order to launch that way.

Some background on these points here:

https://infrequently.org/series/effective-standards-work/

https://youtu.be/1Z83L6xa1tw?si=939PBH4_idtZGI6Y

As to, "should Apple follow Chromium's lead", perhaps ask "how would that be different than today?"

See:

https://infrequently.org/2023/02/safari-16-4-is-an-admission...

And:

https://infrequently.org/2025/06/the-ghost-of-christmas-past...


You're still dodging the issue. Your article title accusing Apple of an "assault on standards" is implicitly treating Google's proposals as a fait accompli that Apple is resisting, which is not at all what the situation is for many of the Chrome features you are trying not to be specific about.

You say that shipping ahead of others is risky, but can't seem to acknowledge when the negative outcome comes to pass and other browser vendors aren't interested in adopting questionable feature proposals.


I'm simply pointing out that Apple declined to try to constructively solve the problems developers expressed, demurred from engaging in design work in many areas, and did not ship alternatives instead (as it could have, and did in the past when Safari/WebKit were not on a starvation budget).

The downsides to this are not lost on me. Why do you think I'm making an issue of it publicly now? We tried literally everything else. This is last resort stuff. The goal is always more collaboration, and through it, better, better-funded, and more capable browsers. Apple is the unique obstacle to all of that today.


Please consider the possibility that some proposed features should not exist. The objections to many of Chrome's features are fundamental, not aesthetic, or complaints about nuances of how it's implemented. Many people outside Google simply do not want the browser to be a full-fledged OS, especially if that means weakening privacy or security controls of the host OS.

Sometimes, the right response to a feature proposal is simply "no". But you're seemingly unwilling to accept that as a valid answer. The alternative you're not seeing is that of not having the dubious features in the browser.


But those features do exist as long as you're willing to pay Apple's tax.

I feel like that's already explained in the originally linked article here.

If you don't want Bluetooth from your browser, you can always install Firefox on Android.

I feel like it's 2005, and you're arguing that web browsers should not have access to a camera.

Or is camera access by a web browser still not a standard today in 2025, either, thanks to Apple, I may guess?


Or let me tell you as a Firefox user on macOS.

I'd much rather have to switch to Brave or Vivaldi for a video phone call, or keyboard configuration, or NFC, than install half a dozen outdated third-party XXX-only apps with full permissions and questionable security practices or distribution methods.

The better question to ask here is: why would you NOT want to have a CHOICE to have these things in a secure browser from SEVERAL distinct major vendors like Google, Microsoft, Brave and Vivaldi, and Yandex, and Opera, and others?

Again, I don't even use Chrome. I replace it even on Android. So, I am not concerned with Google taking me over, because they clearly aren't.

But how am I more secure when I have to install lots of dodgy apps to get the most basic things like video conferencing working?


Thanks for the link. I read it.

Alas, I think you and I are probably too far apart on the premises we accept to have a useful discussion, but I appreciate learning about your perspective and I appreciate your reply.


Typically you wait for the new chip.

Sometimes there are hybrid decoders that can use some of the resources on the chip plus some shader code to handle new codecs or codec features after the fact, but you pay a power and performance penalty to use these.


No regrets here, but I did use Google Code a fair bit prior to GitHub and I had an experience that made me think maybe Google regretted that product in some ways.

Around 2005-6 I wrote a Mac OS X client for Xbox Live. The idea was I wanted notifications on my computer when my friends came online and started to play certain games, so I could turn on my Xbox and join them. This is a feature of the Xbox mobile app today of course, but back then all you could do was either be on the Xbox or sit around refreshing a web page, so the app was useful. I published the source and the binaries on Google Code, partly because I just wanted to share the app for free, and partly because I wanted to be transparent that I was handling the Xbox login credentials safely.

One day the app blew up and got a lot of coverage in tech news and link aggregators (like Digg, haha) and I suddenly had a ton of users. Eventually I figured out why. It wasn't that my app was so great exactly, but rather the idea that Google was writing a Mac client for Xbox made a great tech news story. However, that part of the story wasn't true, the project had nothing to do with Google, I was just hosting it on Google Code because it was at the time the most convenient place for a small open source project.

The episode made me wonder how often that happened. How many other projects on Google Code became part of a news cycle because people misinterpreted them as being written or endorsed by Google? Was that part of why Google Code was shut down?


> How many other projects on Google Code became part of a news cycle because people misinterpreted them as being written or endorsed by Google? Was that part of why Google Code was shut down?

I don't remember the exact details, and I was way in the backend (Kythe), not the frontend part of it. But my extremely hazy recollection is it probably had more to do with the gwt deprecation than anything else. There was headcount for a while put on making an angular (?) replacement for the old gwt frontend, and I guess that didn't extend to also making a replacement for Google Code.

Again, super fuzzy recollection here, from someone 2 teams away.


Wow, thanks for the insight. It's sort of crazy to think about how big GitHub has become, and how much Microsoft paid for it, but of course it wasn't the first product in the space at all. Right time, right place, right features, I guess, and maybe Google Code was missing a bit of each of those.


As I recall, GitHub popularized (maybe even invented) the pull request. That's what enabled open source to take off - being able to send patches to (or even fork) other projects with minimal friction.


The officially given reason for shutting down Google Code was that it was never intended to be a product and was just a community service thing that they did because SourceForge had ceased to be a reasonable place to host open source projects. Once a bunch of other forges popped up (including but not limited to Github) there was no longer any real need for it.

I thought it sounded pretty plausible for that era of Google.


I knew someone who worked on it at the time, and he was also convinced they could use it to force people to use the super-slow feature-lacking Mercurial, and was personally hurt that everyone wanted to use git instead because it let you rebase things.


This may explain why Microsoft GitHub hasn't completely enshittified their design yet – Microsoft doesn't want to be associated with https://github.com/RonSijm/ButtFish and https://github.com/zevlg/teledildo.el


Several years ago, a coworker made an Arduino "mouse" that traced a square at a 1 kHz report rate. Multiple systems running Windows 10, reading mouse updates via either regular window events or the Win32 raw mouse API, would occasionally lose events, meaning the square would drift over time. We tried both a custom app and drawing programs like Paint.

We could not reproduce the issue on systems running macOS or Linux, and we chalked it up to a bug in Windows. It was hard to know if it affected real mice but I expect it did. I haven’t tried retesting with more recent versions of Windows to see if it is fixed, maybe it has been.
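This isn't our original test code, but a minimal version of the same drift check is easy to sketch on Linux today with python-evdev (the event node below is a placeholder for whichever /dev/input/eventN is the mouse):

    # A minimal sketch of the drift check on Linux (not the original test code):
    # read raw relative mouse events with python-evdev and keep a running sum of
    # the deltas. If the device traces a closed square, the sums should return to
    # (0, 0) after each loop; steady growth means events are being lost.
    # Assumes python-evdev is installed and you have permission to read the node.
    import evdev
    from evdev import ecodes

    dev = evdev.InputDevice("/dev/input/event5")   # placeholder: pick your mouse node

    dx = dy = 0
    for event in dev.read_loop():
        if event.type == ecodes.EV_REL:
            if event.code == ecodes.REL_X:
                dx += event.value
            elif event.code == ecodes.REL_Y:
                dy += event.value
        elif event.type == ecodes.EV_SYN:
            print(f"accumulated delta: ({dx}, {dy})", end="\r")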

Anyway, I’m not disputing the OP’s claim, I can totally believe it, but I always thought it was funny that pro gamers on Windows with high-end mice could be losing the occasional movement and apparently nobody noticed that.

