This is the wrong way to see it. When a technology gets cheaper, people use more and more of it. If inference costs drop, you can throw far more reasoning tokens at a problem, or fan out across many agents, to increase accuracy, creativity, and so on.
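A minimal sketch of the "spend more inference for accuracy" idea: sample the same query many times and majority-vote over the answers (self-consistency). `call_model` here is a hypothetical stub standing in for a real LLM API call, so the sketch runs on its own.

```python
from collections import Counter

def call_model(prompt: str, seed: int) -> str:
    # Hypothetical stand-in for an LLM API call; a real system would
    # sample one answer from a model. Stubbed so the sketch is runnable.
    answers = ["42", "42", "41", "42", "43"]
    return answers[seed % len(answers)]

def majority_vote(prompt: str, n_samples: int) -> str:
    """Spend n_samples inference calls and keep the most common answer."""
    votes = Counter(call_model(prompt, seed=i) for i in range(n_samples))
    return votes.most_common(1)[0][0]

print(majority_vote("What is 6 * 7?", n_samples=5))  # -> 42
```

Each extra sample costs linearly more tokens, which is exactly why cheaper inference lets you buy accuracy this way.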
No company at the moment has enough money to operate with 10x the reasoning tokens of their competitors, because they're bottlenecked by GPU capacity (or other physical constraints). Maybe in lab experiments, but not in generally available products.
And I sense you would have to throw orders of magnitude more tokens to get meaningfully better results (though if anyone has run experiments where GPT-5-class models get good results from only marginally more tokens, please call me out).
Well, how many more dogs would you need to help you write your university thesis? It's a logical fallacy to assume that more tokens would somehow help. Even with cursory use you can see that once LLMs go off the road, they are pretty much lost, and the best thing you can do is give them clear context.
The Jungle Book - Walt Disney was involved with this one; they started animating in 1965.
The rest of the films you mentioned belong to the Disney Renaissance the grandparent comment mentioned, starting with The Little Mermaid in 1989, and it seems generally accepted that it lasted about ten years.
Not latencies. Think data privacy: keeping queries and data from leaving sovereign borders. If everything is local, then the datacenter and service are subject to local laws and regulations (and, conversely, you're not subject to foreign laws, regulations, and agencies).
That's not quite correct. The "sovereignty" pitch here is largely illusory when dealing with a US-based company like OpenAI.
The US CLOUD Act (Clarifying Lawful Overseas Use of Data Act) explicitly gives US authorities the power to compel US-based companies to provide data stored on servers, regardless of where those servers are physically located. This effectively undermines any meaningful data sovereignty claims.
Consider the actual arrangement being proposed:
- OpenAI (US company) maintains control of the infrastructure
- OpenAI controls the models and their development
- OpenAI maintains the security protocols and access rights
- The data merely sits physically within national borders
This isn't sovereignty - it's a limited hosting arrangement that remains fully under US legal jurisdiction. US intelligence agencies can still access this data through legal mechanisms that bypass the host country's laws entirely.
locality is good for resilience and latency but for privacy? how does it work?
How can one audit that the bytes going from a DC in country A to a DC in the US are not user queries but, say, telemetry data? Presumably you don't get to look at the unencrypted packets.
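This point can be made concrete with a toy sketch (all names here are illustrative, and the cipher is a deliberately insecure stand-in, not any real wire protocol): a passive auditor without the key observes only opaque bytes, sizes, and timing, so a "query" and a "telemetry" payload of the same length are indistinguishable on the wire.

```python
import hashlib
import secrets

def toy_xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher (NOT secure; illustration only): XOR the data
    # with a SHA-256-derived keystream. Applying it twice decrypts.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

key = secrets.token_bytes(32)
query = b"user query: summarize this contract"
telemetry = b"telemetry: heartbeat".ljust(len(query), b" ")

ct_query = toy_xor_stream(key, query)
ct_telemetry = toy_xor_stream(key, telemetry)

# Without the key, an auditor sees only equal-length opaque bytes:
print(len(ct_query) == len(ct_telemetry))  # True
```

Real TLS traffic adds padding and framing on top of this, which makes external content-auditing even less feasible; you'd have to trust endpoint-side attestation instead.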
OpenGL is quite dated for VR/AR. In the Apple ecosystem they supported OpenGL 4.1 for quite some time before moving to Metal, which was announced 2 years before Vulkan.
If you spent the time developing an in house graphics API since open standards weren’t moving forward, why would you rewrite everything a second time just a few years later? Shouldn’t you expect to get a decade or two out of your existing API and only do the massive rewrite when the benefits become more substantial?
Vulkan & OpenGL applications can translate to Metal with MoltenGL and MoltenVK, respectively.
Vulkan and DirectX are the favored graphics rendering technologies for VR.
Godot supports Vulkan rendering via OpenXR.
To get a vibe for Apple’s general posture in this regard it is worth noting that Vulkan rendering through OpenXR on macOS is technically possible via MoltenVK, but macOS does not have an official OpenXR runtime. You’d need to use third-party workarounds or wait for broader support.
> If you spent the time developing an in house graphics API since open standards weren’t moving forward, why would you rewrite everything a second time just a few years later? Shouldn’t you expect to get a decade or two out of your existing API and only do the massive rewrite when the benefits become more substantial?
I have a natural inclination to agree with this thinking, but I think it's important to recognize that this is the sunk cost fallacy at work[1].
In an ideal world, Apple would have just built DirectX and sold the Xbox too. But you can't look at it from an executive's perspective, you have to look at it from the developer's point-of-view. This insistence on high-investment, low-ROI APIs is why the Mac doesn't have games. If you run the Metal playbook with VR again, you will have developers outright abandon you. We've already seen what happens.
Apple's GPUs support a decent chunk of the Vulkan featureset; you can go boot it up on an M1 with Asahi. Same goes for OpenXR. These are things that Apple neglects because they want to use their customer base as leverage to market proprietary APIs. This hurts users, because Apple has neither industry-leading standards nor the leverage to force the industry to adapt. And they sure as hell lack the humility to just support both in the name of fair competition.
APIs are the last reason there aren't 'major' games on macOS. You've got architecture changes: the PPC-to-Intel transition was a big loss of game compatibility, and then the removal of x86-32 support from OS X nuked most of a user's Steam library.
And there's the chicken-and-egg problem of gamers just not being present in large enough numbers on macOS. The platform already has a fairly small marketshare in the overall PC space, and the number of gamers is vanishingly smaller; Steam stats put macOS at 1.58%, less than Linux.
APIs are the exact reason. Why can't you run Proton on macOS? WoW64 works. Rosetta and Wine work. Is there any technical limitation besides API support preventing the MacBook from working like a Steam Deck?
Proton relies on Linux system APIs not available on macOS, but the Game Porting Toolkit is available. I've been able to "play" Noita on my M2 Air (granted the perf sucks, but that's what I get for owning an Air). This discussion hasn't been centered on kernel APIs, but rather graphics APIs (D3D/Vulkan), if you're going for that "gotcha!".
Crossover is another option, though I have no need to pay for it as I own a Windows PC/consoles.
It’s more that devs can’t be arsed to write non-mobile games in anything but DirectX unless they’re being paid to (as the console vendors do). Vulkan support is quite rare in commercial games, it’s almost entirely DirectX or Sony/Nintendo’s things. If Apple somehow flipped a switch that turned on Vulkan support, almost nothing would change.
The single biggest thing Apple could do to bolster gaming on their platforms is to pay studios to do it, or to license DirectX from MS. Anything else will barely move the needle.
> If Apple somehow flipped a switch that turned on Vulkan support, almost nothing would change.
That's not entirely true. Whiskey being deprecated in favor of CodeWeavers was a headline story this week - something that wouldn't need to exist at all if Apple users could run upstream DXVK instead of GPTk.
> pay studios to do it or for Apple to license DirectX from MS.
That doesn't work either! Paying Eidos and Capcom and Hello Games did not start an avalanche of ports. Apple could license DirectX from Microsoft, but they could also just support Vulkan 1.2 and get perfect DX12 coverage through translation.
The bigger point is that the Metal-only route isn't working. We can argue over the merits of Vulkan until the cows come home, but the simple issue is that Metal doesn't get ports. Native APIs on Apple platforms just get ignored.
> The bigger point is that the Metal-only route isn't working.
For macOS, no. For iOS, yes, and that's where Apple makes almost all their revenue. Apple wants your primary target to be iOS. If you decide to do a macOS port, that's nice but not essential. Of course this doesn't work for AAA games, but that's a sacrifice they're happy to make.
I don’t think it’s quite there yet, but the iOS Assistive Access mode is a step in the right direction. Eventually phones will have to have an “old people” mode, right? I especially worry how my grandparents will comply with 2FA requirements, passkeys, etc.
I did a .ipsw restore of my M1 Mac mini to 15 Sequoia RC last week and have since gone in and lowered the Security Policy to install ZFS kexts. I wonder if your issue is a bug related to your multi-macOS boot setup? Have you posted this on MacRumors or elsewhere?