Their software is better than most (if not all) of the closed-source universe. That's true, but the problem is that they were better in the past.
I've been using both Linux and macOS for close to 20 years (Linux for even more than 20, IIRC), and macOS (aka Mac OS X) used to be snappier, more stable, more uniform, and had an incredibly low number of paper cuts around the UI. Now it has some nasty thorns here and there, while Linux is improving steadily and not regressing nearly as much as macOS.
Apple needs to overhaul their software stack. It could use a lot of sanding and polishing to bring the shine back. They need another "Snow Leopard" release, as many people say.
On the other hand, even with all these bells and whistles, they can't even get close to the composability of Linux systems. Doing so will also damage their bottom line, so they won't, and that's OK.
When Apple released its BSD-based OS X at the turn of the century, I was at Rice learning on Solaris machines, and had also started dual-booting Linux on my personal desktop. My first few years in the working world were spent on Dells running Windows, so by the time I bought my first laptop in 2006, I was excited to spend my dollars on an unusual-looking white MacBook, specifically because it had a *nix shell and the developer experience was vastly better to me than any machine I used at my day jobs.

I still prefer working on Macs because ever since, they have just worked, and Windows has gotten progressively worse (I know, because I have helped my parents with their Surface laptop). Unfortunately, Mac OS X has been less robust in the last several years, and I'd love to see them turn this around, both for the developer experience and for regular consumers.

I still like using Photos, but I don't use their cloud for those, and I've been amazed over the years at just how uninformative the Photos app on the Mac can be when it flakes out and I have to try a rain dance just to get it to sync with my iPhone. That's pretty abysmal for a company whose products used to just work, but I believe it comes from the top. Steve Jobs used to enforce quality, and I want to see that again!
Similar experience here; I started with the same G4 ("white") iBook. That was an amazing machine. Under the hood it was hard to distinguish many differences from the Linux/BSD of the time. The UI on top (OS X Tiger) was peerless -- I recall being very excited for the introduction of Spotlight. I'd say the decline came around 2012-2013 or so. The hardware was still great, but they were no longer updating the GNU stuff, and anti-features like SIP made it harder and harder to run the applications I want (gdb, for example). I gave up not long after they introduced the Touch Bar.
These days I'm happier (or at least content) without a Mac. My FW13+Linux setup may not be as nice as the latest MacBook, but it does exactly what I want, and if it doesn't, I have options.
Apple's current software is such a joke I almost regret ever having invested in the Mac ecosystem. I still run Mojave for its 32-bit app support, for (Apple's own) apps that have no contemporary equal.
Apple weathered the passing of Steve surprisingly well; however, the cracks still show. Apple's very best is exclusively reserved for those products/devices/software with Jobs' fingerprints on them.
I still run an original iPhone SE as well. The entire tech sphere has gone in such a poor direction, I've increasingly divested myself from tech. If it no longer works with my system, I simply stop using it. It's a happy ("insecure") place.
> I'd say the decline came around 2012-2013 or so.
I think it started slightly earlier: 10.7 Lion in 2011 introduced the new full-screen mode that was completely broken on multi-monitor setups, as though Apple entirely failed to test on or even anticipate what was at most a moderately "power user" hardware configuration. They've introduced lots of useless features over the years (e.g. Game Center), but that full-screen mode was the first time I recall OS X having such an in-your-face usability regression that was so obvious and avoidable.
10.7 also dropped Front Row, which was a disappointment to me, but is at least understandable in the context of Apple TV existing as a separate product they wanted to steer users toward. Losing Rosetta in 10.7 was also somewhat justifiable, and didn't hurt me much since my first Mac was an Intel machine and I didn't have much of a library of PPC-only applications.
I'm a Linux guy who doesn't really like Macs but has intermittently been required to use them. On the whole I have a grudging respect for Apple (their hardware is peerless), but seeing one screen turn to "brushed steel" when the app on the other was put into full screen mode kind of blew my mind because "UI is worse than Windows" was not, at the time, a failure mode I believed the company was capable of.
The problem is, in the age of the Internet, old operating systems decay. Even MacOS 10.13 is effectively unusable as a primary workstation, NOT because Apple has abandoned it, but because Firefox, Chrome and Homebrew have abandoned it. Yes there are alternatives, but my point stands.
SIP is an anti-feature for a certain class of users, but the right tradeoff for most consumers. At least you can disable it. And even as a developer, I leave it enabled.
It's really easy to fail to see this in the heat of things.
macOS has a feature where it puts an orange dot on the top right corner of your screen whenever your microphone is recording. That orange dot is normally part of the menu bar, and completely unobtrusive, but will still show up on top of full-screen windows (e.g. it'll show up on top of games if you're on Discord talking to friends), which is distracting as hell.
As horrendously annoying as that little dot is, what's the alternative? Either you have an uncircumventable marker saying you're being recorded, or you don't. Any way to turn that thing off that doesn't involve disabling SIP would be trivial to exploit by anybody who managed to plant malicious recording software in the first place.
More annoying is when you use something like SoundSource (a paid app which adds per-app volume control and input/output redirection to macOS... a feature that by all rights should be built into any reasonable OS): you get a permanent purple dot indicating a third-party tool is intercepting audio.
Again, I get it, but as a power user this kind of stuff is just infuriating.
It's also annoying that macOS doesn't already have at least basic per-app volume mixing.
So much pain in macOS is in areas like this, trying to hack basic features back into the anemic OS.
Apple's "OS" updates typically focus on end-user applications that I don't use and never intend to. Meanwhile the core of the OS, and even the desktop environment, feels stagnant compared to many Linux distros.
Actually, the date was 29 June 2007. This is when GPLv3 was released and Apple could no longer continue using the fruits of open-source labor without giving anything back. That's when Mac OS X's UNIX underpinnings began to ossify. Sure, Apple kept backporting important security fixes, but all of the GNU utilities that made UNIX on Mac OS X good were frozen in time.
I’m honestly unconvinced that the “or later” clause of the GPLv2 license is legally valid. Can anyone think of any example where contract terms get to be reinvented by a self-interested third party whenever they choose?
IIRC, the FSF generally insists on getting assigned the copyrights on all GNU software, so the FSF can re-license any new version of their software under any license they choose to, which is currently GPLv3.
Users/vendors can still choose GPLv2 for the older versions of GNU software, which I think is what Apple does for the few remaining GNU software they ship.
The whole experience you're having with the rain dance is because the cloud does just work. It's a vanishingly tiny percentage of people that don't use it.
I hear ya. I'm not in the target market. Surprising, I know, considering how many SaaS platforms I've launched which maintain photos and videos in the cloud!
Many other iPhone/MacBook users have been shocked that I don't turn on Messages on my Mac, due to a bad experience with sync in the first year that was possible, and I had a similar bad experience with Photos in iCloud early on. Maybe the sync is fast now, but my usage would put me in a higher iCloud tier than I'd like, and I still feel more at ease juggling many Photos libraries on external hard drives. I avoid Google Photos like the plague, and even though I trust Apple more (for now), I'd still rather not entrust my family's personal photos and videos to them.
As someone who has had the pain, if you're open to some prodding - one of your external hard drives might be broken right now. Don't risk it. Just pay a few bucks a month to avoid missing your memories. :) I don't think they've ever had a data loss.
It doesn't even have to be Apple. There's Backblaze and Arq and Tarsnap, amongst others. Encrypt the hell out of it and make sure there's a globally redundant copy of your files. If a thief or fire/tornado/earthquake/tsunami takes out your physical drives, where are you?
Always good advice! I did start backing up at a location other than my home many years ago, but it’s not in a cloud. The one time I tried Backblaze I wasn’t impressed but I recognize there are other good alternatives and certainly agree with your strong convictions!
> Maybe the sync is fast now, but my usage would put me in a higher iCloud tier than I'd like
You can use Messages on the Mac without storing messages in iCloud. iPhone, iPad and Mac can all send and receive the same account’s messages, effectively staying in sync without actually syncing them to iCloud’s servers.
The thing where Linux (and Android, and Windows at least circa 2023) blows Apple out of the water is in UI latency. The built-in animations on Apple's software are sometimes hundreds of times slower than on their competitors, in ways which can't be accounted for.
Improving interface response times is the single best thing Apple can do to improve their UX. I don't need an interface which throbs, wiggles, jiggles, shines, and refracts, I need an interface that's snappy and fast.
As far as I know, MacOS is the _only_ desktop OS with this problem. The only way to fix this problem on MacOS is to do everything inside a virtual machine running anything but MacOS.
> The built-in animations on Apple's software are sometimes hundreds of times slower than on their competitors, in ways which can't be accounted for.
You can turn down the animation times for most of this with "defaults write" commands, setting them to 0 or as small as you want.
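As a sketch, a few commonly cited keys look like this (these are community-documented, not officially supported settings, and behavior can vary between macOS versions, so treat them as assumptions to test on your own machine):

```shell
# Speed up or remove some common macOS animations.
# These keys are community-documented, not an official API; they may
# change or stop working between macOS releases.
defaults write NSGlobalDomain NSWindowResizeTime -float 0.001     # window resize animation
defaults write com.apple.dock expose-animation-duration -float 0  # Mission Control / Exposé
defaults write com.apple.dock autohide-time-modifier -float 0     # Dock auto-hide animation
killall Dock   # restart the Dock so the Dock-related changes take effect
```

You can undo any of these with `defaults delete <domain> <key>` to fall back to the system default.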
As others have noted, the "Reduce Motion" setting doesn't fix anything (and neither does Reduce Transparency).
These terminal commands don't fix the problem: there are still lengthy animations, e.g. when swapping desktops or opening folders. These are tasks I sometimes do multiple times per second on Linux.
Hilariously, this is what the Gnome 2 people would have called an "Unbreak Me" option, something they tried culturally to eliminate more than a decade and a half ago. With... not total success, I guess, but the resulting environment tends to have a very high level of "work and not suck by default" quality -- something that steadily evolving commercial software tends to struggle with maintaining.
> Enable the “Reduce motion” setting in System Settings.
> This is always the default answer to this question online, and I’m sick of it! It doesn’t even solve the problem, but rather replaces it with an equally useless fade-in animation.
You're missing the point here. The "old" Apple would never have tolerated a janky feature that inverts responsibility onto the user and behaves poorly out of the box. Back then it was either lightning fast, jank-free, and intuitive -- or it didn't ship.
But this eroded over time. Nowadays both macOS and iOS are bloated pieces of crap that reek of design by committee. A lot of people blame Alan Dye (and they are probably right to do so), but there are other factors too. With Steve and Jony gone, they need someone who cares to step in and assert control once more.
> Back then it was either lightning fast, jank-free, and intuitive -- or it didn't ship.
That's kinda rose tinted history. System 7 (1990s Mac OS) for example crashed and locked up a whole lot, in my experience. The UI was fantastic and had great consistency, and the developer docs were of a quality that would blow minds today. But the software was not as solid as all that.
Windows was the same or maybe worse at the time. BSOD was common and a nightly reboot was a good idea until NT/Win2000. Solaris and BSD would have months of uptime on similar hardware, so it was a software problem. PC OSes were just not there yet. Windows 2000, OSX, and Linux gradually fixed that.
That's all basic uptime. The UI design drift of MacOS is another story.
I have an ‘old’ model (iPhone 14 pro max) and text frequently misses characters due to the lag/input delay. It’s most pronounced when using safari for some reason.
In any case, it’s odd that hardware is multiples better yet it doesn’t always nail something as basic as typing
In my experience, iOS only misses keys during the time the keyboard is loading (which can be over a second- crazy!)
But I often have input lags where I will press several keys, and then a period of time (which can be multiple seconds) will pass before my taps are registered.
The 14 Pro Max launched less than four years ago, and should not be slower than an Android which launched a decade prior.
I think a big part of this in recent years is SwiftUI just not being fully cooked and Apple trying to shove it into a bunch of areas without enough attention to performance. Not sure how it is on iOS, but for example, the Settings app feels chuuuunky if you navigate through the panes with up and down arrow keys. I wasn't able to make a selectable list view that worked consistently and didn't feel like a regression compared to an equivalent AppKit view.
> I think a big part of this in recent years is SwiftUI just not being fully-cooked and Apple trying to shove it into a bunch of areas without enough attention to performance.
FWIW, SwiftUI got a huge performance boost for iOS/macOS 26+, and Instruments 26 has been nice for finding performance bottlenecks. You may find the SwiftUI performance auditor in a free/FOSS project of mine (https://charleswiltgen.github.io/Axiom/commands/ui-design/au...) helpful as well.
Why it took 4 years to get to near-UIKit levels of performance I couldn't say, but I've had a great experience working with it on an app that's 97% SwiftUI.
Hmm, I guess I couldn't have known about that since I don't have iOS and haven't upgraded to macOS 26 yet, but performance auditing did seem a bit opaque last I tried.
Any specific improvements you've seen on the mac side?
It's odd to see this comment, since I've always had the opposite experience (at least when comparing Windows and MacOS -- I haven't used desktop linux much in the past 20 years). On MacOS, when I click something, something happens, or at the very least starts to happen (and I get some visual indication). While in Windows I often click on something and get no indication that something happened or started happening, so I click again, and then suddenly perform the action twice. This most often happens when opening programs, but it happens in other places too sometimes.
I’ve found Mac OS to be snappier than any of the dozen or so Linux DEs I’ve tried. I use Fedora with XFCE and it’s ok in responsiveness, I’ve got PopOS on another machine. It’s good. But I’ve got MacOS on my other two machines and they just feel so much snappier. And the Macs are 6-7 years old. The other machines are newer (2/3yo).
In any case, have you tested on the same machine for the most apt comparison? Age may not be the best predictor of performance, when I/O and memory may be more predictive of snappiness than the latest CPU.
Input devices and monitors can make a difference as well.
For Windows, my last experience on a personal install was Windows 10 and that was yeeeaaars ago, so... Grain of salt :)
It's not the default, but IIRC Windows could be configured to have zero animations, and I found it to be quite responsive as such.
I'm not talking about the speed of opening programs, but more of the speed of every-second interactions: Unfolding a folder (or other interactions within a program with keyboard or mouse), alt-tabbing across windows, moving between desktops, etc. At least on Windows, I saw far fewer IO-blocking animations than I have on MacOS.
You're right about the "something starts to happen": Apple hides delays behind sigmoidal animations throughout much of their OS. For those who aren't aware of the trick, the delay hidden between the start of the animation and its tail just looks like an animation that started on the interaction.
Package management, too. I recently got a MacBook for work, but it’s sitting on my desk and I’m continuing to use my Lenovo. Managing software updates is much better on Linux. As is managing windows (via Niri in my case). macOS really feels like a downgrade.
I don’t disagree, I just moved back to Linux from macOS myself (Tahoe was the last drop for me).
But did you try Homebrew and its extensions? It works pretty well for managing both terminal and GUI apps, and has some useful extensions like Brewfile, MAS, etc. It's not perfect, but for single-user Macs with an up-to-date OS version, it works quite well.
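As a sketch of that workflow (the package names below are just examples, and the `mas` line assumes you've installed the mas CLI):

```shell
# Declare your software in a Brewfile, then install everything in one shot.
# Package names here are illustrative examples, not a recommendation.
cat > Brewfile <<'EOF'
brew "git"                    # CLI tool from Homebrew core
cask "firefox"                # GUI app via Homebrew Cask
mas "Keynote", id: 409183694  # Mac App Store app (requires `brew install mas`)
EOF

brew bundle install   # installs/upgrades everything listed in ./Brewfile
```

`brew bundle dump` goes the other way, generating a Brewfile from what's already installed, which makes it easy to reproduce a setup on a new Mac.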
For managing windows I agree Mac OS sucks. But the third party window managers I use for MacOS are better than any other first or third party window managers I’ve ever used. Windows has far better window management than any Linux distro’s default WM. (But it’s terrible in every other way)
Except on Linux you have to remember which of several different package managers each specific system uses. Do I use apt, apt-get, pacman, yum, dnf, Flatpaks, or build from source?
Homebrew on MacOS is miles ahead in terms of DX, in my experience. But yeah, I guess by default the "App Store" is meh.
There is no such thing as DX with any digital tool. It's just pain and suffering all the way down. The sooner you realize it and make peace with it, the better.
> Except on Linux you have to remember which of several different package managers each specific system uses. Do I use apt, apt-get, pacman, yum, dnf, Flatpaks, or build from source?
How often are you switching systems that you can't remember the package manager?
You could just alias your package manager to something more memorable if it's really a problem, but I feel like this argument only really applies to servers where you may be logging into a variety of different distributions every day.
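If it really is a problem, a small wrapper that detects whichever manager is present is a common trick (this is only a sketch; the list of manager names and their install subcommands here is illustrative, not exhaustive):

```shell
#!/bin/sh
# pkg_install: forward an install request to whichever package manager
# this system has. The candidate list is illustrative, not exhaustive.
pkg_install() {
  for pm in apt dnf pacman zypper; do
    if command -v "$pm" >/dev/null 2>&1; then
      case "$pm" in
        pacman) sudo pacman -S "$@" ;;   # pacman uses -S, not "install"
        *)      sudo "$pm" install "$@" ;;
      esac
      return
    fi
  done
  echo "no known package manager found" >&2
  return 1
}

# Usage: pkg_install htop
```

Aliasing (`alias pkg='sudo apt install'`) is simpler still if you only hop between two distros.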
Nix is not the same as NixOS, and in this case the distinction matters. It has to step carefully around Apple's updates. This further highlights that Apple lacks the same quality of package management as some Linux distros. Nixpkgs (on macOS), Ports, and Homebrew packages are toys compared to the EFFORT that goes into maintaining Debian and Red Hat packages.
In terms of package management SOFTWARE, however, nix (and guix, lix, etc.) are state of the art and work fairly similar in both linux and macos. A deeper integration with the OS would have been nice.
Package managers are wonderful until you step near or outside of the packaged software; then you'd better hope you're on a big distro, otherwise you may be in uncharted territory.
This is the biggest thing that irks me after coming from Windows. Everything feels so sluggish. I wonder why the internet isn't full of people complaining about that. I guess they just don't work fast enough to be bothered by it?
Why would you use that feature? MacOS doesn't REALLY have multiple desktops (Spaces). It is merely a pre-release feature (and has been for 10 years or so, I think), as evidenced by the many critical user-journey bugs it has that don't get addressed.
I use both Linux (with a decent tiling window manager; the tiling management being the least important part of it) and macOS. And certain things are just not possible to do with macOS. On Linux I can have 300+ open terminal windows AND can find the one I need when I need to. On macOS, 20 (counting Terminal tabs, which are implemented as windows underneath) is about the high mark before it gets annoying to work. On macOS, you can't effectively work on multiple projects that use the same software (editor + terminal, for example). You can work with different applications, though, and that is managed pretty well (better than in most Linux window managers that I have seen).
Every year or so I try adding a couple of Spaces, and always regret it a couple of hours later, switching back to a single Space (+ a few fullscreen apps).
I've used spaces since 2013, they work well enough. The animation bug is annoying though. On displays higher than 60Hz, the animation is slower because they made it frame-based instead of time-based, or something silly like that.
I love the three finger gesture to move between them though, it's like moving pieces of paper around. You can also work around the bug I mentioned by swiping faster, but yeah I wish they'd just fix it so we can move on.
Of course it can be used. But it is very buggy (as in missing or not-well-thought-out behaviors), which is unlike the typical polish Apple's human-interaction folks deliver. For example, switching between Spaces and then between apps and windows, and creating a new app window, don't work as expected in some combinations of steps and for some apps. There are several other "corner" cases that show the features were not laid out in a full design that exhaustively decided the desired behavior in each case. Which is very much what happens when someone bolts a feature onto a system without fully nailing down its interaction with all other adjacent and relevant features.
I'm just responding to your "Why would you use that feature?" question. I use it because I like it, and it works well for me. I'm not disagreeing that they have some bugs and design issues to work out. It seems pretty obvious MacOS doesn't get as much attention as iOS when it comes to these things.
Linux benefits long-term from the fragmentation that hurts it in the short term. Competing projects mean it is harder for software to go too far down the wrong road. Go too far and somebody emerges to replace you. And popular ideas emerge that others can copy.
With macOS, you really have no choice but to use what Apple offers. You can hope they listen to dissent, but they may not, depending on priorities. And things have to be bad enough for people to jump platforms before real dissent registers -- and things have to get pretty bad for that.
Same issue with Windows of course.
With GNOME, KDE, COSMIC, and the Linux rat pack, it is easy to switch experiences without ditching Linux entirely. And somebody has probably even patched your DE of choice to address the papercuts you do not like.
I've been using Apple since IIe in the 80s and all of the UI iterations. People make iOSification comments about macOS, and there have definitely been annoyances as they are seemingly trying to unify the UX. Maybe it'll make sense when they have touch controllable macOS systems, but making things that work well for fingertips and assuming they will work equally as well operating by a mouse is just bad.
As for Linux, I don't think I've ever used a system with UI for any serious amount of time. >99.999% of my usage is on headless systems through a terminal. As god intended.
A major reason Snow Leopard was well received was how performant it felt, along with the bug fixes. What isn't mentioned anywhere near as much is that it dropped a lot of hardware (PPC). The last G4 PowerBook got about 1.5y of OS support before it was dropped.
iOS 26 is slated to drop a bunch of iPhone models. macOS is dropping all Macs with Intel CPUs.
A Snow Leopard release isn't great news for a lot of people.
Aren't they? The last Intel Macs were being sold less than 3y ago, and by the time macOS 27 releases they'll be less than 3.5y old.
The broader point is that a "Snow Leopard" release has historically resulted in a lot of hardware being left behind, and many of the devices that could have benefited the most from optimizations were cut off.
Everything (or 90%) on the iPhone was taken from the Mac and put into the iPhone. Software developed for the PC environment, then stripped down and streamlined for the phone, is why the iPhone was the revolution that it was.
Apple, like you, can only think in terms of revenue and profit generated. "iPhone makes this much profit = iPhone gets this much development".
That thinking has led us into this stagnated crap, because it's a terrible way to do software. Worse, what's happening now is that Apple is taking its iThing software and trying to migrate it to the Mac. The Mac is now getting destroyed by iPhone development.
The smart move is further development of the Mac to explore ways to bring new features to the iPhone or future Apple devices. To simply go "this = profit = all development goes there" betrays a lack of wisdom.
True, it's better than most for sure, and I agree it used to be better.
Though a lot of other software for Windows and Linux is really not that great, so the bar is probably on the lower end.
What metrics or experiences lead you to that conclusion?
I've used basically all of the major operating systems for 30+ years and I cannot stand macOS. I use a Mac as one of my work devices, and off the top of my head:
* Basic things such as window management require third party tools to get things that are table stakes everywhere else. Even with third party tools doing anything with a "full screen" mode is not going to work the way you expect.
* You can't have separate scroll directions for your trackpad and your external mouse.
* External peripherals in general are a disaster. Every time I connect or disconnect from a docking station my windows are left in awkward positions sized larger than my screen and I need to drag them around
* macOS seems to store a different set of monitor orientations based on what USB port I connect my dock to - same dock, same monitors, 2 different layouts I had to configure independently. I don't even know how you could accomplish that if you wanted it - and absolutely no one wants that.
* Multiple monitors is constantly an afterthought, whether it's menus, the dock, layouts, what have you
* The Settings app is impossible to find anything in. You have to search, and that works OK sometimes, but the layout has no rhyme, reason, or comprehensible order
* Safari. Enough said.
I could keep going, but I absolutely do not associate Apple with quality software.
not the op but for me safari has been deteriorating for years:
1. ios performance in scrolling and loading (especially on my ipad pro m2) is unbearably slow, just stutters everywhere when loading a page in the background
2. tap-and-hold to open a link menu is so strange; sometimes it highlights text instead of showing the menu, sometimes it works ok, there is some kind of strange ui timing issue at work
3. on ipad and ios the tab overview display scrolling is absolutely appalling like 4fps level slow... completely unbearable to scroll through tab previews
4. developer tools are abysmal compared to chrome
5. on desktop performance is also extremely slow compared to chrome, its night and day
6. battery usage is so bad on ipad, just leaving some tabs open they run down the battery and chug memory (i know this is more a web thing, but they should at least freeze tabs in the background or make it an option)
7. just strange bugs on ipad, when tapping a text field the keyboard pops up, then suddenly disappears and the pops up again... just a terrible app
etc etc lots of paper cuts, but the performance issues are the biggest for me... i like the tab groups + auto save and icloud sync and built-in spellcheck, but its getting harder and harder to resist the alternatives
The Tab Overview issue is due to background tab reloads.
I could basically sum up your experience as: Safari is appalling at multi-tab resource management. And that has been the case for 14+ years and counting.
It wasn't until Safari 18 that most of the rendering issues were gone on sites I visit. In Safari 26 they're completely gone; I haven't encountered one since Safari 26.1.
With a lot of features done, I just hope Safari turns its attention to the performance and snappiness of the browser. Multiple tabs don't work well; for people who use 30+ tabs, that's when it starts getting slow. Safari used to have an option to unload background tabs, which usually fixed 80% of the problem, but it was taken out some years ago.
If you tap into the address bar and start typing your search, once you've typed enough for it to be specific, the autosuggest crap clears and "On This Page" appears. Wildly undiscoverable in practice.
3. Grievance 1 and 2 compound together. Whenever Safari (or a Safari update) breaks a feature, you cannot inform a user that they can use another browser as a work around (because all browser engines are forced to use Webkit on iOS/iPadOS)
4. Bad dev tools. This has been seeing much needed improvements (e.g. being able to type an entire word in the css pane instead of a new character on each line), but it still feels 7 years behind.
5. No way to report bugs. There is a "bug reporter" at bugs.webkit.org, however each bug is auto-tagged with a link to an internal bug tracker within Apple. This means that those who are trying to fix bugs and those who are trying to report bugs have a wall between them. There is no way to have a discussion to try to narrow down what the bug is, why the bug happens in one case but not another, what's really the cause of the issue, or why the bug matters more than whoever is assigned to it might realise. When reporting a bug to Apple, it's more useful to talk to an actual wall, because the wall might at least fall on you, giving you an actual response.
6. Performing incorrectly is more important than performing correctly. This one takes some explanation, and it's a little tricky. I'll give three examples, then show how they're all the same issue:
Example 1: The cool homepage.
I was working on a website. Two months before launch I decided to spend a month of time juicing up the homepage, then one more month on polish.
On the homepage I decided to use the brand's pre-existing graphics and turn them into a parallax animation (inspired by: https://www.firewatchgame.com/)
It had:
- Regular content on the page
- Parallax layers with simple vector graphics inside them so that when the user scrolled down the page, the user saw a parallax animation of the landscape changing. (e.g. far clouds, far mountains, close clouds, close mountains, hill, foreground, simple bubble particles closer than the regular content on the sides to strengthen the depth of field illusion, etc...)
- Other vector graphics following a motion path animation
- This was done in 2017, so before CSS got scroll driven animation support, or motion-path support. It was also done without JS.
- Everything worked brilliantly, until we discovered that one particular iPhone model rendered a blank white page.
- I lost the last month pulling the effect apart trying to diagnose the bug (and with Safari's buggy dev tools being no help, I had to do it in the dark). I was able to determine when the bug would trigger, and had to tear down my whole homepage and rebuild it with 2 fewer parallax layers, leaving only 3 days of polish for the rest of the website before launch.
(You can see the final result at https://myobrace.com, but I really would have liked the extra time for polish. If you're wondering how I achieved the effects without JS and/or scroll-driven animations: I used CSS's perspective and transform rules to position the elements back in the z-axis, then scaled them up so they appeared the correct size alongside the regular page content; as the page scrolled, the elements further back appeared to scroll at a different speed. I then used SMIL for the motion paths in the SVG elements.)
Example 2. The texture
I wanted to add a repeating texture to buttons so they didn't feel so flat without needing a separate network request to download an image.
I tried generating one with SVG but the SVG 1.1 filter effects implementations aren't all hardware accelerated.
I tried generating one with CSS which worked everywhere but Safari.
You can see a texture here where the texture is generated entirely within CSS, and it doesn't work in Safari (but I didn't hide the seams because the result looks like a cool mosaic and I wanted to share the technique): https://codepen.io/spartanatreyu/pen/Yzbmvbr
(if you're curious about the actual texture I used, I hand-drew a minimal noise texture in Photoshop that could be repeated without showing seams, then base64-encoded it and inlined it within the CSS file so it could be loaded without an extra network request. You can see my development version here: https://codepen.io/spartanatreyu/pen/YzoexGg?editors=1100 (the final version is locked behind a login wall in a child-friendly education webapp))
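The inlining step can be sketched like this (a hypothetical Node build-step helper; the selector and bytes are placeholders, not the real app's code):

```javascript
// Hypothetical sketch: turn a small seamless PNG's bytes into a CSS rule
// with a base64 data URI, so the texture ships inside the stylesheet and
// needs no separate network request.
function inlineTextureRule(selector, pngBytes) {
  const base64 = Buffer.from(pngBytes).toString("base64");
  return (
    `${selector} { ` +
    `background-image: url("data:image/png;base64,${base64}"); ` +
    `background-repeat: repeat; }`
  );
}

// Placeholder bytes (the PNG magic number), standing in for a real texture:
const rule = inlineTextureRule(".btn", new Uint8Array([0x89, 0x50, 0x4e, 0x47]));
console.log(rule); // .btn { background-image: url("data:image/png;base64,iVBORw=="); ... }
```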
Example 3. Asset downloading
I made a webapp for kiosk machines that downloads 100mb+ of video assets when logging in for the first time. iPads have a kiosk mode, so I supported Safari on iPadOS so that the iPads could be used as kiosk machines in commercial settings.
When the final assets were added, the iPad machines would randomly crash during the asset downloading process.
Fighting Safari's completely broken debugging experience, I eventually learned that as Safari downloads a video, as soon as it tries to put that downloaded data somewhere, it has to copy it across to the new place it's being stored, and if you're copying more than 3mb, it crashes the browser.
The fix was to download and store each video in 1mb chunks. This slowed down the installation speed by a bit over 300%, but at least Safari didn't crash any more.
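A minimal sketch of that workaround (the chunk size matches the fix described above; the storage side, e.g. IndexedDB, is omitted):

```javascript
// Split a downloaded asset into 1 MB pieces so no single copy into storage
// ever moves more than ~1 MB at once (sidestepping the >3 MB Safari crash).
const CHUNK_SIZE = 1024 * 1024; // 1 MB

function splitIntoChunks(buffer, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let offset = 0; offset < buffer.byteLength; offset += chunkSize) {
    // slice() copies at most chunkSize bytes at a time, never the whole asset
    chunks.push(buffer.slice(offset, offset + chunkSize));
  }
  return chunks;
}

// A 2.5 MB video becomes three chunks: 1 MB, 1 MB, 0.5 MB.
const chunks = splitIntoChunks(new ArrayBuffer(2.5 * CHUNK_SIZE));
console.log(chunks.map((c) => c.byteLength)); // [1048576, 1048576, 524288]
```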
---
Now back to: "Performing incorrectly is more important than performing correctly."
It turns out Safari on iOS/iPadOS has an invisible time/performance budget. Any time Safari hits that budget, the browser stops what it's doing.
- Drawing texture to screen? How about we stop drawing all textures to the screen, including text. Websites don't need to draw any text, right?
- Rendering a texture in CSS? How about you have the color white covering everything else instead.
- Downloading a video that's more than 3mb? How about I crash the browser when the download completes.
Compare this to Firefox and Chrome: as they run out of budget, they stop starting new work so the old work can finish before the next task begins. The page may take a few milliseconds longer to reach the correct result on slower devices, but the result IS correct.
Even worse:
- Safari has no way of informing the code how close it is to the budget.
- The budget can only be found by trial and error.
- If the iOS/iPadOS device has other apps in the background, the budget is smaller.
- Each device has a different budget, so you have to penalize all Safari devices down to the smallest budget of the oldest supported device.
- If you hit the budget on the most basic functionality (e.g. a homepage, a button, downloading required assets), then your website / webapp may as well not exist to those Apple users.
To be fair, Safari 18 (finally) improved a lot on what was reported in both State of CSS and State of JS. Safari 26 is even better, and it has gotten to the point where I believe that by 27 it will hopefully be a non-issue most of the time. As long as they continue to grind through everything for the next few years and don't stop or pull resources from the Safari team.
Agree on the time/performance budget. It is plain stupid, as has been the case for so many years. And yet nothing has been done about it.
Since Catalina (maybe since Yosemite), apple has gone down the path of iOSification of its desktop operating system; dumbing it down and trying to own all use cases. Any professional desktop users have long since been chased away, and the professionals apple cannot shake (video and music production) have been so shoehorned into a stupid, naïve vision of what their work should look like that it borders on a joke.
No serious computer user can use a Mac anymore, and this is an unfortunate departure from Steve Jobs' Mac where he expended great effort to ensure the Mac remained a serious desktop OS.
The most egregious example of this stupidity is the dumbing down of the Disk Utility app: an app rarely if ever used by normies, yet so dumbed down that the pros don't want to use it either. Really leaves you scratching your head about what the decision-making process there was.
Where Steve Jobs would draw lines in the sand and ask developers and users not to cross them, chairman cook put up NATO wire and basically forced users to do as told (Safari extensions got nuked, App Store apps don't load older versions of software and there's some weird exclusivity agreement, HFS+ support got dropped and apple refused updates to machines that didn't follow, etc. etc.)
The settings app being hot garbage is apple trying to unify their toy phone OS with the desktop OS.
Safari nuked 3rd party extensions so everything has to go through apple's extensions "store".
Apple treated its core base, the ones who saved Apple from collapse in the 90s, like expendable slaves. Worse, actually; apple actively chased them away like lepers.
This has led to a systemic core rot in apple's software and ecosystem, one that will take years to rectify.... if apple even chooses to do so.
> * You can't have separate scroll directions for your trackpad and your external mouse.
My latest macOS wtf: sometimes the terminal window shrinks by 1 or 2 columns every time I wake the computer up from sleep, but only when connected to an external monitor via a Thunderbolt USB-C hub. Terrifying to imagine what must be going on under the hood. By contrast, Linux/BSD desktops don't generally seem to pull this kind of weird mindfuck horror-movie shit. Either it works or it's completely, obviously, totally broken, not some weird subtle in-between thing.
- Let's hope they don't change the way macOS manages windows. All the additions they made to accommodate Windows users are useless.
- I don't have any issue searching macOS settings. Could you provide an example?
- safari is a great browser, i've been using it as my main browser for years and i'd never go back
I think you could keep going saying things that are not true.
> You can't have separate scroll directions for your trackpad and your external mouse.
The worst. There are even separate toggles in Settings for mouse and trackpad scrolling direction, but changing one changes the other. It is truly amazing that this has persisted for 15 years.
Opinions vary, but I've never found Apple software to be particularly good. Their hardware is almost always exceptional.
I'd go further and say I am constantly frustrated by how difficult their software can make basic tasks. I often find many of their UX patterns unintuitive, and some even feel user-hostile at times. Small example: I really want to view passwords as I type them in. I constantly mistype passwords on touch screens. User error maybe, but a frustrating experience.
Xcode is my least favourite IDE that I use regularly.
100% agree. As someone who has used both Mac and PC for 30+ years, and still uses both, Mac OS (and iOS) aren't very intuitive. Lots of hidden functions. The way they organize settings makes things tough to find. It's always a struggle.
My experience is similar. Great hardware. Software is good until there is something I want to do that isn't very obvious, then it's either a hassle or not possible.
My favourite example being looking for the volume mixer, and after looking online the top advice seemed to be to pay for a 3rd party application for that... Wtf?
There are so many basic gaps in functionality and so many underbaked & poorly designed Mac OS features that I end up papering over with paid 3rd party applications.
In order for that to actually be a money-making strategy for Apple, those third-party apps that address weaknesses in the OS would have to be sold through the Mac App Store so that Apple gets a cut. I've been a Mac user since before there was a Mac App Store, and I've never bought such a utility through the App Store. I have paid for several such apps over the years in ways that did not generate any direct revenue for Apple, and most of those apps likely could not be distributed through the App Store because of how they muck around with private APIs and other OS internals.
Those third-party apps do increase the overall appeal of Apple's platform, but suggesting that Apple might want to encourage that situation rather than improve their OS themselves sounds like a broken windows fallacy.
Xcode is one of the worst pieces of software in history. Imagine writing a code editor that couldn't keep its syntax highlighting from crashing for multiple years.
You must be a fetus. Apple was leagues ahead of everyone else from the inception of the Mac all the way through Windows 7...
Microsoft finally caught up around that time, but has since added a whole new dimension of enshittification, such that the only conclusion that can be reached about tech as a whole is that it all sucks and always will.
> Apple’s software is the best in the non-free software world compared to Google's or Microsoft's, IMO. But that doesn't mean it can't be better.
20+ years ago, software was so horrible that we were just tolerating it, and every new OS release was a big deal because there was hope things would get better! Today an OS release comes out and I have to be bothered by automatic "you must upgrade" messages to even care.
People forget how horrible it used to be, and if you still use Windows, how much worse it can be compared to Apple (and let's not get started on Linux).
I was using (and writing) software as long as 35+ years ago and I disagree with your assessment that we were “just tolerating it” 20 years ago. 20 years ago, I was using Mac OS X Tiger on a new Intel-based MacBook Pro and it ran like a dream, and had software which mostly followed Apple’s human interface guidelines. Now I run macOS Tahoe and curse under my breath at the lack of design consistency and the iPad-ification of the interface. I’m also shown ads, and in some cases ads that can’t be dismissed or disabled, for things like iCloud and Apple Music.
When it comes to the software, I’d take the Tiger experience over the Tahoe one hands-down.
I used 20+ years ago as a guideline, not an absolute. Of course the intel MBP came out in 2006 (or 2007?) and was an absolute dream setup where hardware caught up with Windows while the software was pretty good as well (I was using a Mac since 2004 or so).
I don't think software is improving today, which is why I have to be nagged to upgrade. I don't think it's worse, but my computer usage probably varies greatly from yours.
Maybe. I personally couldn’t afford to switch until 2004. And I grew up with PCs (well my first computer was an Osborne). Even then, it felt expensive and slow until the Intel switch.
Same here. Two decades ago, I was excited to install updates to commercial software I used because they fixed bugs and brought useful new features. These days I fear updates because they introduce new bugs, remove features I care about, and come with new anti-features that I actively do not want.
The macOS Tahoe release is a great example of this. I can't think of a single thing I prefer about it and could easily name ten things I hate about it.
> 20+ years ago, software was so horrible that we were just tolerating it,
Absolutely not, especially not on an Apple thread.
By example, the iPod released in 2001. Anyone who used those early models knows the user experience was competitive with the current experience. In 2006, I was using the version of iTunes of the day, which was probably objectively the best desktop music app ever created. There were features then that were just there, that were pioneered and are now absent, like an automatically sorted "least listened to" playlist that is now nearly impossible to find. Sync alone is still a headache the open-source community just handles on the side, and no one is even bothering to compete on it anymore.
Amarok was way better than iTunes in that era. Massively better UI, separation of playback queue from collection browsing, plugin ecosystem, better metadata fetching including lyrics support... And its dynamic playlists were way more capable too.
I had an iPod in those days and Apple's firmware updates that periodically broke third-party sync (while bringing no improvements) is the reason that to this day I've never bought Apple hardware for myself from Apple since that time. Used hardware only.
Every time I had to use iTunes was regrettable. The app was an insanely massive download for the time. It tried to install fucking Safari on Windows for no reason. The UI was somehow simultaneously a sprawling mess and feature-deprived.
Maybe there was a brief period where iTunes was genuinely an interesting app, but even by the mid-aughts, it had been totally surpassed by a number of open-source music players.
But Amarok at that time was only available on Linux. I assume most iTunes fans of the time never got to try it.
It's worst in terms of freedom, which is the most important aspect for me. Every release they are slowly turning the screws, making it harder and harder to install apps from developers who haven't jumped through all the hoops that Apple forces them through. I hope this change in leadership will change this strategy.
Google is worse. Most of their apps are cloud only with no E2EE. Also, they are much more user hostile when deciding what goes in the store (they make money off spying, but apple makes money off hw, so this makes sense).
Both those ecosystems are rapidly enshittifying (apple cannot even reliably process keystrokes with subsecond latency, and google is banning sideloading).
We need a third, actually user-serving and open alternative. Maybe the new CEO will slow or reverse the bleeding on the iOS / MacOS side.
Google has so far allowed installing apps without their explicit permission, so it's much higher on the freedom index, imo. And there's no obligation to use Google's cloud apps; there's an alternative for every one of them.
That’s good to know. Is there a list? Maybe a vocal community of computer literate people with money could loudly move to banks that do work (regardless of which phone they have).
That's like saying there's no freedom in USA because I didn't get a visa to visit. We are talking about the freedom of Google devices. And you are talking about banks not letting you install their apps on a non Google OS. Totally different things.
P.S. even if it is buggier than Windows, MacOS has a lot fewer bugs than our app! We do encounter bugs in Linux, but they are almost invariably fixed in up to date distros. Unfortunately we are forced to support old enterprise Linux distros.
Not only Safari; several other apps such as Music (which also has several annoying quirks).
I never understood why they did not get their own lifecycle if they have dedicated teams for each of those apps.
If you're interested, it's to reduce cost. It's incredibly expensive to build something like Music or Maps. If each version is tied to an OS version, it keeps you from having to explode your testing and fixing cycle over time.
This is especially notable when you want to support all the latest OS features.
My company keeps the testing cycle smaller by only adding new OS-dependent features to its mobile app when the minimum supported OS version gets incremented and a feature is supported in every supported OS version. That means that the iOS app is only now getting features that were added in iOS 15 in 2021.
It's a big deal when they stop updating. It is true they provide OS updates for longer than most, but many people use devices, especially iPads, for way longer than the supported OS period. And those people are stuck on an old unsupported browser without being able to update or install a 3rd-party one.
Chrome and Google being bad doesn't make Apple's restrictions good. That said, Android lets you install a 3rd party browser which can choose to keep supporting old devices. iOS locks everything to using the safari engine.
Unlike Android, indeed: when you maintain a perfectly working phone that happens (by accident or force of nature) to live longer than the official lifetime some executives in a remote office decided to grant it, the web browser cannot be updated any more. Just the single most security-sensitive piece of software on any computer. Who would have guessed people were going to complain!
And neither does Google. The latest version of Chrome requires the version of Android released in 2019. The latest version of iOS supports my iPad released in 2019.
Their legendary "goto fail" debacle, as well as the ease with which iOS has repeatedly been jailbroken, would disagree. I think geohot once quipped: "My lawyer could write a better malloc."
[1] Actually, the defect was that creating a root account was an unprivileged action, so anybody could create a root account on your machine with a password of their choice. The most obvious presentation was that you could log in as root by pressing enter twice with an empty password; the first time creating root with the empty password and the second time logging you in.
I think of it as BSD style, though of course it could be suggested/mandated elsewhere -
[...]Use a space after keywords (if, while, for, return, switch). No braces are used for control statements with zero or only a single statement unless that statement is more than a single line, in which case they are permitted.[0]
As far as I can tell, the GNU guide is less specific, but its examples[1] show the same style.
The good thing is that -Wmisleading-indentation [2] (comes along with -Wall) catches this indentation error.
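For illustration (transposed into JS, since the original `goto fail` bug was in C): without braces, only the first statement is guarded, no matter how the second is indented; this is exactly the shape `-Wmisleading-indentation` flags.

```javascript
// The `goto fail` shape: the second assignment looks guarded by the `if`,
// but runs unconditionally because there are no braces.
function check(ok) {
  let failed = false;
  if (!ok)
    failed = true;
    failed = true; // misleadingly indented: always executes
  return failed;
}

console.log(check(true)); // true, even though it reads as if it should be false
```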
CryptoKit isn't relevant to `goto fail`, which was the origin of this thread, given CryptoKit merely implements primitives and not TLS.
If you really are doubting what gets used for TLS, open up Console.app, start streaming, run `nscurl https://example.com/` (or load it in Safari, etc.), and you'll see logging like:
Most of the main apps on Apple TV shouldn't require a password anymore; you log in on your phone to authorize. The next Apple TV should simplify this further...
> Apple’s software is the best in the non-free software world compared to Google's or Microsoft's
But it's worst in the Apple software world compared to Apple's. In fairness, Microsoft has also been in steady tragic decline for a while. I don't know about Google.
I haven't really had to work with microsoft software but apple's software quality is abysmal beyond the OS (and even the OS has places that are a joke, like the bluetooth stack).
I'd rather use nano than have to write code in Xcode.
Apple’s software has a kind of reliable predictability that many appreciate.
But “best” is far too strong a word.
For starters, most if not all of their software can be described as simpler also-rans.
And in line with that approach, for a company that innovates in hardware, Apple does not apply that same effort to software.
With two exceptions in the last two decades. The iPhone and Apple Watch operating systems & interfaces were very creative efforts. Which genuinely matched the hardware innovation.
Vision's OS, on the other hand, basically iOS-ified hardware that deserved to be treated like the first device to be positioned above and beyond the Mac. The natural interface doesn't fall below the Mac's, like a touch screen does. It far exceeds it, given a keyboard and trackpad.
Instead, software wise, we get another media and toy kiosk.
I am stunned that Tim Cook didn't see the opportunity to leave his mark with a device that took the capability crown further than the Mac, instead of falling for the 3D-as-cute-feature un-vision.
Pro hardware. Toy software.
He has been a great CEO. But if he let Steve and his own legacy down anywhere, that is where.
That, and the predictable but mostly stalled vision of their software apps. And all the odd software glitches on all their devices that keep cropping up, which suggest poor underlying models to me.
Their underlying systems software is a high point. The hardware integration is a stand-out.
The huge strike-out they made with the Vision Pro still blows my mind. I'm in the camp of people who would have possibly shifted my entire working setup to that thing if they'd made just a few less dumb choices with it, and it might have been worth it even at the high price. I still occasionally waste my time checking out the latest to see if they've made any headway towards making it useful, because I'm still recovering from the shock that they haven't. The only way I can see the current state making any sense is if they just wanted to squeeze as much field usage data as possible from early adopters of an overpriced prototype, but that seems so far outside of how Apple normally positions its products that it's hard to believe.
> I'm in the camp of people who would have possibly shifted my entire working setup to that thing if they'd made just a few less dumb choices
That describes me too. I even did for a while. But it just made the incomprehensible lack of any software ambition more painful.
The software is the only reason the Vision isn't worth the price. A real Pro OS, paired with a Studio M5-Ultra, or with its own M5-Ultra, would be an amazing work environment.
(The only hardware they would need to upgrade for the latter, i.e. its own Ultra, would be making live-battery swapping convenient. Which they should have already done.)
It's uneven in my experience. OS-wise (Android, ChromeOS), I've had some big and frustrating problems. On the other hand, I really like some of their web apps (Drive, Docs).
For servers, yes, but use Safari for like 5 minutes versus Chrome and it’s clear the reverse is true for desktops, especially if you’re not running with 32+GB of RAM. Google Drive, Photos, etc. are not as good as Chrome.
This is not to say that Apple’s desktop software is great, only that the bar is a lot lower than it had to be when people had to be convinced to buy licenses.
Ah yes, the company that still can't get their gesture and back-swipe UX functioning properly 7 years after its introduction, even with Apple giving them 2 years to study it beforehand.
A decade to produce a non-functioning gesture bar / system. Such a titan among titans.
I still prefer macOS to desktop Linux or (yikes) Windows, but the margin has gotten smaller over the last several years. Unfortunately, that's less because Linux or Windows have gotten that much better, and more because macOS has stalled (and even gone backwards in some ways).
Maybe 20 years ago, today it's no better than anything else - well designed in some aspects, total trash in others. The stewards of xcode, spotlight and siri (among many other stinkers) are disqualified from the category of "best"
Android and Windows are better than iOS and macOS in many non-trivial ways. They have their own problems too, but as a user of all of them I don't prefer the Apple software. Apple's hardware, on the other hand, is clearly superior.
Android has a far better OTA update system than iOS. The notification system is much better and the default keyboard is better too. It supports multiple user profiles that you can switch between instantly, with their own separate apps and settings and home screens, a long requested feature for iPads that is inexplicably still absent on iOS.
Windows has a better desktop compositor and window manager than macOS. It supports Nvidia GPUs with CUDA. It also has WSL so you can use real package managers instead of homebrew.
"winget configure" is pretty great in Windows - you can store your personal .config file on GitHub and use it whenever you set up a new PC to install everything you want, uninstall all the cruft you don't want, and set all the Windows config you want via registry keys.
and people wonder why they have random regressions in updates. this is it. unit tests and other types of tests are a cornerstone of software stability and do the bulk of the job of preventing regressions
What do you mean? Most if not all Apple's software is not even the best in their own category, let alone "in the non-free software world compared to Google's or Microsoft's". If we look at only these three and leave other competitors, you want to tell us that Safari is better than Chrome (Edge is the same now), Pages is better than Docs and Word, Numbers is better than Sheets and Excel, Keynote is better than Slides (arguably) or PowerPoint, Mail is better than Gmail or Outlook, iCloud better than Google Drive or OneDrive (ok lol), Facetime better than Meet or Teams, Apple Maps better than Google Maps or Bing Maps, Siri better than Google Assistant or Copilot... ?
Outside the two... Final Cut better than Premiere Pro or Resolve or Avid, Logic Pro better than Pro Tools or Ableton or many others, Motion better than After Effects, Pixelmator better than anything from Adobe or Affinity..
Come on, my dude. Only thing I haven't mentioned is OS only because that's a religion and I don't fall into MacOS one.
Apple's hardware game is strong. Software isn't, never has been.
> Apple’s software is the best in the non-free software world compared to Google's or Microsoft's, IMO.
Apple makes Xcode, known for being perpetually broken and an ungodly mess of whatever design it once had. Isn't that enough proof to completely reject your claim?
In my opinion Android (especially the Google Pixel flavour) is vastly more intuitive and logical than i(Pad)OS these days. I almost need to consult a manual to change my wallpaper on iOS. Anything to do with file management or notifications is also just plain bad on iOS. The keyboard is bad. Background downloads don't work reliably. If I want to transfer photos from a computer onto an iPhone I need special software and then cannot delete those pictures on the phone itself. I can choose between 3 multitasking paradigms on iPad – terrible!
It has one feature I wish everyone else would copy: Miller columns. But even after NeXT used them 35+ years ago, they have remarkably little penetration into other OSes.
I use Pathfinder on MacOS, and it's generally a lot better than finder, but there are features I wish would carry over from other OSes. Windows file check boxes are incredibly useful
Miller columns are the biggest waste of screen space that's possible in an OS, and macOS chose them for Finder. And Apple has been dying on that hill for decades. It's one of the main reasons Finder is awful and Apple's design choices are a joke.
If you are reading Hacker News, for the most part you are out of touch with normal computer users, and that was said over and over again with the introduction of the Mac Neo, which appears to be a hit among normal everyday computer users who have never heard of Hacker News. A family member recently bought one of the new Mac M5 PowerBooks and I expected some cry for help setting it up. Guess what: there was none.
In answer to your question: there is nothing better overall, across hardware and software, top to bottom, and that applies to computers, smartphones, tablets, and watches across five ecosystems.
I find it hard to believe this comment isn't sarcastic. Apple's software, at least macOS in particular, is horrendous, to the point that I ditched my M2 MacBook for a ThinkPad because of how bad it was. It's like a toy OS.
I was a kid with unrestricted, unsupervised internet access, and it definitely affected many things in my life. If I happen to have a child in the future, they won't go through that.
The Brazilian government passed a law requiring age verification for every site categorized as 16+. It can't be self-declared, so companies usually resort to facial scans and ID verification. I DO NOT want photos of our Brazilian children going to foreign agents who are PROVEN to profit from and do God-knows-what with our biometric data. And the funniest part? The same law says 'regulation shall not, under any circumstances, authorize or result in the implementation of mass surveillance mechanisms,' but also mandates that these measures must be 'AUDITABLE.' In other words, someone needs access to that data. It’s all so stupid and incoherent.
People who are less tech-literate FIERCELY support the measure, and whenever someone opposes it, they claim that person supports digital child abuse...
Anyway... the responsibility of protection should come from the parents, not from companies that profit off your biometric data.
I guess the opposite case might not be as interesting to many, but I achieved basically unfiltered internet access as a child, and it has been immensely helpful for me as a person. Everything I am today -- a programmer, technically literate, a founder of a startup with momentum, I am because I had freedom and autonomy as a child (which was not granted to me, rather achieved by me). Many of the people of my age who grew up with strict controls and supervisory parents seem kind of lost and uninformed to me, now that they are turning into adults. I feel this narrative is surprisingly rarely heard on HN, but I cannot be the only one?
I think the same for me, I’m pretty sure I wouldn’t be in my career if I had been restricted to an hour a day on a filtered iPad.
But I also think the internet has more potential for harm now. Widespread social media makes it easy for predators. YouTube actively incentivises content creators to produce brain numbing shit instead of the more amateur and educational content I was exposed to. Instagram creates vicious dopamine hooks that children have no mental defense against.
Also sorry to sound egotistical but I think I was an outlier that drifted into doing educational things, many or most kids will spend every moment they get just playing video games.
That being said, I’m in favour of parents doing the parenting, not the government.
> Also sorry to sound egotistical but I think I was an outlier that drifted into doing educational things, many or most kids will spend every moment they get just playing video games.
I am in the same predicament as both of you, having grown up with unfiltered internet access, and not wanting it to have gone any other way (I love my life, actually!)
There is a condescending tendency when people hear what I said above, to tell me that I am an outlier, or, God forbid, a "genius", and other equally worrying conclusions regarding my character.
I agree that, today, there are millions more ways that children can fall for objectively negative things, that have been completely, and intentfully engineered to be terrible in a way which can be exploited for profit.
But also, I simply think that, with enough access to mind-numbing content, for long enough... people will simply realize that, actually, they don't want that. At least, not just that.
Adults are not a good comparison group in the matter of less aggressive addictions, like social media, because they already have lives they want to escape, with responsibilities and whatnot.
These are not scientifically sourced claims, but, in my experience, children have a lot more time, energy, curiosity, and will/intent to create, for one reason or another, and they have been doing those things since time immemorial.
This is just a consequence of having access to ~the entirety of all human knowledge at their fingertips, with no restrictions, and with an incredible amount of free time at their disposal.
I think the HN crowd is full of outliers. You folks are unrestricted internet success stories. Congrats! For every one of you there has to be 100 or 1000 gaming and social media addicts.
That being said, I’m in favour of parents doing the parenting, not the government.
This aspect of parenting is really hard. If your kid is 10 years old and all their classmates have Roblox, saying 'no' to your kid does isolate them socially, because all the other kids are talking about what they did in Roblox at school and play Roblox together after school. To make it worse, some primary schools even allow kids to play Roblox at school during breaks or the teachers make TikTok videos, making kids want to have Tik Tok as well (TikTok-teachers are a real phenomenon), etc. So, even when you are trying, it gets undermined by others. Trying to fight it is kind of pointless, because most other parents don't see the issue.
Same for e.g. instant messaging; it is basically a Sophie's choice: you let them into these addiction machines or you isolate them socially. It would be much easier if social media and certain types of addictive games were simply not allowed under 16, just like we don't sell cigarettes or alcohol to kids.
I also completely agree with the counterpoint that age verification on the internet is generally bad.
Luckily, some things can be done without grave privacy violations. E.g., where high schools 10-15 years ago would boast about being iPad or laptop schools, more and more are now completely banning smartphones and laptops during school time.
At any rate, it's perfectly possible to hold both views at the same time: social media and addictive games should be forbidden under 16 and the age verification initiatives are terrible for privacy.
Maybe we should just ban Facebook, TikTok, etc.: no more addiction, no more age verification needed :).
Yeah you have a good point. I don't have kids so I didn't really think about this social pressure aspect.
I think if a perfect system existed that could gate websites behind age verification, without any privacy compromise and assure the user of this, I would support it. There are zero-knowledge proofs of course, but they're a black box, and the user still has to trust that the system has been implemented correctly. Unless mandated by law, companies have no incentive to build a perfectly private age verification system.
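To make the zero-knowledge idea concrete, here is a toy sketch of the core mechanism: a non-interactive Schnorr proof (via the Fiat-Shamir transform), where a prover convinces a verifier it knows a secret without revealing it. A real age-verification scheme would prove a statement like "holder of this signed credential is over 16" rather than simple knowledge of a secret, and would use vetted parameters and libraries; the modulus, generator, and overall framing below are illustrative assumptions only, not a secure implementation. It does show why "the user still has to trust the implementation": the math checks out only if the code faithfully follows the protocol.

```python
# Toy non-interactive Schnorr proof of knowledge (Fiat-Shamir heuristic).
# The prover shows it knows x with y = g^x mod p without revealing x.
# Parameters are for illustration only -- NOT cryptographically secure.
import hashlib
import secrets

p = 2**127 - 1   # a Mersenne prime, used here as a toy modulus
g = 3            # assumed group element (not validated here)
q = p - 1        # exponent modulus (group order divides p - 1)

def _challenge(y: int, t: int) -> int:
    # Fiat-Shamir: derive the challenge from a hash instead of a live verifier
    h = hashlib.sha256(f"{g}|{y}|{t}".encode()).digest()
    return int.from_bytes(h, "big") % q

def prove(x: int):
    """Produce (y, t, s): public key, commitment, response."""
    y = pow(g, x, p)
    r = secrets.randbelow(q)        # fresh random nonce
    t = pow(g, r, p)                # commitment to the nonce
    c = _challenge(y, t)
    s = (r + c * x) % q             # response binds nonce, challenge, secret
    return y, t, s

def verify(y: int, t: int, s: int) -> bool:
    # Accept iff g^s == t * y^c (mod p), which holds exactly when
    # s = r + c*x, i.e. the prover knew x -- yet x never leaves the prover.
    c = _challenge(y, t)
    return pow(g, s, p) == (t * pow(y, c, p)) % p

secret = secrets.randbelow(q)
y, t, s = prove(secret)
assert verify(y, t, s)          # honest proof accepted
assert not verify(y, t, s + 1)  # tampered response rejected
```

The verifier learns nothing about `secret` itself, only that the prover has it, which is the property an ideal age-check would rely on. The catch, as said above, is that a user cannot tell from the outside whether a deployed system actually implements this honestly.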
As someone who grew up without TV, I would say that it's fine to be a little bit isolated socially. You learn to develop real social skills and the time wasted playing Roblox can be better invested anyway.
I also had the same experience (not just with the Internet - I had unfiltered access to basically any and all reading materials), and I felt that on the whole it was a massively positive experience for me. I feel really sad for all the children today who mostly grow up in much more closely controlled environments. I understand why parents do that, but I'm also not at all convinced that most parents actually know what is good for their kids - they just believe that they do.
I am happy that I grew up in simpler times. I have to thank Linux Developer Resource CD-ROM sets, FreeBSD CD-ROM sets, etc. for making me a Unix fan, a programmer, and technically literate. We lived in a small rural town in the north of The Netherlands, and the only way to access the internet was 25ct-per-minute dial-up, to which my parents said "no".
So instead every time I got a new Linux or FreeBSD CD-ROM set, I would go through all the documentation and try everything out, and read source code. I got Pascal and C books through the local library, where you had to order the book and usually wait two or three weeks.
But I also didn't have the omnipresent cameras (you could still do dumb stuff as a kid and not get filmed/photographed). No pressure to show a fake version of yourself on social media. No pressure to be always available through instant messaging.
I feel like it was the best time to be a kid. Access to information was relatively easy (albeit slower than on the internet), but without all the terrible downsides for kids. Without all the dopamine shots and highly addictive social media and games. Without the ever-present tracking of your every move.
Though even the kids slightly after me probably still had a good time. Early 2000s, Internet access became more ubiquitous, but it still took almost 10 years for the worst of addictive websites, etc. to rise. I sure miss the early web.
I don't know how old you are, but as a millennial, the internet we had during our teens was really different from what we have today. There were no real social networks, no (or few) addictive patterns; conversations happened on Skype or MSN and on PHP forums. Even newer platforms like Reddit were very different from what they are now.
Unfortunately, the internet evolved to become much more predatory and addictive, with platforms like Meta running world-scale operations that they know lead to addiction, depression, scams, and sexual harassment.
I honestly would like to give my children the same experience of the internet as the one I had. Unfortunately I fear that it may not be possible anymore. That's not to say that we should run a surveillance experiment with everyone connected to the web.
I'm roughly the same as you in terms of information access, though whether I was a child is debatable; I was 14 when I got my first dialup connection. My family wasn't tech-adjacent, so it was me who pushed for it; the only control in place was the amount of time I'd spend there.
The only control I have in place on my son in terms of content is whether something is scary or if he won't be able to understand most of it, because arguably he's still too young for many things.
But once he's 12 I don't think I want to restrict most things in terms of content, and by 16 I personally don't care if he watches hardcore midget porn, as long as I have the chance to contextualise and explain the industry.
But.
What I'd rather control (or even ban) is all the ML-driven doomscrolling platforms and the "social media" that has made people anything but social. The Internet you and I grew up with no longer exists (or survives only as a small hidden fraction of it), and what remains is a wasteland of engagement traps and corporate-revenue-driven dark patterns.
You and I learnt to separate the wheat from the chaff, to research, to deep-dive, and whatnot. The internet is now, by and large, instant-gratification loops and user tracking. I don't want my son (or myself, actually) pulled into that. Porn is literally healthier: you bust a nut and go on with your day, but I see some people wasting hours on end, reel after reel, with increasingly targeted ads shoved in their faces. Hard pass on that.
Age control, if any, should lie in the hands of the parent/guardian. Make it by law a setting on the routers (new devices are <18 until admin approves them), or the ISPs for mobiles. I'm okay with that. Absolutely not on random third parties handling personal information filling the gap for every random website.
All of that leaving aside the fact that zero knowledge proofs solve this problem without sharing any sensitive information.
But of course, the corporations benefiting from this are not interested in pushing those, IMO reasonable, age controls.
I mean... access to adult content at that age is really, really bad. It really messed up my brain. Gore videos, chatting with adults, etc. But I learned many good things, too. It's a double-edged sword.
Seeing people squish at a young age - and I am not being flippant here - helped reduce my teen "I'm immortal! I'm unstoppable!" phase.
I saw very quickly that what separates a live person from a very deceased flat person is a moment of silliness/forgetfulness/stupidity. "I didn't SUSPECT that was even possible to happen to a person!" - "We're... fragile?!" - "Ah, bike helmets... I think they're a REALLY GOOD idea..."
PSAs just aren't listened to by teenagers. But something that's real - that happened, with the security camera timestamp in the corner... kids learn safety.
> helped reduce my teen "I'm immortal! I'm unstoppable!" phase.
I mean, is that good?
Isn’t another way of looking at that to say that it poisoned an innocent time and left you aware and afraid of death when you might otherwise have been enjoying the end of your childhood without that burden?
In general parents might want their kids to be a little more mindful, but not grow up too soon.
I don't see how this "child protection" enforcement would help in the case of small, obscure websites with porn and gore. No way their admins are going to comply. I doubt ISPs would go so far as to DNS-whitelist compliant websites only.
The admins of sites like that DGAF about anything or anyone. They enjoy the chaos and shock.
If you expect admins of edgelord websites to respect the laws of different countries or even care about kids, I suggest checking out 4Chan’s response to various attempts to regulate them.
I never said this would help... in fact, I’m against this kind of measure, at least the way it’s being done. But I wouldn’t be surprised if Brazilian ISPs are forced to block this sort of thing (just look at what happened with Twitter (X) the year before last).
For me, it didn't mess up my brain at all, it showed me a much broader range of what humanity really is, which is exactly what I wanted to understand at that time. I understood the depravity humans will exact upon others, or those they see as lesser (such as the treatment of animals, or prisoners, "the enemy" whoever/whatever that may be). I also saw unfiltered sharing of valuable knowledge, science, tech stuff, software, games, music, culture...
The uncensored internet taught me more than I could ever have been taught in school, and I'll be forever grateful for that. It didn't take me long to understand that I could generally hate no ethnicity or people or country, and the people who do are manipulated by their government or other powerful figures in their life (or disproportionately swayed by experiences in their life). Humans are pretty much all the same, we all have far far more in common than we do differences. I have a stronger perspective of this than my immediate ancestors (demonstrated over and over throughout my life) and I do credit my exposure to the open internet for a huge amount of that.
There is one huge and problematic difference now, though: the uncensored internet of the 90's is nothing like the disinformation-saturated internet of today.
> I was a kid with unrestricted, unsupervised internet access, and it definitely affected many things in my life. If I happen to have a child in the future, they won't go through that.
I've heard this a few times, but what was so bad? And, sorry to break it to you, reality has some bad bits to it - do you think being ignorant of these is useful, or does it just set you up for a bigger fall?
Why do you think removing independence (nannying) from another human being is the answer? Would you want to be nannied forever, by corporations and governments?
To me the question is more who is going to nanny me, and ideally it's myself (the mature option), but in my experience, starting as a child and going into adulthood, mental health can break this down to where people can't nanny/take care of themselves. In that case, the question at hand is: who is going to protect you from yourself? The state? Your family? Your friends?
Oftentimes the answer is "nobody". There's just nobody you can rely on to get the level of care you require. There are lots of arguments like Bowling Alone for how the breakdown of community has contributed to this separate issue.
In my view, by constructing and supporting legislation like this, people are implicitly admitting that parents, teachers, schools, communities, and all the rest are failing at their job of keeping moderation local and raising the next generation.
But the thing is, unfortunately this is a true statement in too many cases, including mine. My parents failed to parent me well enough, and my counselors were either instrumental in my own trauma or failed to address my issues soon enough, and as such I developed a sex addiction in adolescence fueled by persistent ongoing stress from my upbringing that I continue to seek treatment for to this day. Could content moderation laws have cured my parents' narcissism? Nope. Could they have prevented me from needing to act out to relieve the stress of my early relational trauma? Nope. Could they have helped match me with more competent therapists? Nope.
Could they have caused me to go to rehab for alcohol abuse instead of porn? Maybe.
For all his statements I disagree with, I subscribe to Gabor Mate's view that traumatized individuals are compelled to be addicted to something. At that point, there are a lot of things to become addicted to other than the ones you can content-moderate, given the (false) assumption that it's possible to moderate enough of it.
Pornography was necessary but not sufficient for me to have it that bad coming out of childhood. Early exposure to it was only incidental. My upbringing was far more significant a cause in this. But unlike which websites I was allowed to visit as a child, a 100% chance of having emotionally involved parents isn't something you can legislate into existence.
What I feel isn't being talked about enough in this discussion is that this implicit realization that the world just sucks sometimes leads to the justification that someone else needs to step in to protect children's fragile minds if the formerly trusted institutions won't. The big option left is the platforms and systems hosting the tech themselves, so they're targeted instead.
My opinion? If your parents aren't able to raise you to be free of significant trauma spawning "hungry ghosts" that you will need to turn to your unfettered internet access to feed, whether TikTok or LiveLeak or elsewhere, lest you are bombarded by stress every waking moment... then the situation was hopeless to begin with. You can't fix that problem with laws. You should have just had better parents, as awful as that sounds. And because of nothing more than bad luck, you're just going to have to unpack that problem with the healthcare system for years/decades, because there's not much else we know of that can meaningfully address childhood trauma that severe.
However, I don't think the medical establishment will necessarily help. Or looking outside generally - this will probably only compound or defer the problem. You will have to deal with it yourself in the end. I believe everyone already has all they need in themselves to do this.
Making parents control devices is too much. People do what's "normal", and right now "normal" is to give unrestricted access to kids when they're 10 or 11.
It takes incredible conviction and force of will to keep your kids off the phone till they’re 16. Fewer than 1% of parents manage it. The problem is that the teenager wants a thing that everyone else has and it’s hard to keep saying no.
I think internet connected smartphones should be illegal for kids under 16 to own or use. It’s a tough sell tho.
Great. I started developing in the Docker era, and while I can see some flaws, it is one of the easiest, most reliable tools I constantly use. I can't imagine how people dealt with those problems before Docker.