I'm an engineer on the HD Mapping Team at Zoox (autonomous vehicles) and we are looking for a mid-career full-stack engineer with a flair for automation and dev tooling. The role is about helping us scale our map production and management systems. Geospatial data and 3D visualization (deck.gl) experience are nice-to-haves but by no means required.
If the role sounds like a good fit and you would like to talk to someone directly you can email me at carl chatfield gmail. (Please fill in the gaps).
Usual perks apply, but please read the linked job description for details.
I've been meaning to set up a bi-weekly dinner for hacker types who live mid peninsula, specifically near San Mateo. I have a group of 4 or so in mind and have a good place to host, but would like a slightly larger group.
If anyone would be interested in helping to get something stood up, send electronic post to carl chatfield snail (mail run by g)
I consider it the most beautiful piece of code I've ever written and perhaps my one minor contribution to human knowledge. It uses a method I invented, is just a few lines, and converges in very few iterations.
People used to reach out to me all the time with uses they had found for it, it was cited in a PhD and apparently lives in some collision plugin for unity. Haven't heard from anyone in a long time.
It's also my test question for LLMs, and I've yet to see my solution regurgitated. Instead they generate some variant of Newton's method; ChatGPT 5.2 gave me an LM implementation and acknowledged that Newton's method is unstable (it is, which is why I went down the rabbit hole in the first place.)
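For the curious, the generic Newton variant they produce looks roughly like this (a sketch of the textbook approach, not my method, which isn't reproduced here):

  import math

  # Textbook parametric Newton for the closest point on the ellipse
  # (a*cos t, b*sin t) to (px, py). Minimizes the half squared distance
  # D(t) = ((a*cos t - px)^2 + (b*sin t - py)^2) / 2.
  def closest_point_newton(a, b, px, py, iters=20):
      t = math.atan2(a * py, b * px)  # common starting guess
      for _ in range(iters):
          c, s = math.cos(t), math.sin(t)
          dx, dy = a * c - px, b * s - py
          d1 = -a * s * dx + b * c * dy                 # D'(t)
          d2 = a*a*s*s + b*b*c*c - a*c*dx - b*s*dy      # D''(t)
          t -= d1 / d2  # overshoots when D'' is small or negative
      return a * math.cos(t), b * math.sin(t)

The instability shows up in that last division: for eccentric ellipses and unlucky points, D''(t) can shrink or go negative and the step flies off.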
Today I don't know where I would publish such a gem. It's not something I'd bother writing up in a paper, and SO was the obvious place where people who wanted an answer to this question would look. Now there is no central repository, instead everyone individually summons the ghosts of those passed in loneliness.
The various admonitions to publish to a personal blog, while encouraging, don't really get at 0xfaded's request, which I'd summarize as follows:
With no one asking these technical questions publicly: where, how, and on what public platform will technical people find the problems that need solving so they can exercise their creativity for the benefit of all?
Clearly we need something in between the fauxpen-access of journals and the wild west of the blogosphere, probably. Why wouldn't the faded ox publish in a paper? Idk, but I guess we need something similar to those circulars that British Royal Society members used to send to each other... except not reserved for a club. The web should be a natural at this. But it's either centralized -> monetized -> corrupted, or decentralized -> unindexed/niche -> forgotten fringe. What can come between?
I wonder if there could be something like a Wikipedia for programming. A bit like what the book Design Patterns did in 1994, collecting everyone's useful solutions, but on a much larger scale. Everyone shares the best strategies and algorithms for everything, and updates them when new ones come about, and we finally stop reinventing the wheel for every new project.
To some extent that was Stack Overflow, and it's also GitHub, and now it's also LLMs, but not quite.
May I suggest "PASTE": Patterns, Algorithms, Solutions, Techniques, and Examples. "Just copy PASTE", they'll say.
Ward Cunningham once explained, of all places in a GitHub issue [0], how the original C2 Wiki was seeded.
> Perhaps I should explain why wiki worked.
> I wrote a program in a weekend and then spent two hours a day for the next five years curating the content it held. For another five years a collection of people did the same work with love for what was there. But that was the end. A third cohort of curators did not appear. Content suffered.
A heroic amount of effort from a single person, and later the collective effort of a small group, worked in the mid-'90s. I'm skeptical that it would be repeatable 30 years later. Despite this, it's the type of place I'd like to visit on the web. :(
Yup, that was always very much the plan, from the earliest days. Shame it soured a bit, but since the content is all freely reusable, maybe something can be built atop the ashes?
This is _not_ at all the same thing. Grok just ripped off Wikipedia as its base and then applied a biased spin to it. Check out the entry on Grok owner Elon Musk; it praises his accomplishments and completely omits or downplays most of his better-known controversies.
Yes exactly! It would need some publicity of some kind to get started but it's the best solution, certainly? And all of the tools and infrastructure already exist.
> Clearly we need something in between the fauxpen-access of journals and the wild west of the blogosphere, probably.
I think GP's min-distance solution would work well as an arxiv paper that is never submitted for publication.
A curated list of never-published papers, with comments by users, makes sense in this context. Not sure that arxiv itself is a good place, but something close to it in design, with user comments and response-papers could be workable.
Something like RFC, but with rich content (not plain-text) and focused on things like GP published (code techniques, tricks, etc).
Could even call it "circulars on computer programming" or "circulars on software engineering", etc.
PS. I ran an experiment some time back, putting something on arxiv instead of github, and had to field a few comments about "this is not novel enough to be a paper" and my responses were "this is not a publishable paper, and I don't intend to submit it anywhere". IOW, this is not a new or unique problem.
You can (and always were encouraged to) ask your own questions, too.
And there are more sites like this (see e.g. https://codidact.com — fd: moderator of the Software section). Just because something loses popularity isn't a reason to stop doing it.
StackOverflow is famously obnoxious about questions badly asked, badly categorized, duplicated…
It’s actually a topic on which StackOverflow would benefit from AI A LOT.
Imagine StackOverflow rebrands itself as the place where you can ask the LLM and it benefits the world, with the LLM correctly rephrasing the question behind the scenes and creating public records for it.
The company tried this. It fell through immediately. So they went away, and came back with a much improved version. It also fell through immediately. Turns out, this idea is just bad: LLMs can't rephrase questions accurately, when those questions are novel, which is precisely the case that Stack Overflow needs.
This is an excellent piece of information that I didn't have. If the company with the most data can't succeed, then it seems like a really hard problem. As a side benefit, they can now understand why humans couldn't do it either.
Seriously where will we get this info anymore? I’ve depended on it for decades. No matter how obscure, I could always find a community that was talking about something I needed solved. I feel like that’s getting harder and harder every year. The balkanization of the Internet + garbage AI slop blogs overwhelming the clearly declining Google is a huge problem.
And discord is a terrible tool for knowledge collection imo. Their search is ok, but then I find myself digging through long and disjointed message threads, if replies/threading are even used at all by the participants.
When I grew up (*shakes fist at clouds*) I had a half dozen totally independent forums/sites to pull on for any interest or hobby no matter how obscure. I want it back!
It's true though, and the information was so deep and specific. Plus the communities were so legitimate; you could count on certain people appearing in threads, and wait for their input. Now the best you have are subreddits or janky Facebook groups.
Agreed, it’s the discoverability that’s the real problem here at the end of it all. All the veterans are pulling up the drawbridges to protect their communities from trolls, greedy companies, AI scraping, etc., which means new people can’t find them. Which then means these communities eventually wither and stop being helpful resources for us all.
> where, how and on what public platform will technical people find the problems that need solving so they can exercise their creativity for the benefit of all?
The same place people have always discovered problems to work on, for the entire history of human civilization. Industry, trades, academia, public service, newspapers, community organizations. The world is filled with unsolved problems, and places to go to work on them.
This is a perfect example of an element of Q&A forums that is being lost. Another thing that I don't think we'll see as much of anymore is interaction from developers that have extensive internal knowledge on products.
An example I can think of was when Eric Lippert, a developer on the C# compiler at the time, responded to a question about a "gotcha" in the language: https://stackoverflow.com/a/8899347/10470363
Developer interaction like that is going to be completely lost.
Yuck. I don't know if it's just me, but something feels completely off about the GH issue tracker. I don't know if it's the spacing, the formatting, or what, but each time it feels like it's actively trying to shoo me away.
It's whatever the visual language equivalent of "low signal" is.
Still gh issues are better than some random discord server. The fact that forums got replaced by discord for "support" is a net loss for humanity, as discord is not searchable (to my knowledge). So instead of a forum where someone asks a question and you get n answers, you have to visit the discord, and talk to the discord people, and join a wave channel first, hope the people are there, hope the person that knows is online, and so on.
Yeah, I suspect that a lot of the decline represented in the OP's graph (starting around early 2020) is actually discord and that LLMs weren't much of a factor until ChatGPT (GPT-3.5), which launched in late 2022.
LLMs have definitely accelerated Stackoverflow's demise though. No question about that. Also makes me wonder if discord has a licensing deal with any of the large LLM players. If they don't then I can't imagine that will last for long. It will eventually just become too lucrative for them to say no if it hasn't already.
Discord isn’t just used for tech support forums and discussions. There are loads of completely private communities on there. Discord opening up API access for LLM vendors to train on people’s private conversations is a gross violation of privacy. That would not go down well.
I think most relevant data that provides best answers lives in GitHub. Sometimes in code, sometimes in issues or discussions. Many libs have their docs there as well. But the information is scattered and not easy to find, and often you need multiple sources to come up with a solution to some problem.
I agree that there will be some degradation here, but I also think that the developers inclined to do this kind of outreach will still find ways to do it.
I believe the community has seen the benefit of forums like SO and we won’t let the idea go stale. I also believe the current state of SO is not sustainable with the old guard flagging any question and response you post there. The idea can/should/might be re-invented in an LLM context and we’re one good interface away from getting there. That’s at least my hope.
I used to look at all TensorFlow questions when I was on the TensorFlow team (https://stackoverflow.com/tags/tensorflow/info). Unclear where people go to interact with their users now... Reddit? But the tone on Reddit is kind of negative/complainy
I had a similar beautiful experience where an experienced programmer answered one of my elementary JavaScript typing questions when I was just starting to learn programming.
He didn't need to, but he gave the most comprehensive answer possible attacking the question from various angles.
He taught me the value of deeply understanding theoretical and historical aspects of computing to understand why some parts of programming exist the way they are. I'm still thankful.
If this were repeated today, an LLM would have given a surface-level answer, or worse yet would've done the thinking for me, obviating the question in the first place.
Had a similar experience. Asked a question about a new language feature in Java 8 (parallel streams), and one of the language designers (Goetz) answered my question about how it was intended to be used.
An LLM couldn't have done the same. Someone would have to ask the question and someone answer it for indexing by the LLM. If we all just ask questions in closed chats, lots of new questions will go unanswered as those with the knowledge have simply not been asked to write the answers down anywhere.
You can prompt the LLM to not just give you the answer. Possibly even ask it to consider the problem from different angles but that may not be helpful when you don't know what you don't know.
You can write a paper and submit it to arXiv, and you can also make a blog post.
At any rate, I agree - SO was (is?) a wonderful place for this kind of thing.
I once had a professor mention that they knew me from SO because I posted a few underhanded tricks to prevent an EKF from "going singular" in production. That kind of community is going to be hard to replace, but SO isn't going anywhere; you can still ask a question and answer it yourself for a permanent, searchable archive.
If only those who voted to close would bother to check whether the dup/close issue was ACTUALLY a duplicate. If only there were (substantial) penalties for incorrectly dup/closing. The vast majority of dup/closes seem to not actually be dup/closes. I really wish they would get rid of that feature. Would also prevent code rot (references to ancient versions of the software or compiler you're interested in that are no longer relevant, or solutions that have much easier fixes in modern versions of the software). Not missing StackOverflow in the least. It did not age well. (And the whole copyright thing was just toxically stupid).
I think they should have had some mechanism that encouraged people to help everybody, including POSITIVELY posting links to previously answered questions, and then only making meaningfully unique ones publicly discoverable (even in the site search by default), afterwards. Instead, they provided an incentive structure and collection of rationales that cultivated a culture of hall monitors with martyr complexes far more interested in punitively enforcing the rules than being a positive educational resource.
Has anyone tried building a modern Stack Overflow that's actually designed for AI-first developers?
The core idea: question gets asked → immediately shows answers from 3 different AI models. Users get instant value. Then humans show up to verify, break it down, or add production context.
But flip the reputation system: instead of reputation for answers, you get it for catching what's wrong or verifying what works. "This breaks with X" or "verified in production" becomes the valuable contribution.
Keep federation in mind from day one (did:web, did:plc) so it's not another closed platform.
Stack Overflow's magic was making experts feel needed. They still do—just differently now.
Oh, so it wasn't bad enough to spot bad human answers as an expert on Stack Overflow... now humans should spend their time spotting bad AI answers? How about a model where you ask a human and no AI input is allowed, to make sure that everyone has everyone else's full attention?
The entire purpose of answering questions as an "expert" on S.O. is/was to help educate people who were trying to learn how to solve problems mostly on their own. The goal isn't to solve the immediate problem, it's to teach people how to think about the problem so that they can solve it themselves the next time. The use of AI to solve problems for you completely undermines that ethos of doing it yourself with the minimum amount of targeted, careful questions possible.
You're absolutely correct, but the scary thing is this: What happens when a whole generation grows up not knowing how to answer another person's question without consulting AI?
[edit]
It seems to me that this is a lot like the problem which bar trivia nights faced around the inception of the smartphone. Bar trivia nights did, sporadically and unevenly, learn how to evolve questions themselves which couldn't be quickly searched online. But it's still not a well-solved problem.
When people ask "why do I need to remember history lessons - there is an encyclopedia", or "why do I need to learn long division - I have a calculator", I guess my response is: Why do we need you to suck oxygen? Why should I pay for your ignorance? I'm perfectly happy to be lazy in my own right, but at least I serve a purpose. My cat serves a purpose. If you vibe code and you talk to LLMs to answer your questions...I'm sorry, what purpose do you serve?
I and many others already go the extra mile of asking multiple LLMs hard questions to get a diversity of AI opinions to then internalize and cross-check myself.
All the major AI companies of course do not want to give you the answers from other AIs, so this service needs to be a third party.
But then beyond that there are hard/niche questions where the AIs are often wrong and humans also have a hard time getting it right, but with a larger discussion and multiple minds chewing on the problem, one can often get to a more correct answer by process of elimination.
I encountered this recently in a niche non-US insurance project, and I basically coded together the above as an internal tool. AI suggestions + human collaboration to find the best answer. Of course in this case everyone is getting paid to spend time with this thing, so it's more like an AI-first internal Stack Overflow. I have no evidence that a public version would do well when people don't get paid to comment and rate.
I was making a point elsewhere in this thread that the best way to learn is to teach; and that's why Stack Overflow was valuable for contributors, as a way of honing their skills. Not necessarily for points.
What you need to do, in your organization, is to identify the people who actually care about teaching and learning for their own sake, as opposed to the people who do things for money, and to find a way to promote the people with the inclination to learn and teach into higher positions. Because it shows they aren't greedy, they aren't cheating, and they probably will have your organization's best interests at heart (even if that is completely naïve and they would be better off taking a long vacation - even if they are explicitly the people who claim to dislike your organization the most). I am not talking about people who simply complain. I mean people who show up and do amazing work on a very low level, and teach other people to do it - because they are committed to their jobs. Even if they are completely uneducated.
For me, the only people I trust are people who exhibit this behavior: They do something above and beyond which they manifestly did not need to do, without credit, in favor of the project I'm spending my time on.
>> But then beyond that there are hard/niche questions where the AIs are often wrong and humans also have a hard time getting it right, but with a larger discussion and multiple minds chewing on the problem, one can often get to a more correct answer by process of elimination.
Humans aren't even good at this, most of the time, but one has to consider AI output to be almost meaningless babble.
May I say that the process of elimination is actually not the most important aspect of that type of meeting. It is the surfacing of things you wouldn't have considered - even if they are eliminated later in debate - which makes the process valuable.
In 2014, one benefit of Stack Overflow / Exchange was that a user searching for work could include that they were a top-10% contributor. It actually had real-world value. The equivalent today is users with extensive examples of completed projects on GitHub that can be cloned and run. OP's solution, if contained in GitHub repositories, will eventually get included in a training set. Moreover, the solution will definitely be used for training because it now exists on Hacker News.
I had a conversation with a couple accountants / tax-advisor types about them participating in something like this for their specialty. And the response was actually 100% positive, because they know there is a part of their job that the AI can never take: 1) filings require a human with a government-approved license; 2) there is hidden information about which tax optimizations are higher or lower risk, based on information from their other clients; 3) humans want another human to make them feel that their tax situation is well taken care of.
But many also said it would be better to wrap this in an agency, so the leads generated from the AI accounting questions only go to a few people, instead of making it fully public and StackExchange-like.
So +1 point -1 point for the idea of a public version.
LOL. As a top 10% contributor on Stack Overflow, and on FlashKit before that, I can assure you that any real world value attached to that status was always imaginary, or at least highly overrated.
Mainly, it was good at making you feel useful and at honing your own craft - because providing answers forced you to think about other people's questions and problems as if they were little puzzles you could solve in a few minutes. Kept you sharp. It was like a game to play in your spare time. That was the reason to contribute, not the points.
hehe yea this exists of course. Like these guys: https://yupp.ai/ (they have not announced the tokens but there are points, and they got all their VC money from web3 VCs). I'm sure there are others trying
AI is generally set up to return the "best" answer, defined as the most common answer, not the most correct, efficient, or effective one, unless the underlying data leans that way.
It's why AI-based web search isn't behaving like Google-based search. People clicking on the best results really was a signal for Google about which solution was being sought. Generally, I don't know that LLMs have this type of feedback loop.
That seems like a horrible core idea. How is that different from data labeling or model evaluation?
Human beings want to help out other human beings, spread knowledge, and might want to get recognition for it. Manually correcting (3 different) automation efforts seems like incredibly monotonous, unrewarding labour in a race to the bottom. Nobody should spend their time correcting AI models without compensation.
Speaking of evals: the other day I found out that most of the people who contributed to Humanity's Last Exam (https://agi.safe.ai/) got paid >$2k each. So just adding to your point.
I think this could be really cool, but the tricky thing would be knowing when to use it instead of just asking the question directly to whichever AI. It’s hard to know that you’ll benefit from the extra context and some human input unless you already have a pretty good idea about the topic.
Presumably over time said AI could figure out if your question had already been answered, and in that case would just redirect you to the old thread instead.
thanks for sharing that, it was simple, neat, elegant.
this sent me down a rabbit hole -- I asked a few models to solve that same problem, then followed up with a request to optimize it so it runs more efficiently.
chatgpt & gemini's solutions were buggy, but claude solved it, and actually found a solution that is even more efficient. It only needs to compute sqrt once per iteration. It's more complex however.
                 yours   claude
 ------------------------------
 Time (ns/call)   40.5     38.3
 sqrt per iter       3        1
 Accuracy       4.8e-7   4.8e-7
Claude's trick: instead of calling sin/cos each iteration, it rotates the existing (cos,sin) pair by the small Newton step and renormalizes:
// Rotate (c,s) by angle dt, then renormalize to unit circle
float nc = c + dt*s, ns = s - dt*c;
float len = sqrt(nc*nc + ns*ns);
c = nc/len; s = ns/len;
Thanks for pushing this, I've never gone beyond "zero" shotting the prompt (is it still called zero shot with search?)
As a curiosity, it looks like r and q are only ever used as r/q, and therefore a sqrt could be saved by computing rq = sqrt((rx*rx + ry*ry) / (qx*qx + qy*qy)). The `if q < 1e-10` check is also perhaps not necessary, since this would imply that the ellipse is degenerate. My method won't work in that case anyway.
For the other sqrt, maybe try std::hypot
Finally, for your test set, could you add some highly eccentric cases, such as a=1 and b=100?
Thanks for the investigation:)
Edit: BTW, the sin/cos renormalize trick is the same as what tx,ty are doing. It was pointed out to me by another SO member. My original implementation used trig functions
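To sketch both savings in Python (the C++ is analogous, with std::hypot; rx, ry, qx, qy here stand in for the intermediates mentioned above, which aren't shown):

  import math

  def ratio_of_norms(rx, ry, qx, qy):
      # one sqrt instead of two: |r|/|q| == sqrt(|r|^2 / |q|^2)
      return math.sqrt((rx*rx + ry*ry) / (qx*qx + qy*qy))

  def renormalize(nc, ns):
      # hypot computes sqrt(nc*nc + ns*ns) without over/underflow
      length = math.hypot(nc, ns)
      return nc / length, ns / length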
               yours   yours+opt   claude
 ----------------------------------------
 Time (ns)      40.9        36.4     38.7
 sqrt/iter         3           2        1
 Instructions    207         187      241
Edit: it looks like the Claude algorithm fails at high eccentricities. Gave ChatGPT Pro more context, and it worked for 30 min but only made a marginal improvement on yours, by doing 2 steps then taking a third local step.
I can relate. I used to have a decent SO profile (10k+ reputation; I know this isn't crazy, but it was mostly on non-low-hanging-fruit answers... it was a grind getting there). I used to be proud of my profile and even put it in my resume like people put their GitHub. Now, who cares? It would make me look like a dinosaur sharing that profile, and I never go to SO anymore.
I too was very active on SO around 2012; in fact, I had that counter thing going continuously for xyz days. Most of my one-liners or snippets for PHP are still the highest-voted answers. Even now when I google something and an answer comes up, I realize it's me who asked the question and answered it too.
I don't disagree completely by any means, it's an interesting point, but in your SO answer you already point to your blog post explaining it in more detail. So isn't that the answer: you'd just blog about it and not bother with SO?
Then AI finding it (as opposed to already trained well enough on it, I suppose) will still point to it as did your SO answer.
Looks like solid code. My only gripe is the shadowing of x. I would prefer to see `for _ in range`. You do redefine it immediately so it's not the most confusing, but it could trip people up, especially as it's x and not i or something.
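To illustrate with a hypothetical snippet (not the actual answer's code): if the body ever read x before reassigning it, it would silently pick up the loop index.

  x = 10.0
  for x in range(5):   # 'x' is now the loop index, 0..4
      x = x / 2.0      # reads the *index*, not the previous iterate
  print(x)             # 2.0, probably not what a reader expects

  x = 10.0
  for _ in range(5):   # '_' signals the counter is unused
      x = x / 2.0      # a genuine repeated halving of the iterate
  print(x)             # 0.3125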
I once wrote this humdinger, that's still on my mostly dead personal website from 2010... one of my proudest bits of code besides my poker hand evaluator ;)
The question was, how do you generate a unique number for any two positive integers, where x!=y, such that f(x,y) = f(y,x) but the resulting combined id would not be generated by any other pair of integers. What I came up with was a way to generate a unique key from any set of positive integers which is valid no matter the order, but which doesn't key to any other set.
My idea was to take the radius of a circle that intersected the integer pair in cartesian space. That alone doesn't guarantee the circle won't intersect any other integer pairs... so I had to add to it the phase multiple of sine and cosine which is the same at those two points on the arc. That works out to:
(x^2+y^2)+(sin(atan(x/y))*cos(atan(x/y)))
And means that it doesn't matter which order you feed x and y in, it will generate a unique float for the pair. It reduces to:
x^2+y^2+( (x/y) / (x^2+y^2) )
To add another dimension, just add it to the process and key it to one of the first...
It looks like you have typos?
(x^2+y^2)+(sin(atan(x/y))*cos(atan(x/y)))
reduces to
x^2+y^2+( (x/y) / (x^2/y^2 + 1) ) - not the equation given? Tho it's easier to see that this would be symmetrical if you rearrange it to:
x^2+y^2+( (xy) / (x^2+y^2) )
Also, if f(x,y) = x^2+y^2+( (x/y) / (x^2+y^2) )
then f(2,1) is 5.4 and f(1,2) is 5.1? - this is how I noticed the mistake. (the other reduction gives the same answer, 5.4, for both, by symmetry, as you suggest)
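A quick Python check of all three forms (the trig original, the reduction as posted, and the symmetric rearrangement):

  import math

  def f_posted(x, y):     # the reduction as posted
      return x*x + y*y + (x/y) / (x*x + y*y)

  def f_trig(x, y):       # the original trig formulation
      t = math.atan(x / y)
      return x*x + y*y + math.sin(t) * math.cos(t)

  def f_fixed(x, y):      # the symmetric rearrangement
      return x*x + y*y + (x*y) / (x*x + y*y)

  print(f_posted(2, 1), f_posted(1, 2))  # ~5.4 ~5.1 -> not symmetric
  print(f_trig(2, 1), f_trig(1, 2))      # ~5.4 ~5.4 -> symmetric
  print(f_fixed(2, 1), f_fixed(1, 2))    # ~5.4 ~5.4 -> symmetric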
There's a simpler solution which produces integer ids (though they are large): 2^x | 2^y. Another solution is to multiply the xth and yth primes.
I only looked because I was curious how you proved it unique!
Hhhhmm. Ok. So I invented this solution in 2009 at what you might call a "peak mental moment", by a pool in Palm Springs, CA, after about 6 hours of writing on napkins. I'm not a mathematician. I don't think I'm even a great programmer, since there are probably much better ways of solving the thing I was trying to solve. And also, I'm not sure how I even came up with the reduction; I probably was wrong or made a typo (missing the +1?), and I'm not even certain how I could come up with it again.
2^x | 2^y ...is the | a bitwise operator...???? That would produce a unique ID? That would be very interesting, is that provable?
Primes take too much time.
The thing I was trying to solve was: I had written a bitcoin poker site from scratch, and I wanted to determine whether any players were colluding with each other. There were too many combinations of players on tables to analyze all their hands versus each other rapidly, so I needed to write a nightly cron job that collated their betting patterns 1 vs 1, 1 vs 2, 1 vs 3... any time 2 or 3 or 4 players were at the same table, I wanted to have a unique signature for that combination of players, regardless of which order they sat in at the table or which order they played their hands in. All the data for each player's action was in a SQL table of hand histories, indexed by playerID and tableID, with all the other playerIDs in the hand in a separate table. At the time, at least, I needed a faster way to query that data so that I could get a unique id from a set of playerIDs that would pull just the data from this massive table where all the same players were in a hand, without having to check the primary playerID column for each one. That was the motivation behind it.
It did work. I'm glad you were curious. I think I kept it as the original algorithm, not the reduced version. But I was much smarter 15 years ago... I haven't had an epiphany like that in a while (mostly have not needed to, unfortunately).
The typo is most likely the extra /, in (x/y)/(x^2+y^2) instead of (xy)/(x^2+y^2).
`2^x | 2^y ...is the | a bitwise operator...???? That would produce a unique ID? That would be very interesting, is that provable?`
Yes, | is bitwise or. It sets the x-th and y-th bits, treating your players as a bit vector. It's not so much provable as a tautology: it is exactly the property that players x and y are present. It's not _useful_ tho, because the field size you'd need to hold the bit vector is enormous.
As for the problem... it sounds bloom-filter adjacent (a bloom filter of players in a hand would give a single id with a low probability of collision for a set of players; you'd use this to accelerate exact checks), but also like an indexed many-to-many table might have done the job. It all depends on what the actual queries you needed to run were; I'm just idly speculating.
At the time, at least, there was no way to index it for all 8 players involved in a hand. Each action taken would be indexed to the player that took it, and I'd need to sweep up adjacent actions for other players in each hand, but only the players who were consistently in lots of hands with that player. I've heard of bloom filters (now, not in 2012)... makes some sense. But the idea was to find some vector that made any set of players unique when running through a linear table, regardless of the order they presented in.
To that extent, I submit my solution as possibly being the best one.
I'm still a bit perplexed by why you say 2^x | 2^y is tautologically sound as a unique way to map f(x,y)==f(y,x), where x and y are nonequal integers. Throwing in the bitwise | makes it seem less safe to me. Why is that provably never replicable between any two pairs of integers?
I'm saying it's a tautology because it's just a binary representation of the set.
Suppose we have 8 players, with x and y being 2 and 4: set bits 2 and 4 (ie 2^2 | 2^4) and you have 00010100.
But to lay it out: every positive integer is a sum of powers of 2. (this is obvious, since every number is a sum of 1s, ie 2^0). But also every number is a sum of _distinct_ powers of 2: if there are 2 identical powers 2^a+2^a in the sum, then they are replaced by 2^(a+1), this happens recursively until there are no more duplicated powers of 2.
It remains to show that each number has a unique binary representation, ie that there are no two sums x=2^x1+2^x2+... and y=2^y1+2^y2+... over different sets of powers with the same value, x=y. Suppose we have a smallest such number, and x1, y1 are the largest powers in each set. Then x1 != y1, because if they were equal we could subtract 2^x1 from both sums and get an _even smaller_ number with two distinct representations, a contradiction. So either x1 < y1 or y1 < x1; suppose without loss of generality it's the first (we can just swap labels). Then x <= 2^(x1+1)-1 (just summing all powers of 2 from 0..x1), but y >= 2^y1 >= 2^(x1+1) > x, a contradiction.
or, tl;dr just dealing with the case of 2 powers:
we want to disprove that there exist a, b, c, d with {a,b} != {c,d} such that 2^a + 2^b = 2^c + 2^d.
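Concretely, in Python (player numbers hypothetical):

  def set_id(players):
      # OR one bit per player; order can't matter
      bits = 0
      for p in players:
          bits |= 1 << p
      return bits

  # unique by the binary-representation argument above
  assert set_id([2, 4]) == set_id([4, 2]) == 0b10100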
it works for any number of integers. The first proof above (before tl;dr) is showing that every positive integer has a unique representation as a sum of distinct powers of 2, ie binary, and that no two integers have the same representation. You can watch a lecture about the representation of sets in binary here https://www.youtube.com/watch?v=Iw21xgyN9To (google representing sets with bits for way more like this)
But again it's not useful in practice for very sparse sets: if you have say a million players, with at most 10 at the same poker table, setting 10 bits of a million-bit binary number is super wasteful. Even representing the players as fixed size 20-bit numbers (1 million in binary is 20 bits long), and appending the 10 sorted numbers, means you don't need more than 200 bits to represent this set.
And you can go much smaller if all you want is to label a _bucket_ that includes this particular set; just hash the 10 numbers to get a short id. Then to query faster for a specific combination of players you construct the hash of that group, query to get everything in that bucket (which may include false positives), then filter this much smaller set of answers.
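A sketch of that bucketing in Python (names hypothetical):

  import hashlib

  def bucket_id(player_ids, nbytes=4):
      # sort for order independence, then hash down to a short id
      key = ",".join(map(str, sorted(player_ids)))
      digest = hashlib.sha256(key.encode()).digest()
      return int.from_bytes(digest[:nbytes], "big")

  # query everything stored under bucket_id(group), then filter out
  # the (rare) false positives with an exact check
  assert bucket_id([1201, 1803, 2903]) == bucket_id([1803, 1201, 2903])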
... z = (x+y+1)(x+y)/2 + y - but you have to sort x,y first to get the order independence you wanted. This function is famously used in the argument that the set of integers and the set of rationals have the same cardinality.
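In code (a sketch):

  def cantor_pair(x, y):
      # bijective on ordered pairs of non-negative integers
      x, y = sorted((x, y))  # sort first so f(x, y) == f(y, x)
      return (x + y + 1) * (x + y) // 2 + y

  assert cantor_pair(3, 7) == cantor_pair(7, 3) == 62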
mm. I did see this when I was figuring it out. The sorting first was the specific thing I wanted to avoid, because it would've been by far the most expensive part of the operation when looking at a million poker hands and trying to target several players for potential collusion.
So the goal was to generate signatures for 2, 3 or more players and then be able to reference anything in the history table that had that combination of players without doing a full scan and cross-joining the same table multiple times. Specifically to avoid having ten index columns in the history table for each seat's player. This was also prior to JSON querying in mysql. I needed a way to either bake in the combinations at write time, or to generate a unique id at read time in a way that wouldn't require me to query whether playerIDs were [1201,1803,2903] or [1803,1201,2903] etc. Just a one-shot unique signature for that combination of players that could always evaluate the same regardless of the order. If that makes sense. There were other considerations and this was not exactly how it worked, since only certain players were flagged and I was looking for patterns when those particular players were on the same table. It wasn't like every combination of players had a unique id, just a few combinations where I needed to be able to search over a large space to find when they were in the same room together, but disregarding the order they were listed in.
SO in 2013 was a different world from the SO of the 2020s. In the latter world, your post would have been moderator-classified as a 'duplicate' of some basic textbook copy/pasted method posted by a karma-grinding CS student, and closed.
The question boils down to: can you simulate the bulk outcome of a sequence of priority queue operations (insert and delete-minimum) in linear time, or is O(n log n) necessary. Surprisingly, linear time is possible.
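For reference, the straightforward O(n log n) simulation one is trying to beat looks like this (a sketch; assumes delete-min is only issued on a non-empty queue):

  import heapq

  def simulate(ops):
      # ops like [("insert", 5), ("insert", 2), ("delete-min",)]
      heap, deleted = [], []
      for op in ops:
          if op[0] == "insert":
              heapq.heappush(heap, op[1])
          else:
              deleted.append(heapq.heappop(heap))
      return deleted, sorted(heap)  # the bulk outcome

  print(simulate([("insert", 5), ("insert", 2), ("delete-min",), ("insert", 7)]))
  # ([2], [5, 7])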
Reddit is my current go-to for human-sourced info. Search for "reddit your question here". Where on reddit? Not sure. I don't post, tbh, but I do search.
Has the added benefit of NOT returning stackoverflow answers, since StackOverflow seems to have rotted out these days, and been taken over by the "rejection police".
Even if LLMs were trained on the answer, that doesn't mean they'll ever recommend it. Regardless of how accurate it may be. LLMs are black box next token predictors and that's part of the issue.
Why did SO decide to do that to us? To not invest in AI and then, IIRC, claim ownership of our contributions. I sometimes go back to answers I gave, even ones where I answered my own questions.
AFAICT all they did is stop providing dumps. That doesn't change the license.
I was very active. In fact, I'm actually upset at myself for spending so much time there. That said, I always thought I was getting fair value. They provided free hosting, I got answers, and I got to contribute answers for others.
Try running ipython inside a vim terminal (:below term). Being able to yank and paste between buffers and the terminal (which is itself backed by a vim buffer), and vice versa, is a big multiplier
I've seen a similar setup with an llm loop integrated with clojure. In clojure, code is data, so the llm can query, execute, and modify the program directly
Waaay back when in Japan, sekigaisen (infrared) was a verb meaning to transfer contact details or photos or whatever between phones via infrared. It was amazing how fast the iPhone took over Japan and killed off their quirky phone ecosystem.
Edit: want to emphasize that it was totally ubiquitous. Every phone had it.
yes, "beaming" in the us was also used for quite a while. as in IR beam
japanese phones were buggy, feature packed monstrosities. a bunch of companies fighting to check as many boxes as they could. it's not a surprise that they got wiped out by an attempt to make a holistic internet communicator.
but for a while, there was nothing like them and their ability to get information on the internet
I wonder if this was driven by the Palm Pilots in the early 2000s. We beamed contacts, calendar entries, whole apps via IR. At trade shows exhibitors had terminals that would constantly send out contact information via OBEX (?).
In the US (edit: and elsewhere!), "beaming" worked great between Apple Newton devices, including the pretty cool eMate 300 (an early Jony Ive creation, I just found on Wikipedia).
I remember being blown away by the Game Boy Color IR link. You could use it to trade Pokemon. That makes a bit more sense now if sekigaisen was already a popular ecosystem.
To any Linux users: I recently bought a fully loaded M4 MacBook Pro to replace my aging Lenovo and strongly regret it. I thought I would use it for playing with LLMs, but local dev on a Mac is not fun and I still don't have it fully set up. I'll probably replace it with a Framework at some point in the near future.
Edit: okay, that garnered more attention than I expected, I guess I owe a qualification.
1. Everything is just slightly different. I had to split all my dot files into common/Linux/Mac specific sections. Don't expect to be able to clone and build any random C++ project unless someone in the project is specifically targeting Mac.
2. Not everything is supported natively on arm64. I had an idea and wanted to spin up a project using DynamoRIO, but it wasn't supported. Others have mentioned the docker quirks.
3. The window manager. I'm not a fan of all the animations and needing to gesture between screens (and yes, I've been down the hotkeys rabbit hole). To install a 3rd-party window manager you need to disable some security setting, because apparently they work by injecting into the display manager and calling private APIs.
So my personal takeaway was that I took the openness of the Linux ecosystem for granted (I've always had a local checkout of the kernel so I can grep an error message if needed). Losing that felt like wearing a straitjacket. Ironically I have an MBP at work, but spend my day ssh'd into a Linux box. It's a great machine for running a web browser and terminal emulator.
I've used Macs for 20 years starting on the day 32-bit Intel Macs were released, and agree with the GP. Linux and Plasma spoiled me; going back to macOS and its windowing system feels like a step backward, especially for development, where using multiple windows is a must. Task switching is... not good? I don't get window previews I can switch through when I hover over the dock, but I do on Linux.
Yes, I know about Yabai and the other things that modify the existing window manager. The problem is the window manager itself.
Outside of the windowing system, running native Linux if you're deploying to Linux beats using an amalgamation of old BSD utils + stuff from Homebrew and hoping it works between platforms, or using VMs. The dev tools that are native to Linux are also nice.
When it comes to multiple monitors, I want a dock on each monitor. I can do that in Plasma, but I can't in macOS, unless I use some weird 3rd party software apparently.
When you use linux as desktop, sometimes you get into a customization-hole and make everything "just right" because on linux everything is customizable.
Then you switch to macOS or Windows or even (not your) linux setup and hate it. When I manage to contain myself entirely to the terminal it's okay, but the moment I have to interact with GUI I start to miss those "just right" things.
I can relate. macOS hilariously sucks on certain GUI and terminal aspects. Not much you can do about the GUI; you just have to adapt to the way macOS wants to be used. For the terminal, I use home-manager to manage my $HOME. It's not space efficient and public caches are sub-par, but it's better than searching "sed in-place replace macos and linux cross-platform" for the 9000th time.
The irony is that I set up my Plasma desktop to mimic macOS' layout in terms of positioning buttons, menus, widgets and docks, and just leave the default settings and themes. Just what you get for free by default with Plasma is great vs macOS even with customizations.
I do nerd out when customizing the shell, though.
> It's not space efficient and public caches are sub-par, but it's better than searching "sed in-place replace macos and linux cross-platform" for the 9000th time.
When onboarding new devs, it's like Groundhog Day: I will inevitably have the "did you use GNU sed or BSD sed" conversation at some point if they have Macs. (GNU sed accepts a bare -i; BSD sed demands an explicit, possibly empty, backup suffix: -i ''.)
It's amazing, I have the same terminal environment in WSL2, macOS and Linux. NeoVim with all of its native dependencies, all k8s tools, etc. Sometimes I run into issues with something not working on macOS, but it's usually easy to resolve; if not, I use homebrew via home-manager.
Honestly, I get the GP completely. It's not so much the customisation hole. It's just that macOS is pretty meh as an OS, and hilariously Snow Leopard, the first version I ever used, is my favourite of all the ones that followed.
I like the hardware, however. I really wish there was a good laptop using a competitive ARM SoC with great Linux support. I refuse to buy anything from Apple since they started the whole EU shenanigans, and I don't really know which laptop I will buy. I'm seriously considering only using a phone as my personal computing device now that Android takes convergence semi-seriously.
> Not really. Google, in fact, very opposes that convergence because it will hurt ChromeBook and chomecast sales.
They oppose convergence so much that they have just added a desktop environment for when Android is plugged into a screen, and a way to run Linux apps with GPU acceleration.
Also, Pixels natively support HDMI through USB-C and have done so for years. They do have terrible SoCs, however, so I'm leaning more towards a Chinese phone personally.
> Task switching is... not good? I don't get window previews I can switch through when I hover over the dock, but I do on Linux.
That just sounds like being accustomed to one way of switching tasks, honestly. If I want previews, I use Expose (three-finger swipe up/down or ctrl-up/down). But mostly I just use cmd-tab and haven't really needed to see previews there. Because macOS switches between applications, not windows, there often isn't a single window to preview, and I'm not sure showing all the windows would work well either. For Expose it works well because it can use the entire screen to show previews.
If I wanted to use gestures, three/four finger swipe up and down shows all of the windows and all of the desktops with windows respectively. If I'm switching using the keyboard, I get window previews. If I'm switching using the dock, I get window previews.
Going back to macOS where I don't get window previews forces me to think in terms of app icons, instead of the UI I've been staring at and will instantly recognize. And if I use the dock, I have to remember the window title's name to switch windows using the context menu.
Funny, I've been using Macs for 10+ years and I've never really used Expose. Often I'd be trying to select between windows that look very similar (eg, code windows), so it doesn't work. Instead, I just use Cmd+Tab, and then Cmd+` to cycle windows.
Exactly. I find the macOS approach (Cmd-Tab to pick the right app, Cmd-` to pick the right window) much faster/better than just one shortcut to go through all windows.
Imagine having N apps with M windows each, with the macOS model your number of presses to find a given window goes from O(NM) to O(N+M).
This is how it works in Plasma, you can use the app switcher key combo to switch between apps and then the app window switcher combo to switch between that app's windows. You can also go through every single window, if you want.
The app switching behaviour is really infuriating. Selecting a window and having all the app’s windows come to the fore, obscuring the window from another app is still annoying, 20 years on.
And then when you full-screen a window, switch to another app for a moment, and then you can’t find it without delving into the ‘window‘ menu.
You're right, I'm thinking of using the keyboard cmd-tab.
So if I have two Zed windows and Firefox in front of one of them, I can't switch from Zed to Firefox and back to Zed without losing view of Firefox. Means I have to move windows around so they don't overlap, which seems so counterintuitive.
I'll echo the sentiment about being very familiar with macOS but being spoiled by Linux and KDE Plasma. I put up with my work MacBook. My personal Linux setup just works and gets out of the way as a machine.
The newest MacBooks have insanely powerful hardware (I have an M4 Max MacBook). Yet they do not feel as speedy or instant as my machines running i3. There are always a few perceptible milliseconds of latency in response time from the keyboard to the screen. As someone who has tons of key bindings I find this tolerable, but it can get a bit grating compared to just how instantaneous everything is on my Linux.
The way sidebars feel is really "sticky". This has got worse with SwiftUI. The List component used for this has notoriously poor performance and a really inflexible API.
Agreed, as a software engineer of ~8 years now, Mac is actually my _preferred_ environment -- I find it an extremely productive OS for development, whether I'm working on full stack or Unity game dev in my free time.
I don't agree with OP's sentiment that macOS is a bad dev environment, but surely I prefer Linux+KDE as an overall dev environment. I find that all the tools I need are there but that I'm fighting the UI/UX enough to make it a hassle relative to KDE.
> I don't agree with OP's sentiment that macOS is a bad dev environment, but surely I prefer Linux+KDE as an overall dev environment. I find that all the tools I need are there but that I'm fighting the UI/UX enough to make it a hassle relative to KDE.
This sounds like you think macOS is a good dev environment, but that you personally don't like the UI/UX (always safer to make UI/UX judgements subjective ["I don't like"] rather than objective ["it's bad"], since it's so difficult to evaluate objectively, e.g., compared to saying something like Docker doesn't run natively on macOS, which is just an objective fact).
For example, if I open a new Firefox window, the Mac seems to force the two Firefox windows onto different desktops. This already is a struggle, because sometimes I don't want the windows to be on two desktops. I find that if I try to move one window to the same desktop as the other, then Mac will move the other desktop to the original desktop so they are both still on different desktops.
OK, got sidetracked there on a different annoyance, but on top of the above, CMD-backtick doesn't usually work for me, and I attribute it to the windows typically being forced onto different desktops. Some of the constraints for using a Mac are truly a mystery to me, although I'm determined to master it eventually. It shouldn't be this difficult though. For sure, Mac is nowhere near as intuitive as it's made out to be.
My favorite is how it'll force move your workspace if you get a popup.
To reproduce, get a second monitor, throw your web browser onto that second monitor (not in full screen), and then open an application in full screen on your laptop's screen (I frequently have a terminal there). Then go to a site that gives you a popup for OAuth or a security key (e.g. GitHub, Amazon, Claude, you got a million options here). Watch as you get a jarring motion on the screen you aren't looking at, have to finish your login, and then move back to where you were.
> Mac are truly a mystery to me
Everyone tells me how pretty and intuitive they are yet despite being on one for years I have not become used to them. It is amazing how many dumb and simple little problems there are that arise out of normal behavior like connecting a monitor. Like what brilliant engineer decided that it was a good idea to not allow certain resolutions despite the monitor... being that resolution? Or all the flipping back and forth. It's like they looked at the KDE workspaces and were like "Let's do that, but make it jarring and not actually have programs stay in their windows". I thought Apple cared about design and aesthetics but even as a Linux user I find these quite ugly and unintuitive.
Stop using "full screen mode", with recent macOS you can just drag the window to the top of the screen and let it "snap" to fit the entire screen. This is different from "full screen mode" which is largely useless. What you want is that the app window fills the screen space, not that it takes over the entire screen
I'm truly annoyed at it reordering the desktops even when I have just a single screen (the built-in one). I expect my programs to be in a certain order, so switching between them is predictable.
Or sometimes it just decides to open a link in a new Chrome window instead of just opening a tab... and not even consistently.
That gets extra weird with a second monitor. I really cannot predict where a workspace on that monitor will land when disconnecting. It could be prepended or appended. I think it orders based on last active, but even my lack of confidence says something. I mean, just because you interact with a program doesn't mean it was the last active program... crazy that I can scroll or type into a window and it not be considered the active window
Even worse is the lag in switching windows. If you use keyboard shortcuts to switch, your screen will have switched over but focus is still on the previous window so anything you type goes there. I have to pause for a second to wait for it to catch up.
and disabling animations doesn't help, it's still slow.
It still surprises me how slow so much of Windows and OSX are. It is absolutely bonkers how slow so many things are[0]. Even more given how many people don't realize how fast everything can or should be. People will fight hard to justify why they don't have basic optimizations. Much harder than it would be to actually implement them...
Stop using multiple desktops. Use a single extended desktop. Move the apps where you want them and snap them to one side or the other to any given screen. Done.
Or just put a program onto a second monitor then open a second window for that program. Usually it will not open in the same monitor. This is especially fun when you get pop-ups in browsers...
Only sometimes it doesn't work. (For me on a Norwegian keyboard it is CMD+<)
Specifically, sometimes it works with my Safari windows and sometimes it doesn't.
And sometimes when it doesn't work, Option+< will work for some reason.
But sometimes that doesn't work either and then I just have to swipe and slide or use alt-tab (yes, you can now install a program that gives you proper alt-tab, so I do not have to deal with this IMO nonsense, it just feels like the right thing to do when I know I'm just looking for the other Safari window.)
I'm not complaining, I knew what I went to when I asked $WORK for a Mac, I have had one before and for me the tradeoff of having a laptop supported by IT and with good battery time is worth it even if the UX is (again IMO) somewhat crazy for a guy who comes from a C64->Win 3.1->Windows 95/98->Linux (all of them and a number of weird desktops) background.
> Have you tried any that actually delivered on what was promised?
It absolutely does.
Maybe I should just switch to using it 100% of the time like on Windows. (I was trying to have it the KDE way: Yes, window based switching instead of App based, also an option to switch between windows from the same application.)
> Or otherwise you can enable the app exposé feature to swipe down with three fingers and it will show you only windows of the same app.
If you have an Apple keyboard, CTRL-F3 (without the Fn modifier) will do the same. Not sure if there are third-party keyboards that support Mac media keys, but I'm guessing there are some at least...
That has terrible ergonomics for anyone using a non-US keyboard, though - the backtick is immediately above the option key so to hit together with CMD requires clawing together your thumb and little finger.
GNOME does this much better, as it instead uses Super+<whatever the key above Tab is>. In the US, that remains ` but elsewhere it's so much better than on MacOS.
> That has terrible ergonomics for anyone using a non-US keyboard, though - the backtick is immediately above the option key so to hit together with CMD requires clawing together your thumb and little finger.
That's true, hence why I remap it to a "proper" key, above Tab with:
Firstly, I have not complained about Linux, have I? Even more, I'm about to switch to Linux on my desktop, thanks to Microsoft and their EOS for "the last Windows version ever". Secondly, I suspect power users hardly ever find any OS perfect for their needs; there are always some customizations.
So many comments about how Linux isn't ready because some admin task requires running a CLI command.
Then Windows apologists tell you that actually all your problems are because you didn't edit your install ISO or pirate an IoT Enterprise edition. Because that's normal behaviour.
And it's becoming more common with Macs.
I remember Snow Leopard was genuinely amazing, and a massive improvement over everything else. I had high hopes after Mountain Lion that we would get a feature release and then a performance release, because the performance releases just made everything so much better. Alas I just seem to get more whitespace.
Apple really doesn't tell power-users about a lot of these features. You can really gain a lot by searching for Mac shortcuts and tricks. I still learn new things that have been around for over a decade.
Another tip: lots of useful characters are only an option press away. You can find them by viewing your keyboard [1], which is easy if you have you input source on your dock. Some of my favorites:
⌥k  = ˚ (degree)      ⌥e a = á
⌥p  = π (pi)          ⌥e e = é
⌥5  = ∞ (infinity)    ⌥e i = í
⌥d  = ∂ (delta)       ⌥e o = ó
⌥8  = • (bullet)      ⌥e u = ú
⇧⌥9 = · (middot)      ⌥n n = ñ
This is one of my favorite features of Macs, and it astounds me that the other platforms have nothing close to equivalent. The recommendation is always 'install a keybinding app and add them all as key bindings', as if that wouldn't take hours of tedious labor.
I'd argue if you need to be told about keyboard shortcuts, then you're not a power user. (I.e., knowing how to find keyboard shortcuts I'd consider a core trait of power users).
My perception is that on Windows it is standard to display keyboard shortcuts next to application menu items, whereas on the Mac, that doesn’t seem to be the case. Perhaps that’s just a culture thing. It’s expected on Windows, and not as expected on Mac.
macOS does this too (if I'm following correctly); you can see it in the "The Apple Menu in macOS Ventura" screenshot on this Wikipedia page: https://en.wikipedia.org/wiki/Apple_menu#/media/File:Apple_m... It's done for both application keyboard shortcuts and system shortcuts (as in this example).
For completeness, system shortcuts are also available in `System Settings > Keyboard > Keyboard Shortcuts...` (where they can also be changed). (Although I don't think that's 100% comprehensive, e.g., I don't think `⌘⇥` for the application switcher is listed there.)
As a hybrid macOS / Windows user (with 20+ years of Windows keyboard muscle memory), I found Karabiner Elements a godsend. You can import complex key modifications from community built scripts which will automatically remap things like Cmd+Tab to switch windows, as well as a number of other Windows hotkeys to MacOS equivalents (link below):
I've not used Windows since XP, but the one thing I missed was keyboard menu navigation with Alt and the underlined single letters. It's still in my muscle memory, and I always felt it was at least as good as keyboard shortcuts.
Poking around System Settings > Keyboard > Keyboard Shortcuts… > Keyboard is a pretty good place to browse if you haven't. There are default keybindings that can be changed and switched on/off. For macOS Sequoia 15.6:
[ ] Change the way Tab moves focus ⌃F7
[ ] Turn keyboard access on or off ⌃F1
[ ] Move focus on the menu bar ⌃F2
[ ] Move focus on the Dock ⌃F3
[ ] Move focus to the active or next window ⌃F4
[ ] Move focus to the window toolbar ⌃F5
[ ] Move focus to the floating window ⌃F6
[*] Move focus to next window ⌘`
[ ] Move focus to status menus ⌃F8
[ ] Show contextual menu ⌃↩
I only have one checked currently; I'm not feeling adventurous.
Stuff like the containerisation story on Macs is so miserable. The fact that so many devs use Docker on a Mac and pay the orders-of-magnitude file I/O costs (or do a bunch of other shenanigans) really makes for an unfun experience.
Really wish someone could have figured out something a bit better in that space in particular. Docker compose is a "least worst" option for setting up a project with devs when many are uncomfortable with other solutions, but it really takes the oxygen out of anything that might "work"
I don't. I'm constantly shifting between my Linux desktop and a Mac for work. I also picked up a personal MBP with as much RAM as Apple allowed (still far overpriced and limited options) about a year and a half ago. While I don't regret it, it's still not my first choice.
If there are "endless options for local dev on a Mac", then I don't know how to describe the flexibility that a decent laptop running Linux gives you by comparison. Honestly, I think the Mac still only excels in one area today: the breadth of its paid software library. The polish of the Mac used to be the draw, but OS X has degraded over the years as Apple shifts to unify iOS and OS X. And don't get me started on the garbage that is iCloud, which Apple continues to force-feed harder and harder, having clearly taken cues from the Windows team in Redmond.
I'm really hopeful we start to see more ARM options in non-Mac laptop formats soon. Because, trivial as it sounds, it is nice to be able to run small models locally for a variety of reasons.
It is interesting though that I see a "huge share of devs" using a Mac to write code targeting Linux environments when they could actually simplify their development environment by ditching Mac. To each their own.
Wow, this account's recent comment history is just full-on blasting with pro-apple opinions and attacking anyone who posts even a tinge of negativity about Apple or its recent product(s). I find it amusing we'd become so defensive about for-profit companies and their products..
I'm sorry, I just really hate this Apple Fanboy rhetoric. It's frequent and infuriating. Don't get me wrong, I hate it when the linux people do it too, but they tend to tell you how to get shit done while being mean.
The biggest problem with Linux is poor interfaces[0], but the biggest problem with Apple is handcuffs. And honestly, I do not find Apple interfaces intuitive. Linux interfaces and structure I get, and even if the barrier to entry is a bit higher, there's lots of documentation. Apple, less so. But also with Apple there are just things that are needlessly complex, buried under multiple different locations, and inconsistent.
But I said the biggest problem is handcuffs. So let me give a very dumb example. How do you merge identical contacts? Here's the official answer[1]
Either:
1) Card > Look for Duplicates
2) Select the duplicate cards, then Card > Merge Selected Cards.
Well guess what? #2 isn't an option! I believe this option only appears if the two contacts are in the same address book. Otherwise you get the option "Link Selected Cards", which isn't clear, since the card doesn't tell you what account it is coming from, and clicking "Look for Duplicates" won't offer this suggestion to you. There are dozens of issues like this where you can be right that I'm "holding it wrong", but that just means the interface isn't intuitive. You can try this one out: go to your contacts, select "All Contacts", then click any random one and try to figure out which address book that contact is from. It will not tell you unless you have linked contacts. And that's the idiocy of Apple. Everything works smoothly[2] when you've always been on Apple and only use Apple, but it is painful to even figure out what the problem is when you have one. The docs are horrendous. The options in the menu bar change and inconsistently disappear or gray out, leading to "where the fuck is that button?".
So yeah, maybe a lot of this is due to unfamiliarity, but it's not like they are making it easy. With Apple, it is "Do things the Apple way, or not at all". But with Linux it is "sure whatever you say ¯\_(ツ)_/¯". If my Android phone is not displaying/silencing calls people go "weird, have you tried adjusting X settings?" But if my iPhone is not displaying/silencing calls an Apple person goes "well my watch tells me when someone is calling" and they do not understand how infuriating such an answer is. Yet, it is the norm.
I really do want to love Apple. They make beautiful machines. But it is really hard to love something that is constantly punching you in the face. Linux will laugh when you fall on your face, but it doesn't actively try to take a swing or put up roadblocks. There's a big difference.
[0] But there's been a big push the last few years to fix this and things have come a long way. It definitely helps that Microsoft and Apple are deteriorating, so thanks for lowering the bar :)
It’s rare that I get feedback like this about a feature I owned for a decade.
1) The vast majority of users have only one contact-sync account, so it’s not an issue for them, merge works fine
2) For users that have multiple contact-sync accounts, they almost never want a feature to silently choose one account’s contact and delete the other account’s contact. So linking is really what these users want if the contacts live in different accounts.
It’s interesting feedback that a combined “link or merge” command would be what you’d expect. That’s a reasonable request; in my day we generally steered clear of combining destructive operations (merging) with non-destructive (linking).
I was more focused on the fact that the macOS implementation of “look for duplicates” is pretty broken; there’s a decent iOS implementation we never got around to migrating to macOS.
Don't get me wrong, I don't have an issue with linking. I think that's the correct solution.
In fact, it's kinda the only solution unless you can push info upstream, and you shouldn't assume you have those privileges or even know their data structure. But that doesn't matter because what the user cares about is how the information is displayed.
It is primarily a display issue. No deletions needed
The critical issue is I, the user, can't identify if these two contacts are in the same address book or not. The only way I can find this out is to guess and check. I have to guess the address book and then search that name, then repeat. That's not a reasonable solution nor does it scale. It's trivially solvable too. Just tell the user what address book a contact belongs to!
That's what leads to the confusion. All the program is telling me is that there are two contacts with the same name, nickname, phone number, and birthday (the contacts differed on email and notes). The UI feedback says "Apple doesn't know how to do a trivial database query", not "Apple doesn't want to destructively merge these contacts because they are in different address books." That is not an obvious thing, and I chased multiple other issues first. It was especially bad because my calendar had 3 entries for this person's birthday and I had 3 contacts: 2 linked to my iCloud address book and 1 to Google (determined by ctrl-clicking on the date, though in hindsight maybe that's not actually accurate). I somehow got it down to two contacts, which resulted in 4 birthdays on my calendar! That created a false flag, because the icons now showed as if 1 was from Google and 3 from iCloud, with all 3 no longer linking to a contact. The feedback the programs are giving me is "Apple can't merge tables", right? Or at least that's a reasonable interpretation.
I think there's a relatively simple solution to this: 1) indicate on the contact card which address book the contact belongs to; 2) have "Find duplicates" query across address books and present the option "link contacts" instead of "merge". It's obviously reasonable that a user would want this, since you built that capability for a reason. I honestly think "merge" could be "link" in most cases, because depending on the data structure those are equivalent (you reference a node, and that node has children pointing into the different tables). I agree you shouldn't delete data, but there's also likely no reason to (well, delete if you have duplicate pointers to the same object, unless those pointers are aliases).
The same idea applies to calendar events. I missed a ton of events when I first switched to an iPhone because I'll look at my calendar and see 3 copies of "Columbus Day" and 1 "Indigenous People's Day" (Apple does both!) and not what I had scheduled for 10am. The only solution I have is to disable the holiday tables from my Google calendar and outlook. Effectively that's "deleting" data. This looks like a fine solution but those calendars aren't identical. As a user I want the union. I want deduplication. Because who wants redundant information? It's clearly not something the user is intending (at least in this case). That's going to be true for things like birthdays too (which I'd be happy to import). Apple doesn't even distinguish that as a separate table for my Google calendar so I'm stuck with dupes.
Effectively it is a display issue. As a user that's what's critical to me because that's what makes the program useful. As a programmer, yeah, I care about details but my tech illiterate parents don't.
> With Apple, it is "Do things the Apple way, or not at all".
Well kinda, you don't have to use all that much Apple software on macs though. If you can live with the window manager / desktop environment then you can use whichever apps you choose for pretty anything else.
I'm not sure this is true, especially if you're a "power user"[0]. Here's an example: I want to modify `~/.ssh/config` to define a machine's alias depending on the SSID I'm on. So I want this logic:
If on MyHomeSSID:
Host FooComputer
Hostname 192.168.1.123
Else If tailscale-is-running
Host FooComputer
Hostname 100.64.0.123
The reason you might want to do this is so that your ssh connection adapts to the network you're using: you can always write `ssh FooComputer` and get the connection you want. This can get much more complicated[1], but it is incredibly useful.
How would you accomplish this? Well actually, I don't know ANYMORE[2]. The linked thread had a solution that worked, but `ipconfig getsummary en0` now redacts the SSID (even when running sudo!). Though `system_profiler SPAirPortDataType` still works, and I can get the result in 4 seconds... so not actually a solution. Yet it shows the idiocy and inconsistency of Apple's tooling. There was a solution, then Apple changed it. wtallis helped me find a different solution, and well... then Apple changed it. YET `system_profiler` still doesn't redact the SSID, so what is going on? Why is it even redacted in the first place? I can just throw my cursor up to the top right of the screen and see the SSID information. If it was a security issue, then I should not be able to view that information in GUI OR CLI, and it would be a big concern if I could see it in some unprivileged programs but not in others.
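For what it's worth, the ssh side of this has a clean shape via `Match exec`; it's only the SSID lookup that keeps moving. A sketch, assuming a hypothetical `get-ssid` helper that prints the current network name, and assuming `tailscale status` exits non-zero when tailscale isn't up:

    # ~/.ssh/config -- ssh uses the first value it obtains for an option,
    # so order the Match blocks from most to least specific.
    # `get-ssid` is hypothetical; implementing it reliably on macOS is
    # exactly the whack-a-mole described here.
    Match host FooComputer exec "get-ssid | grep -qx 'MyHomeSSID'"
        HostName 192.168.1.123
    Match host FooComputer exec "tailscale status"
        HostName 100.64.0.123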
And that's the problem with Apple. If I write some script to do some job, I don't know if that script is going to work in 6 months, because some person decided they didn't want that feature. So I find some other command to do the exact same thing and end up playing a game of whack-a-mole. *It is absolutely infuriating.* This is what I mean by "constantly punching you in the face". The machine fights you, and that's not okay.
[0] I put it in quotes because the example I'm about to give is, to some, "complex", but to others "dead simple". I'd actually say the latter is true.
[side note] I've used a similar SSID trick to write myself a "phone home" program in termux for Android and other machines. I can get my GPS coordinates and other information there so you can just write a <50 line program to ping a trusted machine if your device doesn't check in to trusted locations within certain timeframes. Sure, there's FindMy, but does that give me a history? I can't set an easing function to track if my device is on the move. Can I remote into the lost machine? Can I get it to take pictures or audio to help me locate it? Can I force on tailscale or some other means for me to get in without the other person also having technical knowledge? Why not just have a backup method in case one fails? I'm just trying to give this as an example of something that has clear utility and is normally simple to write.
I ended up doing something similar a few years ago. Picked up a MacBook Pro M1 Max back when the M1 stuff was new to replace an aging Lenovo running Linux. I actually really loved my Lenovo + Linux, but the M1 was new and shiny and I desperately wanted better battery life.
The hardware was great, but life on a Mac always felt a bit convoluted. Updating the OS was especially frustrating as a software developer because of all the interdependent bits (xcode, brew, etc) that often ended up breaking my dev environment in some way. It also always amazed me at the stuff that was missing. Like, how isn't the default terminal app fully functional after all these years? On the plus side, over the time I used it they did add tiling and the ability to hide the notch.
Finally at the start of the year I moved back to Linux and couldn't be happier. Had forgotten just how nice it is to have everything I need out of the box. The big thing I miss is Affinity Photo, though that looks like it's in the middle of dying right now.
Exactly! I too bought the M1 Macbook Air in 2021 because of its great battery life. I wanted a powerful device for hacking on personal projects at home (I use a Dell running Ubuntu at work) but every time I opened it there was always something frustrating about OS X that made it unsuitable for dev stuff (at least for me)
* Finder - this is my most hated piece of software. It doesn't display the full file path and no easy way to copy it
* I still haven't figured out how to do cut/paste - CMD + X didn't work for me
* No Virtualbox support for Apple Silicon (last checked 1 year ago)
* Weird bugs when running Rancher Desktop + Docker on Apple Silicon
But still Apple hardware is unbeatable. My 2015 Macbook pro lasted 10 years and the M1 is also working well even after 4 years.
> * Finder - this is my most hated piece of software. It doesn't display the full file path and no easy way to copy it
View -> Show Path Bar to display the full path of a file.
When a file is selected, press Option-Cmd-C to copy the full file path. Or just drag the file anywhere that expects a string (like the Terminal, or here). That strikes me as quite easy.
Cmd-X, -C, -V work as expected; what exactly is the problem? (Note that macOS, unlike Windows, doesn't allow cut & paste of files, to avoid losing the file in case the operation isn't completed. However, you can copy (Cmd-C), then use Option-Cmd-V to paste & move.)
Now, that might not be completely easy to discover (though, when you press Option the items in the Edit menu change to reveal both "tricks" described above, and contain the keyboard shortcut).
At any rate: when switching OS, is it too much to ask to spend a few minutes online to find out how common operations are achieved on the new OS?
FWIW, VirtualBox did get ported to Apple silicon, but long-time Mac software developer Parallels has consumer-grade VM management software. Theirs supports DirectX 11 on ARM Windows, which is critical for getting usable performance out of it. Conversely, VMware's Mac offering does not, making 3D graphics on it painfully slow.
There's also a couple of open source VM utilities. UTM, tart, QEMU, Colima, probably others.
I have an M1 Air and an 8th-gen Intel Dell (OpenBSD), and I'm much happier with the Dell for hacking on stuff. macOS is pretty much a nightmare if your workflow is not app- and IDE-centered.
It is maybe one of the most featureless terminals out there. Slow, poor color support, weird and frustrating permission interactions, limited font options, incomplete terminal emulation, etc.
It has improved a bit over the years and is generally fine if you just need to knock out a few commands. But I don't find it to be a very pleasurable experience compared to the alternatives. It feels very much like Apple implemented "just enough" and no more.
iTerm2 is a must. This is probably the only Mac app I miss on Linux. Kitty and Ghostty are missing so many important features; they feel like hobby proof-of-concept terminals to me. The closest alternative for Linux IMO is WezTerm.
It does have drawing tools, as well as tools for working with exposure, sharpness, color, text, shapes, selection, etc. I’d suggest exploring the features in Preview. It can do a surprising number of things with images.
I didn't actually buy anything new for my transition back to Linux. I have a gaming system that had traditionally been running windows. It's a powerful system, but has always been a "toy" running Windows for playing games. Last year I moved it to Linux and have been incredibly happy with the move.
These days I am also now working from home full time, so it kinda hit me. "Why the hell am I trying to work from this MacBook when I have my really great gaming desktop that runs Linux now?" Moved my work over and have been incredibly happy.
I'll have to give the Fedora Asahi Remix a go on my MacBook Pro though. That's a great idea!
> The window manager. I'm not a fan of all the animations and needing to gesture between screens (and yes, I've been down the hotkeys rabbit hole). To install a 3rd party window manager, you need to disable some security setting because apparently they work by injecting into the display manager and calling private APIs.
For using the vanilla macOS workspaces though: if you avoid full-screen apps (since those go to their own ephemeral workspace that you can't keybind, for some stupid reason) and create a fixed number of workspaces, you can bind keyboard shortcuts to switch to them. I have 5 set up, and use Ctrl+1/2/3/4/5 to switch between them instead of using gestures.
Apart from that, I use Raycast to set keybindings for opening specific applications. You can also bind apple shortcuts that you make.
Still not my favorite OS over Linux, but I've managed to make it work because I love the hardware, and outside of $dayjob I do professional photography, and the Adobe suite runs better here than on even my insanely overspecced gaming machine on Windows.
Mac laptop hardware is objectively better, but I am in the same camp as the parent post. For most development workflows, Linux is my favorite option. In particular, I think NixOS and the convenience of x86_64 is usually worth the energy efficiency deficit with Apple M.
It will be interesting to see how this evolves as local LLMs become mainstream and support for local hardware matures. Perhaps, the energy efficiency of the Apple Neural Engine will widen the moat, or perhaps NPUs like those in Ryzen chips will close the gap.
I develop using a MacBook because I like the hardware and non-development apps, but all my actual work happens on a Linux server I connect to. It's a good mix.
Thanks for sharing Aerospace, can't believe I overlooked it! It's like finding out someone fixed half the things that make macOS feel like a beautiful prison. Somehow it makes the whole OS feel less… Apple-managed.
I kinda agree with the OP, but then I was a Linux user for well over a decade. I do think that C/C++ libraries are much, much more of a pain on Mac as soon as you go off the beaten path (compiling GDAL was not pleasant, whereas it would be a breeze on Linux).
Some of this is probably brew not being as useful as apt, and some more of it is probably me not being as familiar with the Mac stuff, but it's definitely something I noticed when I switched.
The overall (graphical) UI is much more fluid and convenient than Linux, though.
Meh, I've had a MacBook for 15 years; I'm in the camp that thinks Snow Leopard was the last good OS they made. I still get to use one almost daily at work, but honestly I prefer using Windows for developing, if Linux is not an option.
My 2012 MBP still lives on running Debian (not ideal because of some driver quirks, but miles better in terms of responsiveness and doing actual work on it than whatever OS X I could put on it).
I honestly agree with the parent. I'd love an M-series MacBook because the hardware is simply fantastic, but if I can't put Debian on it, then I'll pass.
I have to agree. The loss of sense of reality among Linux fanboys is really annoying.
I had been a Linux notebook user for many years and have praised it on this board years ago. But today the Linux desktop has regressed into a piece of trash even for basic command line usage while providing zero exclusive apps worth using. It's really sad since it's unforced and brought upon Linux users by overzealous developers alone.
Mac OS trounces Linux in absolutely every way on the desktop, it's not even funny: performance, battery life, apps, usability, innovation. Available PC notebook hardware is a laughable value compared to even an entry-level Apple MacBook Air. Anecdata, but I have had no less than five "pro" notebooks (Dell Latitude, XPS, and Lenovo Thinkpad) come and go in the last five years with basic battery problems, mechanical touchpad problems, touchpad driver issues, WLAN driver issues, power management issues, gross design issues, and all kinds of other crap, so I'm pretty sure I know what I'm talking about.
The one thing Mac isn't great for is games, and I think SteamOS/Proton/wine comes along nicely and timely as Windows is finally turning to the dark side entirely.
> Mac OS trounces Linux in absolutely every way on the desktop it's not even funny: performance, battery life, apps, usability, innovation.
performance - I don't agree
battery life - absolutely
apps - absolutely
usability - I don't agree
innovation - I don't agree
One significant annoyance associated with Linux on a laptop is that configuring suspend-then-hibernate is an arduous task, whereas it just works on a Macbook.
But, the main thing is commercial application support.
It's not unique to Linux users. Anytime a new device is released, there's always a comment from an alternate-device user stating that they prefer their device.
With HN, Linux user comments are voted-up so often that they're worthy of a bingo square.
As a long term Mac user who works on ROS a lot I hear you. Most people here think local dev means developing a React app. Outside of mainstream web frameworks Mac sucks for local dev.
Yeah, C++ has only been a side actor on Apple since Mac OS got replaced by NeXTSTEP; Copland was C++ based, and BeOS as well, but Objective-C won.
Now, with Swift and the ongoing security-legislation issues across several countries, Apple seems to care only to the extent it needs for its own uses of LLVM, the Metal Shading Language (a C++14 dialect), and the IOKit / DriverKit frameworks.
They aren't contributing to clang as they once were; neither is Google, after the whole ABI-break discussion.
In Windows land it isn't much better: it appears that after taking first place in reaching C++20 compliance, Microsoft decided to invest its programming-language budget in .NET, Rust, and Go, asking the community which features of newer standards they actually care about.
I have a pretty good cross-platform dotfiles setup for both Mac OS and Linux that I use Chezmoi to provision. I try to repeat myself as little as possible.
I use Linux at work and for gaming, and Mac OS for personal stuff. They both build from the same dotfiles repository.
Some things I've learned:
- Manually set Mac's XDG paths to be equal to your Linux ones. It's much less hassle than using the default system ones.
- See my .profile as an example on how I do this: https://github.com/lkdm/dotfiles/blob/main/dot_profile.tmpl
- Use Homebrew on both Linux and Mac OS for your CLI tools
- Add Mac OS specific $PATH locations /bin, /usr/sbin, /sbin
- Do NOT use Docker Desktop. It's terrible. Use the CLI version, or use the OrbStack GUI application if you must.
- If you use iCloud, make a Zsh alias for the iCloud Drive base directory
- Mac OS ships with outdated bash and git. If you use bash scripts with `#!/usr/bin/env bash`, you should install a newer version of bash with brew, and make sure Homebrew's opt path comes before the system one, so the new bash is prioritised.
I hope this is helpful to you, so feel free to ask me anything about how I set up my dotfiles.
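To make the XDG point concrete, here is roughly what that looks like as a chezmoi template (a trimmed sketch; see the linked dot_profile.tmpl for the real thing, and note the Homebrew prefix here assumes Apple Silicon):

    # dot_profile.tmpl -- chezmoi renders this per-OS via Go templates.
    # Pin XDG paths to the Linux defaults on both systems:
    export XDG_CONFIG_HOME="$HOME/.config"
    export XDG_DATA_HOME="$HOME/.local/share"
    export XDG_STATE_HOME="$HOME/.local/state"
    export XDG_CACHE_HOME="$HOME/.cache"
    {{ if eq .chezmoi.os "darwin" }}
    # macOS-only: brew's newer bash ahead of the system one, plus the
    # macOS-specific PATH entries (prefix assumes Apple Silicon).
    export PATH="/opt/homebrew/opt/bash/bin:$PATH"
    export PATH="$PATH:/bin:/usr/sbin:/sbin"
    {{ end }}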
I can relate. I've spent almost 30 years working primarily on Linux. I moved Windows into a VM when I needed it around for occasionally using MS Office, first under VMware and later under KVM. Now I don't even use the VM, since work has Office 365.
My work got me a similar M4 MacBook Pro early this year, and I find the friction high enough that I rarely use it. It is, at best, an annoying SSH over VPN client that runs the endpoint-management tools my IT group wants. Otherwise, it is a paperweight since it adds nothing for me.
The rest of the time, I continue to use Fedora on my last gen Thinkpad P14s (AMD Ryzen 7 PRO 7840U). Or even my 5+ year old Thinkpad T495 (AMD Ryzen 7 PRO 3700U), though I can only use it for scratch stuff since it has a sporadic "fan error" that will prevent boot when it happens.
But I'm not doing any local work that is really GPU dependent. If I were, I'd be torn between chasing the latest AMD iGPU that can use large (but lower-bandwidth) system RAM versus rekindling my old workstation habit of hosting a full-size graphics card. It would depend on the details of what I needed to run. I don't really like the NVIDIA driver experience on Linux, but I have worked with it in the past (when I had a current-gen Titan X) and have also done OpenCL on several vendors' hardware.
Speaking of the P14s, I have an Intel version from 2 years back and the battery life is poor. And I hunger for the Mac's screen for occasional photography. The other thing I found difficult is that there's no equivalent of the X1 Carbon with an AMD chip; it's Intel-only, and the P14s is so much heavier.
Seconded. I have a mostly CLI setup and in my experience Nix favors that on Mac, but nonetheless it makes my Nix and Linux setups a breeze. Everything is in sync, love it.
Though if you don't like Nixlang it will of course be a chore to learn/etc. It was for me.
Really? This surprises me. I've used them for projects and for my home-manager setup and it's always been amazing at it. The best example I can come up with is packaging a font I needed into a nix package for a LaTeX file. It would have taken me a month of trying various smaller projects to know how to do that.
Honestly it helped quite a bit. There are a lot of obscure (imo) errors in Nix that LLMs spot pretty quickly. I made quite a bit of progress since using them for this.
If you don't mind me asking, why do you prefer this set up over just using brew?
I've poked around articles and other posts about this, but I'm not sure I quite get it.
If I just need to install packages, would brew just work for me?
I have a collection of bash scripts in my dotfiles for setting things up, and I've been meaning to adapt them for my linux laptop. It seems like Nix may be helpful here!
The big thing is undoing: if I want to uninstall something, I delete it from the text file and rerun my nix init. I also just decided to get more serious about having all my aliases and shell functions in source control, but then I need to be able to guarantee that I have my little dependencies like fzf, jq, fd, etc. Another option for those is to write the shell functions in nix-script so that they require their command line utilities when run for the first time. The other reason is that it makes me more disciplined about per-project dependencies. I use nix flakes for all my little projects too, so I end up with a project owning its npm or rails installs and database binary and database data and… you get the idea. It takes like 20 more minutes to get coding on something if I set it up that way but I let the LLMs handle that part. It's not quite "make the AI do the dishes while I make art" but it's pretty close!
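If it helps, the day-to-day of the above is small. A sketch, assuming home-manager driven by a flake (names are illustrative):

    # Packages live in a list in home.nix, e.g.:
    #   home.packages = with pkgs; [ fzf jq fd ];
    # "Uninstalling" = deleting from that list, then re-applying:
    home-manager switch --flake ~/dotfiles
    # Per-project dependencies via each project's flake dev shell:
    cd ~/projects/someapp && nix develop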
I'm often envious of these Macbook announcements, as the battery life on my XPS is poor (~2ish hours) when running Ubuntu. (No idea if it's also bad on Windows - as I haven't run it in years).
MacOS is great for development. Tons of high profile devs, from Python and ML, to JS, Java, Go, Rust and more use it - the very people who headline major projects for those languages.
2ish hours battery life is crazy. It's 8+ hours with the average Macbook.
If you are on a M-series MacBook and aren't running a 3D Benchmark the entire time, your Mac is broken if it is dying after 2.5 hours.
Have you checked your Battery Health?
If you have an intel-based Mac, it's the same expected battery life as Windows and 2.5 hours on an intel MacBook battery sounds decent for something 5+ years old.
8+ hours sounds about right. I have an M1 Macbook Pro, and even 5 years later I can still use it (a million browser tabs, a couple of containers, messaging apps) for an entire day without having to charge it.
Yes, macOS sucks compared to Linux, but the M chip gets absolutely incredible battery life, whereas the Framework gets terrible battery life. I still use my Framework at work though.
Yes, there is a dilemma in the Linux space. But is running Linux on a MacBook a viable option these days? Is Asahi Linux solid enough?
I much prefer a Framework and the repairability aspect. However, if it's going to sound like a jet engine and have half the battery life of a new M-series Mac, then I feel like there's really no option if I want solid battery life and good performance.
Mac has done a great job here. Kudos to you, Mac team!
Oh, I have that tab still open from when I was reading the other thread.
Here is the feature support table from Asahi. Still a way to go unless you're on an older M1, by the looks of it?
If you don't need a dedicated graphics card, there are plenty of laptops that get 12 or more hours of battery life (8 under heavy load, such as compiling), which is perfectly fine for me. The LG gram was my most recent one, and it required zero tweaks to power management, battery, SSD, or any other settings to get that.
The AMD chips I'm using with integrated graphics get 6+ hours of battery life (System76 Pangolin), and the newer Intel Ultra chips are decent on battery too, +/- a bit depending on how hard you push the machine. A huge improvement over my first Linux laptop, which had an Nvidia chipset and would only go 2-3 hours per charge.
You're doing it wrong. Mac is by far one of the best development environments and is used by millions for dev, including LLMs. In fact I'm running LLMs and image AI models right now on my M4 MBA and everything works perfectly.
For your dotfiles there aren't too many differences; just make a separate entry point for zsh that includes only the zsh + macOS things (a few system calls are different in macOS), and then set your .zshrc to load the zsh + macOS version instead of the Linux or "universal" one. This is trivial if you've split your dotfiles into multiple separate files that a central master file per OS imports individually (see the sketch below).
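A minimal sketch of that split (paths are illustrative):

    # ~/.zshrc -- pick the per-OS entry point, then the shared bits.
    case "$(uname -s)" in
      Darwin) source ~/dotfiles/zsh/macos.zsh ;;
      Linux)  source ~/dotfiles/zsh/linux.zsh ;;
    esac
    source ~/dotfiles/zsh/common.zsh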
For window management you want to use CMD + ` to switch windows in the same app and CMD + Tab to switch apps. You also want to utilize the touch gestures for App Expose and Mission Control.
The only thing that's still wonky is the touchpad Natural Scroll vs the mouse wheel scroll, there's a third party "Scroll Reverser" app that can give you normal mouse wheel scroll and Natural Scroll on the touchpad at the same time. Hopefully some day Apple will make that a native feature.
Stop trying to install third party window managers.
What are the differences, though? I have an MBP and a PC with Fedora on it, and I barely see any differences aside from the sandboxing in my atomic Kinoite setup and a different package manager.
People often hate on brew, but as a backend dev I haven't encountered any issues for years.
The issues I see people struggle with on a Mac are that development often needs things set up in a non-default and often less-secure way.
There isn't a "dev switch" in macOS, so you have to know which setting is getting in your way. Apple suppresses error alerts whenever possible, so when things in your dev environment fail, you don't know why.
If you're a seasoned dev, you have an idea why and can track it down. If you're learning as you go or new to things, it can be a real problem to figure out if the package/IDE/runtime you're working with is the problem or if macOS Gatekeeper or some other system protection is in the way.
I can tell you in one sentence: try running a DNS server when mDNSResponder sits on port 53 (for example, because you use the new virtualization framework).
And there are a lot of such things, which are trivial or non problem in Linux.
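For anyone who hits that one, at least confirming who owns the port is quick:

    # Shows mDNSResponder bound to port 53 on both UDP and TCP:
    sudo lsof -nP -iUDP:53 -iTCP:53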
Funny you say that, as a long term Linux user who was in the exact same boat as you, I actually find Mac M4 my best Linux laptop purchase ever so far. I think what you're missing is its virtualization story. Put UTM on it, and you're back to a familiar environment, just on much nicer hardware. The first time I booted into my Linux desktop on it, I was blown away by how much snappier it felt compared to my ~5 year old top-of-the-line PC build.
I'm as much of a fan of Mac OS as the next Linux user here, but it's a very decent hypervisor, and Stuff Just Works out of the box, most of the time. No more screwing around with half-baked qemu wrappers for me, vfio, virgl and what not. And running stuff without virtualization is a non-starter for me; I've been concerned about supply chain attacks since before it became fashionable. Of course it would be even nicer if new Macs could run Linux natively, and I hope the Asahi project will succeed at that, but until then I'm pretty happy running a Linux desktop virtualized on it.
arm64 support is very decent across all the different OS now, I hardly miss Intel. I can even reasonably play most AAA games up to maybe mid-2010s on a Windows VM that's just a three finger swipe away from my main Linux desktop.
They're basically things along those lines. They're more nefarious when background services quietly error out and you need to dig to find it was a newly required permission.
Launching unsigned apps is a problem, especially if an app bundle contains multiple binaries, since by default you need to approve exception for each of them separately.
I know that it's possible to script that since Homebrew handles it automatically, but if you just want to use a specific app outside of Homebrew, experience is definitely worse than on Linux/Windows.
There are a lot of annoying hurdles when allowing some types of application access. Needing to manually allow things in the security menu, allowing unrecognized developers, unsigned apps. Nothing insurmountable so far, but progressively more annoying for competent users to have control over their devices.
For me, coming from Linux, the only things I don't like: the overview menu's lack of an (x) to close a window; the way Slack stacks windows within the app, so it's hard to find the right one; and that pressing the red button doesn't remove the app from your CMD+Tab cycle, you also have to press CMD+Q (just a preference for how Windows and Linux treat windows, i.e. actually closing them). Rectangle resolved the snap-to-corner thing (I know macOS has it natively too, but it's not great in comparison).
Things I prefer: Raycast + its plugins compared to the Linux app-search tooling, battery life, performance. Brew vs the Linux package managers, I don't notice much of a difference.
Things that are basically the same: the dev experience (just a shell and my dotfiles, so it's essentially the same between OSes).
I think the hardest part for me is getting used to CMD vs CTRL for cut-copy-paste; then, when I start to get used to it, a terminal breaks me out of it with a different key for Ctrl+C. I got used to Ctrl+Shift for cut-copy-paste in terminals on Linux (and Windows), etc.
It may seem like a small thing, but when you have literal decades of muscle memory working against you, it's not that small.
I'm a lifelong Mac user, so obviously I'm used to using CMD instead of CTRL. Inside the terminal we use CTRL for things like CTRL-C to exit a CLI application.
What messes me up when I'm working on a linux machine is not being able to do things like copy/paste text from the terminal with a hotkey combo because there is no CMD-C, and CTRL-C already has a job other than copying.
IMO apple really messed up by putting the FN key in the bottom left corner of the keyboard instead of CTRL. Those keys get swapped on every Mac I buy.
Ctrl+Shift+(X,C,V) tends to work for many/most terminals in Linux and Windows (including Code and the new Terminal in Windows)...
I agree on the Fn key positioning... I hate it in the corner and tend to zoom in on the keyboard when considering laptops for anyone, just in case. I've also had a laptop keyboard with weird arrow keys on the right side, where I'd hit the up arrow instead of the right Shift a lot in practice... it really messed up text-area input.
As a very long-term Linux user, I'm still aggravated when implicit copy and middle-click paste doesn't just work between some apps, since it is so deeply embedded in my muscle memory!
I'm only a recent MacOS user after not using it for over 20 years, so please people correct me if I'm wrong.
But in the end the biggest thing to remember is in MacOS a window is not the application. In Windows or in many Linux desktop apps, when you close the last or root window you've exited the application. This isn't true in MacOS, applications can continue running even if they don't currently display any windows. That's why there's the dot at the bottom under the launcher and why you can alt+tab to them still. If you alt+tab to an app without a window the menu bar changes to that app's menu bar.
I remember back to my elementary school computer lab with the teacher reminding me "be sure to actually quit the application in the menu bar before going to the next lesson, do not just close" especially due to the memory limitations at the time.
I've found once I really got that model of how applications really work in MacOS it made a good bit more sense why the behaviors are the way they are.
Docker works very weirdly (it's a desktop application you have to install that has usage restrictions in enterprise contexts, and it's inside a VM so some things don't work), or you have to use an alternative with similar restrictions (Podman, Rancher Desktop).
The OS also has weird rough edges when used from the terminal - there are read-only parts, there are restrictions on loading libraries, multiple utilities come with very old versions or BSD versions with different flags than the GNU ones you might be used to coming from Linux, the package manager is pretty terrible. There are things (e.g. installing drivers to be able to connect to ESP32 devices) that require jumping through multiple ridiculous hoops. Some things are flat out impossible. Each new OS update brings new restrictions "for your safety" that are probably good for the average consumer, but annoying for people using the device for development/related.
>The OS also has weird rough edges when used from the terminal - there are read-only parts, there are restrictions on loading libraries, multiple utilities come with very old versions or BSD versions with different flags than the GNU ones you might be used to coming from Linux, the package manager is pretty terrible.
You use nix or brew (or something like MacPorts).
And they are mighty fine.
You shouldn't be concerned with the built-in utilities.
Docker on Mac has one killer feature though: bind mounts remap permissions sensibly, so that the uid/gid in the container is the correct value for the container rather than the same uid/gid as the host.
The workarounds on the internet are like "just build the image so that it uses the same uid you use on your host", which is batshit crazy advice.
I have no idea how people use Docker on other platforms where this doesn't work properly. One of our devs has a Linux host and was unable to use our dev stack, and we couldn't find a workaround. Luckily he's a frontend dev and eventually just gave up on the dev stack in favour of running Requestly to forward the frontend from prod to his local tooling.
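The difference is easy to see in one command. On a Linux host the file comes out owned by root (the container's uid), while Docker Desktop on a Mac remaps it to your user:

    # Write a file from inside the container into a bind mount:
    docker run --rm -v "$PWD":/work alpine touch /work/made-in-container
    ls -l made-in-container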
I suggest trying Nix on macOS; it is very nice as a package manager, but it can also be used as a replacement for Docker (at least for my needs it works very well).
These days I don't even bother installing brew on my Mac; I only use Nix.
I saw the announcement, and it looks like a cool tool. But I don't think it supports docker compose specs, which a lot of my projects use for running services (like postgres) locally when developing. And doesn't seem like there is any support for kubernetes - e.g. still needs to run through Colima etc.
Yes, with major tradeoffs. Asahi Linux is an amazing project, but they have not yet figured out how to get anywhere close to a Mac's power efficiency when it is running MacOS. For example, you will lose a lot of battery life[0][1] with the lid closed, whereas on MacOS you lose pretty much nothing.
I want to love Linux on the desktop as much as the next Linux fan, but I always end up coming back to the Mac (begrudgingly).
I really liked Windows when WSL came out, but the direction Microsoft seems to be going makes me want to run the other way.
Windows or macOS... for the hardware working well, generally just works as expected. The tradeoffs you make with each, are different. But it's usually not a hardware thing, as to why (in my experience).
I just put Linux on a 5th-gen ThinkPad P1. It works... mostly. Sound works... at about 50% volume of what Windows or macOS would output. This has consistently been an issue with me, every time I've tried to use Linux on the desktop.
It ends up being some set of compromises to use Linux.
And when video is a frequent part of my work and personal use... the quality of it on Linux just doesn't cut it.
For server usage... forget it. Linux wins, hands down. Zero contest. :D
I also like the multi-desktop experience on KDE more, but I've recently found out you can at least switch off some of the annoying behavior in the Mac settings, so that e.g. it no longer switches to another desktop when you click a dock icon of an app that is open on another desktop.
I thought the same thing when I saw the M5 in the news today. It’s not that I hate macOS 26, hate implies passion.. what I feel is closer to disappointment.
The problem is their philosophy. Somewhere along the way, Apple decided users should be protected from themselves. My laptop now feels like a leased car with the hood welded shut. Forget hardware upgrades, I can’t even speed up animations without disabling SIP. You shouldn’t have to jailbreak your own computer just to make it feel responsive.
Their first-party apps have taken a nosedive too. They’ve stopped being products and started being pipelines, each one a beautifully designed toll booth for a subscription. What used to feel like craftsmanship now feels like conversion-rate optimization.
I’m not anti-Apple. I just miss when their devices felt like instruments, not appliances. When you bought a Mac because it let you create, not because it let Apple curate.
Usually there's an accessibility option of some kind that disables animations; at least it exists in android and I feel like it existed in iOS (though I haven't used that in ages). I'm surprised Mac doesn't have something similar.
I just want shit to work, and most modern devs function many levels above the OS most of the time. Stuff I write is gonna run in a browser, a phone or a containerized cloud env. I don’t care about how configurable my OS is I just want to do my work and sign off.
It was different for me. I tried to move from Windows to Linux multiple times, but my Dell just refused to run it reliably no matter what. After fiddling with multiple distros I finally bit the bullet and went for a Mac.
I couldn't be happier: a Linux experience without the Linux pains.
Note that there certainly are quirks around arm64; however, coming from Windows, I am no stranger to dealing with such issues, so they bother me less.
The best thing is that I can confidently put the Mac into my backpack without worrying about it committing suicide by not fully sleeping (a common Windows issue).
> Everything is just slightly different. I had to split all my dot files into common/Linux/Mac specific sections. Don't expect to be able to clone and build any random C++ project unless someone in the project is specifically targeting Mac.
This seems like a very unfair complaint. macOS is not Linux. Its shell environment is based on Darwin which is distantly related to BSD. It has no connection to Linux, except for its UNIX certification.
Why is it unfair? The OP literally stated "To any Linux users". They aren't saying it's worse, just that if you're coming from Linux it can be hard to adapt. Sounds reasonable to me.
As a Linux user, I sometimes dream about the Apple hardware, and I tell myself "How hard can it be to get used to MacOS?! It has a shell after all!". The OP reminded me that it can be quite difficult.
That can be true while still being a genuine irritant. Windows and POSIX shells are different enough that you'd never assume that a script would be compatible between them - but the same is not true between your average Linux distro and macOS, which leads people to repeatedly get bit when trying to write a script that supports both.
I get the comment about Docker. Not being able to share memory with Docker makes it a pain to run things alongside the Mac, unless you have mountains of RAM.
So basically you're a Linux user who is mad that macOS isn't Linux? Don't get me wrong, Tahoe is the worst GUI upgrade ever, but the last time I had problems with a lack of native Mac/ARM support was... 2021? I think your arguments are topical and don't point to a significant problem with the build ecosystem. Yes, rare niche packages haven't all migrated to ARM, but... that's all you've got?
I'm sympathetic to all of this except the part about DynamoRIO: I've barely seen people compile DynamoRIO successfully on Windows and Linux, so struggles on macOS don't seem that unusual. It seems like a marginal case to ding the Mac on.
I've been forced to use Macbooks for development at work for the past 7 years. I still strongly prefer my personal Thinkpad running Debian for development in my personal life. So don't just put it down to lack of familiarity.
Try Aerospace. Completely solved window management for me.
Also for dev, set up your desired environment in a native container and then just remote into it with your terminal of choice. (Personally recommend Ghostty with Zellij or Tmux)
macOS has a different dev culture than Linux, but you can get pretty close if you install the Homebrew package manager. For running LLMs locally I would recommend Ollama (easy) or llama.cpp. Due to the unified memory, you should be able to run larger models than what you can run on a typical consumer grade GPU, but slower.
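A typical quick start, for the curious (the model name here is just an example):

    brew install ollama        # CLI plus local server
    ollama serve &             # or launch the Ollama app instead
    ollama run llama3.2        # pulls the model on first use, then chats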
Hello! Yes! Writing this on my commute home using my company's M3 Pro, and I hate it. I'm waiting for a new joiner so I can hand it off to someone whose brain works differently from mine.
I can write up all the details, but it's well covered on a recent linuxmatters.sh and Martin did a good job of explaining what I'm feeling: https://linuxmatters.sh/65/
For me, a VM set up via UTM works quite well on my Mac. Just make sure you do not emulate x86; that kills both performance and battery life. This way I get the nice battery life and performance in a small package, but am not limited by macOS for my development.
> I'll probably replace it with a framework at some point in the near future.
I kind of did the opposite. I have a first-gen Framework and really enjoy it, but WOW that thing runs scorchingly hot and loud. Too hot to put on your lap even doing basic workflows. Battery life is also horrible, maybe ~4 hours if you're doing any sort of heavy work, ~6 hours if you're just browsing the web. Did I mention it's loud? The fans spin up and they sound like a jet engine. The speaker on it is also substandard if that matters to you - it's inside the chassis and has no volume or bass.
Last year I replaced it with an M4 Pro Macbook and the difference is night and day. The Macbook stays cool, quiet, and has 10+ hour battery life doing the same sort of work. The trade-off is not being able to use Linux (yes, I know about Asahi; the tradeoffs are not worth it), but I have yet to find anything I could do on Linux that I can't do here.
I also _despise_ the macOS window manager. It's so bad.
I suggest you head over to /r/unixporn, and you'll probably be pleasantly surprised. Contrary to popular belief, most of this stuff is not very hard to set up. Of course, there are people also showing off custom tooling and the craziest (and sometimes silliest) things they can pull off, but a beautiful interface is usually something you can do in under an hour. It is customary to include a repo with all configurations, so if you wanted to direct copy-paste, you could do it much faster than that.
Unless you're talking about the look of the physical machine. Well then that's an easier fix ;)
In my opinion, Apple is the one doing very poorly on (software) looks recently. Liquid Glass looks like a joke. Both KDE and GNOME look better. The new Expressive Material 3, on Android, actually looks great.
I have a Macbook Air and I pretty much use it as an ssh machine. It is definitely over priced for that, but it at least beats the annoyance of having to deal with Windows and all the Word docs I get sent or Teams meetings... (Seriously, how does Microsoft still exist?)
Since I mostly live in the terminal (ghostty) or am using the web browser I usually don't have to deal with stupid Apple decisions. Though I've found it quite painful to try to do some even basic things when I want to use my Macbook like I'd use a linux machine. Especially since the functionality can change dramatically after an update... I just don't get why they (and other companies) try to hinder power users so much. I understand we're small in numbers, but usually things don't follow flat distributions.
> I had to split all my dot files into common/Linux/Mac specific sections
There are often better ways around this. On my machine, my OSX config isn't really about OSX specifically but about what programs I might be running there[0]. Same goes for Linux[1], which you'll see is pretty much just about CUDA and aliasing apt to nala if I'm on a Debian/Ubuntu machine (sometimes I don't get a choice).
I think what ends up being more complicated is when a program has a different name under a distro or version[2]. Though that can be sorted out by a little scripting. This definitely isn't the most efficient way to do things but I write like this so that things are easier to organize, turn on/off, or for me to try new things.
What I find more of a pain in the ass is how commands like `find`[3] and `grep` differ. But usually there are ways to get them to work identically across platforms.
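A couple of the usual suspects, with spellings that work on both:

    # grep: stick to -E; GNU-only -P (PCRE) is missing from BSD grep.
    grep -E 'foo|bar' file.txt
    # find: BSD find wants an explicit starting path; GNU lets you omit it.
    find . -name '*.log'
    # sed: -i with an attached suffix works on both; bare -i does not on BSD.
    sed -i.bak 's/foo/bar/' file.txt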
> Don't expect to be able to clone and build any random C++ project unless someone in the project is specifically targeting Mac.