I feel like big / old companies thrive on process and are bogged down in bureaucracy.
Sure there is a process to get a library approved, and that abstraction makes you feel better, but the guy whose job it is to approve it is not going to spend an entire day reviewing a lib. The abstraction hides what is essentially an "LGTM"; it just takes a week for someone to check it off their Outlook to-dos. Maybe your experience is different.
They are caching internal LLM state, which is in the 10s of GB for each session. It's called a KV cache (because the internal state that is cached are the K and V matrices) and it is fundamental to how LLM inference works; it's not some Anthropic-specific design decision. See my other comment for more detail and a reference.
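Roughly, the cached state is the per-token key/value projections at every attention layer, so it grows with context length. A minimal single-head sketch of the idea (names and shapes are illustrative only, not any particular implementation):

    import numpy as np

    def decode_step(x_new, K_cache, V_cache, Wq, Wk, Wv):
        # Only the new token's projections are computed; everything
        # already in the cache is reused instead of recomputed.
        q = x_new @ Wq
        K_cache = np.vstack([K_cache, x_new @ Wk])   # grows one row per token
        V_cache = np.vstack([V_cache, x_new @ Wv])   # this is what adds up to GBs
        scores = K_cache @ q / np.sqrt(q.shape[-1])
        w = np.exp(scores - scores.max())
        w /= w.sum()
        return w @ V_cache, K_cache, V_cache         # attention output + updated cache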
But there's more to agtech than driving a tractor around, a lot of what these big integrated systems do (at the high end) is very data driven -- determining where and how to plant, irrigate, fertilize, etc. There's a lot of integration work beyond just making the tractor drive.
35 years in the tech industry has taught me one thing: incumbents that have been around for a long time are almost always more clueless and more full of shit than you think. What they do isn't as hard as they claim, and you can probably do better in a fraction of the time they spent, just because you don't have legacy systems to worry about and because technology and tooling have moved on.
Incumbents thrive on the myths about what they do being hard and impossible to replicate.
Yes, it is a lot of work to replace what you can get off the shelf today. But it isn't like the basic tech itself is all that hard to replicate step by step, if you accept that it takes time and that the first N development stages will give you something that isn't as feature-rich and polished. And if it's made open source, interoperability becomes easier to address.
Perhaps some of the analysis tools/services you can buy today will be hard to replicate, but I doubt they are that hard. And it is better to accept slightly suboptimal results for a couple of seasons than to end up on the receiving end of a hostage situation.
But yes, it is certainly a huge effort to get what you actually need.
The Pareto principle applies. For highly complex systems it’s easy to build most of what the incumbents have. It’s the last 20% where it is hard to catch up just because the incumbents have decades of a head start and have the momentum. And even more so here because it’s not just software. It’s very science and hardware heavy.
For farming, it's even tougher because the market has a really uneven distribution. Usually the best place to tackle huge incumbents is in the midmarket. They're big enough to need your automation, but they're small enough to take a risk to save some money, and the features you haven't built yet aren't blockers for them.
But there’s basically no midmarket farming, all farms are pretty much either really big or really small.
> But there's more to agtech than driving a tractor around, a lot of what these big integrated systems do (at the high end) is very data driven -- determining where and how to plant, irrigate, fertilize, etc.
How difficult is this to implement outside of big ag-tech? I feel that a community of experienced farmers and programmers (or programmer-farmers) could tackle this.
The machines, from tractor to combine and everything in between, often feed data into one system to produce a holistic understanding.
Things like
- How much fuel was used
- Where your tractors and sprayers drove
- Soil samples and content
- How and where every bit of chemical and fertilizer was applied
- What weather hit your field
- How much you harvested from every bit of the field, and its moisture content
But if you're observing a fleet of 100+ machines you kinda need some integration and a central location. Which in turn connects to multiple other services like weather, crop markets, fuel prices etc.
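For a sense of what that integration might aggregate per machine, a record could look something like this (field names are hypothetical, not any vendor's schema):

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class MachineTelemetry:
        machine_id: str                      # tractor, sprayer, combine, ...
        timestamp: datetime
        lat: float                           # where it drove
        lon: float
        fuel_used_l: float                   # fuel burned since last report
        applied_product: Optional[str]       # chemical/fertilizer applied, if any
        applied_rate_kg_ha: Optional[float]
        harvest_mass_kg: Optional[float]     # combines only
        grain_moisture_pct: Optional[float]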
I think that is a different market than the market for dumb tractors. There might be some overlap, but I doubt the people who want to fix their own tractors are the same people as the corporations that are tracking 100 tractors across hundreds/thousands of fields.
The software is certainly easier to build, but there's a lot of hardware involved here beyond the tractor. Claude is not necessarily going to make it easier to do soil sampling or measuring field conditions or yield outputs.
Farmers would be foolish to rely on an LLM because farming margins are too low to make up for even a small quick mistake. Many farms will profit 1% on investment over 1-2 decades, although year-to-year yield can vary 30%.
What kind of sensors do those cheap kits come with?
A tractor is a big thing to have rolling around unsupervised. I would want a lot of safeguards. Blindly going from one GPS point to another sounds like a nightmare.
The cheapie AliExpress specials simply drive the line they're programmed to drive. They have GPS and a gyro to account for the slope of the land. You're supposed to stay in the tractor while they're operating as a safety measure... but this doesn't always happen in some parts of the world.
30 years ago you had a hand throttle and clamped the wheel to drive the tractor in a line. Using GPS is a little bit safer than that. And I'm talking about Germany!
Right, but that has nothing to do with a vendor making a dumb tractor. Why do we need to dismissively move the conversation away from TFA? The data-driven approach is made up of several parts, and we're looking at a specific part.
Making a dumb tractor for the use-case of dumb tractor is obviously a winning idea.
I just don't think you're going to effectively compete with big agtech by putting a bunch of parts in a box, shaking it, and hoping you end up with a beautifully integrated solution. Integration hell is the reason big commercial firms dominate when it comes to large integrated systems.
Why not? They sell telematics systems separately from cars. It’s possible to do this and it might not be too difficult depending on how the system is composed.
Precision ag is orders of magnitude more complicated of a system than vehicle telematics. Again, driving the tractor is the easy part, and you can already get cheap systems to do this.
Admittedly, I'm not a farmer nor an expert in data-driven farming, but giving a farmer the ability to precisely drive a tractor in a field for planting seeds, applying fertilizer, and all the other steps would be a huge win. The settings used when doing that can easily come from bigFarmData gained from other sources. Can it be used even more precisely when everything is gathered/integrated by one company? That's a question I'm not by default saying yes to, but it seems like you do think that is true. Even if it is true, does it make the difference between a farmer going broke or not because his DIY tractor behaved slightly differently than your solution? I'd posit that a farmer only being allowed to play the bigFarmData game by buying from one expensive vendor, which also forces any repairs to be expensive, will cause farmers to struggle financially unnecessarily.
The economics of farming (at least in the US) are brutal. Scaling up is really the only way to make a living long term. Some of this is due to equipment cost (look up how much a combine costs), and some is due to competition. It's not unusual for a farmer to be land rich and cash poor.
If you want to see a couple of guys learning how to farm from scratch, visit https://www.youtube.com/@spencerhilbert. Spencer and his brother made a bit of money off games and Youtube and have been starting out on corn, hay, as well as raising beef. It gives a pretty good insight into how pervasive tech is in farming, and how despite that, how much of farming still relies on hard, physical work.
I'll check out Spencer's channel. For a comedy perspective, there's Clarkson's Farm or Growing Belushi. Even though they are for entertainment, there's still a lot of info in those shows that shouldn't be written off.
However, I'm not as interested in being a farmer at that level. I'm much more interested in the homesteading aspect of farming. I'm not trying to feed the world as much as me and mine and maybe some extra. So not just farming, but also some ranching with sheep/goats/chickens/pigs. I have friends doing this that I'm keeping an eye on. They had a head start, as their kids grew up in FFA and are already familiar with raising livestock and then having them processed, which makes that part much less daunting.
That would be a correct interpretation. Depending on how "cowboy" you want to go, there's plenty of slang. Raising hamburgers and steaks. Bacon seeds. Lamb chops. Just idiomatic sayings referring to the ultimate end products. I've heard all sorts of things to be cute.
Scale is a huge factor. It makes the most sense to invest in precision ag tech when you have enough acres that the investment pays off. At 5000+ acres, farms are using integrated systems that combine satellite data, on-tractor sensors, soil sensors, drone sensors, and in-field weather sensors, with a lot of science to squeeze the most out of the land. At that scale, there's a lot of money invested in a season and you aren't looking for a DIY project; you need a production-quality product with proven scientific rigor. You probably don't have the manpower to do a DIY project anyway, since you are relying heavily on automation and outsourcing. And at the low end, it is more effort to implement any of this than you'll get out of it.
So a DIY solution is aiming for somewhere in the center of the market -- enough scale that it makes sense to bother, but not enough money to avoid the headache of DIY. It might make sense for some mid-sized farms in developing economies, but it seems like a narrow window to me.
I suspect most farmers would prefer the DIY add-on version of these to the single-manufacturer integrated one. A modern smartphone and a set of I/O sensors seems like it could do pretty much the entire job.
I had to scroll back up to see what this reply was to, to get the full chuckle, and yup, I was told frequently by my male parental unit that the top two reasons for having kids were chores and tax deductions. But there's a reason farm families leaned on the large side: the more hands you had helping, the less hard things could be, while never being easy.
I think people forget how many satellites are pointed at all parts of the planet. They are used for crop reporting and weather and all sorts of shit. It isn't the 1960s, when only the superpowers had them and they dropped rolls of film.
Satellites aren't pointed at "all parts of the planet". They're generally taking regular photos of known locations, when the right type of satellite passes over. That's where you get lucky shots like the one you noticed. Then that satellite has to orbit, and there isn't another one nearby just ready to take another photo. Then the carrier changes direction...
Sure, any single one, but there are many companies, some with hundreds of satellites in orbit at any given time, who will point one wherever you want if you pay them enough.
An aircraft carrier is not that fast; if you see it once, you know roughly what radius of a circle it is going to be in for a while (ignoring the fact that they are likely going somewhere for a reason, and it's not like their job is to stay out of sight).
This is literally the point: it's easy to tell them to point a satellite at Beirut and get pictures every 3 hours or whatever; it's much more difficult to tell them to point at a location in the middle of the Pacific Ocean... because you don't know the location in the first place.
Beirut doesn't move around a lot. Carriers do. While there are a lot of satellites pointing at the earth at any one moment, this isn't some kind of Hollywood super screen showing a real time image of the entire pacific. You just see whatever small patch the satellite happens to be pointing at.
And again, ignoring the part where america would probably start shooting down satellites.
Do you seriously think the US Navy doesn't avoid Chinese tracking? What kind of a question is that? Like, there's probably a magazine that lists the cruising destinations of most of the carriers, what ports they're going to stop at next, etc, because, you know, they're not at war and trying to maintain secrecy.
> Do you seriously think the US Navy doesn't avoid Chinese tracking?
How would they avoid having a Chinese satellite continuously track their movement? They have the capability to do that, and there is nothing the USA can do about it except shoot down all the Chinese satellites.
US carrier groups probably pose the #1 strategic threat to the PRC in the Pacific. You can safely assume they throw whatever resources are necessary at the task of knowing their whereabouts.
I mean, you can try all you want, but there's limits to hiding a fleet of ships on the open sea. They are huge, emit immense heat signatures, and produce miles-long wakes while moving. As long as there are satellites overhead, they will be able to find them.
I suspect we might be talking past one another because we have different degrees of precision in mind: I'm not saying the Chinese could have a missile target lock on a carrier whenever they wanted, much less in wartime. Far from it. But I highly doubt you can reposition a carrier group without them catching wind of it within hours.
This is the sort of arms race that is going to change every year. I just read an article that claimed that China has launched a system of satellites that use non-visual means to track ships in the pacific (via.. emissions or radar or something?) and china can certainly afford to put a bunch of them in orbit.
It's not impossible to track a carrier group via satellites, but it's not trivial either, you can't just, like, open up your windows gui and click on a satellite and click the button that says "follow this carrier" because like satellites orbit and fly around the earth and the ships can alter course when you don't have eyes on them and so on and so forth.
And yeah, as you point out, there's a big difference between having a satellite picture showing a probable carrier group at X and Y coordinates and being able to actually strike the thing.
Now I’m contemplating just how small and light of an instrument could be carried on a Starlink-style satellite that could detect a large ship. A smallish COTS telescope, e.g. a Celestron 8SE ($1700 retail) could easily see a ship from the Starlink constellation altitude.
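Back of the envelope, diffraction limit only (ignoring atmosphere, pointing, and motion blur; the numbers are rough assumptions):

    wavelength = 550e-9   # m, green light
    aperture   = 0.203    # m, roughly an 8" Celestron-class optic
    altitude   = 550e3    # m, roughly Starlink shell altitude

    theta = 1.22 * wavelength / aperture   # diffraction-limited angle, ~3.3e-6 rad
    ground_res = theta * altitude          # ~1.8 m per resolvable element at nadir
    print(ground_res, 330 / ground_res)    # a ~330 m carrier spans ~180 elements

So resolution isn't the problem for an optic like that; revisit rate and knowing where to look are.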
Never mind that the Starlink radio arrays are, well, radio arrays that quite effectively cover the whole planet. If you think of each satellite as a radio telescope, its resolution is crap and probably cannot disambiguate a carrier group from anything else (at least according to disclosed specs). But it would be quite interesting to build a synthetic aperture array out of multiple satellites. This would rely on emissions from the ships themselves, but I bet it could be done and could locate ships quite nicely.
Citation needed. It seems pretty clear that a mechanism to allow a user to access a battery will increase complexity, making all the other properties harder to achieve.
1) iPhones for example are IP68 rated while those are just IPX8/9
2) Do you want to be limited to the universe of those search results? Do you want to buy a Sony Xperia?
You can't make batteries directly replaceable at the same quality and price. There are tradeoffs. Obviously waterproof non-embedded batteries exist. Just like you could make a removable battery as slim as an embedded one. With massive tradeoffs. Its capacity will be terrible. No one is surprised a removable battery can be waterproof, but the point is there are tradeoffs.
I don't see those options in the search results either way
In any case we heard the same sort of rationalization for getting rid of the headphone jack, so color me extremely skeptical. Yes, of course there are going to be trade-offs, but what a coincidence that headphone jacks, replaceable batteries, and SD card slots have all gone by the wayside, which just so happens to allow for upselling Bluetooth and cloud storage.
Kinda weird to argue for longer life via battery replacement and against longer life via contaminant protections. My phone is regularly covered in chalk dust, sawdust, water, …
No, the list was "Cheaper, higher battery capacity, water proof, smaller, stronger". I don't think it's all that controversial to say that there are engineering tradeoffs to be made here. You can make a waterproof phone with a removable battery, but you can't make a waterproof phone with a removable battery that is as good or better than an iPhone in every other respect too. If you could, iPhones would already have removable batteries.
My point is that if it's all of those things (crucially, including cheaper), then it's a Pro-Apple move to manufacture iPhones that way. There would be no downside. To the extent they make anti-consumer moves at all (which I'll cede for the sake of keeping this brief), they do so because those moves are pro-Apple.
Planned obsolescence is anti-consumer and increases sales. So yes, anti-consumer design can increase sales volume; that is often the point.
Replaceable batteries let you use your phone longer, which means people will take longer to buy a new phone, reducing iPhone sales. Such anti-consumer moves require regulation to fix, since there is no incentive for the company to be pro-consumer here.
Oh yes, the famous Galaxy XCover 7 Pro. People are camping out in the rain waiting for their release because replaceable batteries are under such high demand.
So we're moving the goalposts from "these features can coexist" to "such a phone has to be popular"? Why don't you skip to the end and tell me where they're going to end up?
If phones with those features aren't for sale, how does that allow drawing any conclusion about popularity? I've yet to meet a single person who says, "I sure am glad I can't use fingerprint unlock on my iPhone anymore", but obviously it's not worth leaving the entire ecosystem over.
Recall also that building Android phones barely makes any money, so it's not exactly a business teeming with disruption
It really really really depends on how you are using it and what you are using it for.
I can get LLMs to write most CSS I need by treating it like a slot machine and pulling the handle till it spits out what I need, but this doesn't cause me to learn CSS at all.
I find it a lot more useful to dive into bugs involving multiple layers and versions of 3rd-party dependencies. Deep issues where, when I see the answer, I completely understand what it did to find it and what the problem was (so in essence I wouldn't have learned anything diving deep into the issue), but it was able to do so in a much more efficient fashion than me referencing code across multiple commits on GitHub, docs, etc...
This allows me to focus my attention on important learning endeavors, things I actually want to learn and are not forced to simply because a vendor was sloppy and introduced a bug in v3.4.1.3.
LLMs excel when you can give them a lot of relevant context and they behave like an intelligent search function.
Indeed, many if not most bugs are intellectually dull. They're just lodged within a layered morass of cruft and require a lot of effort to unearth. It is rarely intellectually stimulating, and when it is as a matter of methodology, it is often uninteresting as a matter of acquired knowledge.
The real fun of programming is when it becomes a vector for modeling something, communicating that model to others, and talking about that model with others. That is what programming is, modeling. There's a domain you're operating within. Programming is a language you use to talk about part of it. It's annoying when a distracting and unessential detail derails this conversation.
Pure vibe coding is lazy, but I see no problem with AI assistants. They're not a difference in kind, but of degree. No one argues that we should throw away type checking just because it spares you the cognitive load of inferring the types of expressions in your head, as you must in dynamic languages. The reduction in wasteful cognitive load is precisely the point.
Quoting Aristotle's Politics, "all paid employments [..] absorb and degrade the mind". There's a scale, arguably. There are intellectual activities that are more worthy and better elevate the mind, and there are those that absorb its attention, mold it according to base concerns, drag it into triviality, and take time away from higher pursuits.
I agree with your definition of programming (and I’ve been saying the same thing here), but
> It's annoying when a distracting and unessential detail derails this conversation
there are no such details.
The model (the program) and the simulation (the process) are intrinsically linked, as the latter is what gives the former its semantics. The simulation apparatus may be noisy (when its own model blends into our own), but corrective and transformative models exist (abstraction).
> No one argues that we should throw away type checking,…
That’s not a good comparison. Type checking helps with cognitive load in verifying correctness, but it does increase it, when you’re not sure of the final shape of the solution. It’s a bit like Pen vs Pencil in drawing. Pen is more durable and cleaner, while Pencil feels more adventurous.
As long as you can pattern match to get a solution, an LLM can help you, but that does require having encountered the pattern before in order to describe it. It can remove tediousness, but any creative usage is problematic as it has no restraints.
Qua formal system, yes, but this is a pedantic point as the aim - the what - of a system is more important than the how. This distinction makes the distinction between domain-relevant features and implementation details more conspicuous. If I wish to predict the relative positions of the objects of our solar system, then in relation to that end and that domain concern, it matters not whether the underlying model assumes a geocentric or heliocentric stance in its model (that tacitly is the deeper value of Copernicus's work; he didn't vindicate heliocentrism, he showed that a heliocentric model is just as explanatory and preserves appearances equally well, and I would say that this mathematical and even philosophical stance toward scientific modeling is the real Copernican revolution, not all the later pamphleteer mythology).
Of course, in relation to other ends and contexts, what were implementation details in one case become the domain in the other. If you are, say, aiming for model simplicity, then you might prefer heliocentrism over geocentrism with all its baroque explanatory or predictive devices.
The underlying implementation is, from a design point-of-view, virtually within the composite. The implementation model is not of equal rank and importance as the domain model, even if the former constrains the latter. (It's also why we talk about rabbit-holing; we can get distracted from our domain-specific aim, but distraction presupposes a distinction between domain-specific aim and something that isn't.) When woodworking, we aren't talking about quantum mechanical phenomena in the wood, because while you cannot separate the wood from the quantum mechanical phenomena as a factual matter - distinction is not separation - the quantum is virtual, not actual with respect to the wood, and it is irrelevant within the domain concerning the woodworker.
So, if there is a bug in a library, that is, in some sense, a distraction from our domain. LLMs can help keep us on task, because our abstractions don't care how they're implemented as long as they work and work the way we want. This can actually encourage clearer thinking. Category mistakes occur in part because of a failure to maintain clear domain distinctions.
> That’s not a good comparison. Type checking [...]
It reduces cognitive load vis-a-vis understanding code. When I want to understand a function in a dynamic language, I often have to drill down into composing functions, or look at callers, e.g., in test cases, to build up a bunch of constraints in my mind about what the domain and codomain are. (This can become increasingly difficult when the dynamic language has some form of generics, because if you care about the concrete type/class in some case, you need even more information.)
This cognitive load distracts us from the domain. The domain is effectively blurred without types. Usually, modeling something using types first actually liberates us, because it encourages clearer thinking upfront about the what instead of jumping right into how. (I don't pretend that types never increase certain kinds of burdens, at least in the short term, but I am talking about a specific affordance. In any case, LLMs play very nicely with statically-typed languages, and so this actually reduces one of the argued benefits of dynamic languages as ostensibly better at prototyping.)
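A toy illustration of that affordance (Python, purely as an example):

    # Untyped: to learn what this accepts and returns you end up
    # reading callers and test cases to reconstruct the constraints.
    def merge(a, b):
        return {**a, **b}

    # Typed: the domain and codomain are stated at the definition site,
    # so that constraint-gathering happens in the tooling, not your head.
    def merge_typed(a: dict[str, int], b: dict[str, int]) -> dict[str, int]:
        return {**a, **b}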
> As long as you can pattern match to get a solution [...]
Indeed, and that's the point. LLMs work so well precisely because our abstractions suck. We have a lot of boilerplate and repetitive plumbing that is time-consuming and tedious and pulls us away from the domain. Years of programming research and practice have not resolved this problem, which suggests that such abstractions are either impractical or unattainable. (The problem is related to the philosophical question of whether you can formalize all of reality, which you cannot, and certainly not under one formal system.)
I don't claim that LLMs don't have drawbacks or tradeoffs, or require new methodologies to operate. My stance is a moderate one.
> Yes but that’s why you ask it to teach you what it just did.
Are you really going to do that though? The whole point of using AI for coding is to crank shit out as fast as possible. If you’re gonna stop and try to “learn” everything, why not take that approach to begin with? You’re fooling yourself if you think “ok, give me the answer first then teach me” is the same as learning and being able to figure out the answer yourself.
I would consider this a benefit. I've been a professional for 10 years and have successfully avoided CSS for all of it. Now I can do even more things and still successfully avoid it.
This isn’t necessarily a bad thing. I know a little css and have zero desire or motivation to know more; the things I’d like done that need css just wouldn’t have been done without LLMs.
Has apple been a serious development platform in the last 20 years?
I know a lot of devs like Apple hardware because it is premium, but OSX has always been "almost Linux", controlled by a company that cares more about iTunes than it does about the people using their hardware to develop.
At least 9 out of every 10 software engineers I know do all their development on a Mac. Because this sample is from my experience, it's skewed to startups and tech companies. For sure, lots of devs outside those areas, but tech companies are a big chunk of the world's developers.
So yea I would say Apple is a “serious development platform” just given how much it dominates software development in the tech sector in the US.
The hardware for a Linux laptop right now is not great. Especially for an arm64 machine. Even if the hardware is good the chassis and everything else is typically plastic and shitty.
That is a surprising sentiment. Most dell and Lenovo laptops work just fine and are usually of reasonably good build quality (non-plastic chassis etc.).
arm64 is however mostly bad. The only real contenders for Linux laptops (outside of Asahi) were Snapdragon's chips, but the HW support there was lacking iirc.
They give us Dell Linux machines at work. They suck so bad and we have so many problems. Overheating, the camera is terrible, performance is bad relative to the huge weight of the device. Everything is a huge step down from Macs.
Whenever I see Linux people comparing Linux and Mac I'm amazed at the audacity. They are not in the same league. Not by a mile. Even the CLI is more convenient on the Mac which is truly amazing to me.
How is the Mac CLI more convenient? There isn't even a package manager in the box, they ship loads of old outdated tools too. Plus there's the whole BSD/GNU convention thing you have to watch out for.
I don't find my ThinkPad running Linux overheats, nor is it particularly heavy. And performance is comparable to the similarly priced MBP at the time. Camera sucks, but compared to my Surface so do the Macs...
Recently an article on the HN front page was about a guy who had to file down his MBP because the front edge of it was too sharp and resting his wrists on it hurt his hands. At least two people in the comment section noted how the sweat on their hands over time caused the sharp edge of the MBP chassis to pit and turn into a serrated edge that actually cut their hands.
You can say other laptops are "plastic and shitty" all you want, but Apple's offerings aren't necessarily the best thing out there either. I personally like variety, and you don't get that from Apple. I can choose from hundreds of form factors from a lot of vendors that all run Linux and Windows just fine, plastic or not.
I have a personal Framework 13 and a work-issued MacBook Pro. I love Framework’s mission of providing user-serviceable hardware; we need upgradable, serviceable hardware. However, the battery life on my MacBook Pro is dramatically better than on my Framework. Moreover, Apple Silicon offers excellent performance on top of its energy efficiency. While I use Windows 11 on my Framework, I prefer macOS.
Additionally, today’s sky-high RAM and SSD prices have caused an unexpected situation: Apple’s inflated prices for RAM and SSD upgrades don’t look that bad in comparison to paying market prices for DIMMs and NVMe SSDs. Yes, the Framework has the advantage of being upgradable, meaning that if RAM and SSD prices decrease, then upgrades will be cheaper in the future, whereas with a Mac you can’t (easily) upgrade the RAM and storage once purchased. However, for someone who needs a computer right now and is willing to purchase another one in a few years, then a new Mac looks appealing, especially when considering the benefits of Apple Silicon.
>>At least 9 out of every 10 software engineers I know does all their development on a mac
I work in video games, you know, industry larger than films - 10 out of 10 devs I know are on Windows. I have a work issued Mac just to do some iOS dev and I honestly don't understand how anyone can use it day to day as their main dev machine, it's just so restrictive in what the OS allows you to do.
It makes sense that you use Windows in a video game company. We use windows as well at work and it's absolutely awful for development. I would really prefer a Linux desktop, especially since we exclusively deploy to Linux.
I work as a consultant for the position, navigation, and timing industry and 10 of 10 devs were on Windows. Before that I worked for a big hollywood company and while scriptwriters and VP executive assistants had Macs, everyone technical was on Windows. Movies were all edited and color graded on Windows.
>it's just so restrictive in what the OS allows you to do.
The people using them typically aren't being paid to customize their OS. The OS is good for if you just want to get stuff done and don't want to worry about the OS.
I compile a tool we use, send it to another developer, they can't open it without going through system settings because the OS thinks it's unsafe. There is no blanket easy way to disable this behaviour.
We also inject custom dylibs into clang during compilation, and starting with Tahoe that started to fail - we discovered that it's because of SIP (System Integrity Protection). We reached out to Apple and got the answer that "we will not discuss any functionality related to operation of SIP". Great. So now we either have to disable SIP on every development machine (which IT is very unhappy about) or re-sign the clang executable with our own dev key so that the OS leaves us alone.
If it's being sent to another developer then asking them to run xattr -rd com.apple.quarantine on the file so they can run it doesn't seem insurmountable. I agree that it's a non-starter to ask marketing or sales to do that, but developers can manage. Having to sign and then upload the binary to Apple to notarize is also annoying but you put it in a script and go about your day.
If SIP is kicking in, it sounds like you're using the clang that comes with Apple's developer tools. Does this same issue occur with clang sourced from homebrew, or from LLVM's own binary releases?
Yes, it kicks in even with non-Apple-supplied clang (most notably, with the clang supplied as part of the Android toolchain, since we sometimes build Android on macOS, and having to re-sign the Google-supplied clang with our own certificate is now a regular thing every time an update is released).
Because... it's official behaviour that is fully supported by clang? If you want to add a hook on compilation start, it's literally the documented way - you include your own dylib with the necessary overrides and then you can call your own methods at each compilation step. Not even sure how you'd do it with a shell script? You need to have knowledge of all the compilation and linking units, which... you have from within clang.
It is a weird situation. Apple products are consumer products but they make us use them as development hardware because there is no other way to make software for those products.
> Has apple been a serious development platform in the last 20 years?
This is one of those comments that is so far away from reality that I can’t tell if it’s trolling.
To give an honest answer: Using Macs for serious development is very common. At bigger tech companies most employees choose Mac even when quality Linux options are available.
I’m kind of interested in how someone could reach a point where they thought macs were not used for software development for 20 years.
> I’m kind of interested in how someone could reach a point where they thought macs were not used for software development for 20 years.
If you work with engineering or CAD software then Macs aren't super common at all. They're definitely ubiquitous in the startup/webapp world, but not necessarily synonymous with programming or development itself.
Most "serious" companies do not support Linux in their IT infrastructure. I've begged to run Linux, but it's a hard no from IT. They only support Windows and MacOS, and that's all. So I choose a Windows desktop, because I am not a fan of Apple. Having been forced to use Macs in past jobs, I'll choose Windows every time. I liked being able to dual-boot Windows on a MBP in the past, but that is no longer an option.
Anything being developed for the Apple ecosystem requires use of the Apple development platform. Maybe the scope could be called "unserious," but the scale cannot be ignored.
However having used Xcode at some point 10 years ago my belief is that the app ecosystem exists in spite of that and that people would never choose this given the choice.
For me at least, not being Linux is a feature. Linux has always been “almost Unix” to the point where now it has become its own thing for better or worse. OS X was never trying to be Linux. It would be better if we still had a few more commercial POSIX implementations.
That is fair, but in my experience most devs are targeting Linux servers, not BSD (or any other flavour), and OSX helps with that. If OSX were Linux-derived it would suit them just as well.
edit: I suppose I should also note the vast majority of people developing on mac books (in my experience anyway) are actually targeting chrome.
> Turns out, an operating system is more than just a kernel with some userspace crap tacked on top, unlike what Linux distros tend to be.
This is also my opinion of OSX, let's not pretend that the userland mess is the most beautiful part of OSX.
Apple has great kernel and driver engineering for sure, but once you go up the stack, it's duct tape upon duct tape, and you'd better not upgrade your OS too quickly before they fix the next pile they've just added.
Heterogeneity is the feature. The Linux ecosystem is better off for it (systemd, Wayland, dconf, epoll, inotify are all based on ideas that were in OS X first), and not being beholden to Linux is a competitive advantage for Apple; everyone wins.
> Has apple been a serious development platform in the last 20 years?
i dont think anyone asks this question in good faith, so it may not even be worth answering. see:
> I know a lot of devs like Apple hardware because it is premium, but OSX has always been "almost Linux", controlled by a company that cares more about iTunes than it does about the people using their hardware to develop.
yea fwiw macs own for multi-target deployments. i spin up a gazillion containers in whatever i need. need a desktop? arm native linux or windows installations in utm/parallels/whatever run damn near native speed, and if im so inclined i can fully emulate x86/64 envs. dont run into needing to do that often, but the fact that i can without needing to bust out a different device owns. speed penalty barely even matter to me, because ive got untold resources to play around with in this backpack device that literally gets all day battery. spare cores, spare unified mem, worlds my oyster. i was just in win xp 32bit sp2 few weeks ago using 86box compiling something in a very legacy dependent visual studio .net 7 environment that needed the exact msvc-flavored float precision that was shipping 22 years ago, and i needed a fully emulated cpu running at frequencies that was going to make the compiler make the same decisions it did 22 years ago. never had to leave my mac, didnt have to buy some 22 year old thinkpad on ebay, this thing gave me a time machine into another era so i could get something compiled to spec. these techs arent heard of, but its just one of many scenarios where i dont have to leave my mac to get something done. to say its a swiss army knife is an understatement. its a swiss army knife that ships with underlying hardware specs to let you fan out into anything.
for development i have never been blocked on macos in the apple silicon era. i have been blocked on windows/linux developing for other targets. fwiw i use everything, im loyal to whoever puts forth the best thing i can throw my money at. for my professional life, that is unequivocally apple atm. when the day comes some other darkhorse brings forth better hardware ill abandon this env without a second thought. i have no tribalistic loyalties in this space, i just gravitate towards whoever presents me with the best economic win that has the things im after. we havent been talking about itunes for like a decade.
Apple had real Unix a decade before Linux, a bad Unix copy, was made. NeXTSTEP was much better than the Linux crap. "A budget of bad ideas" is what Alan Kay said about Linux [1], and he invented the personal computer.
My 1987-1997 ISP was based on several different Unixes running on Apple hardware, probably long before you were born.
Yes, I ran it also on 68000 and PowerPC Macs. I preferred MacOS with all the MPW environment and tools on top, the GUI was much better: a full WYSIWYG text editor that also was the command line, so you could compose text, copy and paste and also execute it. But that was invented with the workspace in Smalltalk-76 and recreated with MPW.
Email me if you need help restoring it on your Mac, or if you need parts to revive your hardware. I have at least one of every Mac since 1982 (yes, I know the Lisa was introduced in January 1983) including all floppies, CD-ROMs, books, screens, keyboards, mice, AppleTalk. Although some parts have rusted or decayed beyond repair. I hope someday somebody will buy the whole museum from me.
The best quality Unix we ran was BSDi, you'll find some of that still in NetBSD, OpenBSD and maybe FreeBSD.
The coolest Unix was IRIX, though, and that was because of the graphics code, not the Unix kernel.
I mean, there were mainframes which could be described as that. IBM just fixed it in hardware instead of software, so it's not like it was an unknown field.
Even if that were actually true (it’s not in important ways) Google showed you could do this cheaply in software instead of expensive in hardware.
You’re still hand waving away things like inventing a way to make map/reduce fault tolerant and automatic partitioning of data and automatic scheduling which didn’t exist before and made map/reduce accessible - mainframes weren’t doing this.
They pioneered how you durably store data on a bunch of commodity hardware through GFS - others were not doing this. And they showed how to do distributed systems at a scale not seen before because the field had bottlenecked on however big you could make a mainframe.
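The programming model itself is tiny; the hard part they contributed was the fault-tolerant scheduling, partitioning, and storage (GFS) underneath it. A toy, single-process word-count sketch of just the model, with none of the distributed machinery:

    from collections import defaultdict

    def map_fn(doc):                 # user-supplied map: doc -> (key, value) pairs
        for word in doc.split():
            yield word, 1

    def reduce_fn(key, values):      # user-supplied reduce: key, [values] -> result
        return key, sum(values)

    def mapreduce(docs):
        # The real system shards map tasks across machines, shuffles the
        # intermediate pairs, and re-runs anything that fails.
        groups = defaultdict(list)
        for doc in docs:
            for k, v in map_fn(doc):
                groups[k].append(v)
        return dict(reduce_fn(k, vs) for k, vs in groups.items())

    print(mapreduce(["the cat sat", "the dog sat"]))  # {'the': 2, 'cat': 1, ...}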