gettingoverit's comments

Dear Noam,

I remember your nickname from your Infernu project. It was one of the best attempts at typechecking JavaScript I've ever seen, and with the TS team going off the rails these days, it's as needed now as it has ever been.

The things we don't really need more of are editors, CLI software, or Rust software.

Please reconsider your goals.


Okay, this will be a bit on the conspiracy-theory side, but there was a paper recently describing how to do matrix computations on a RAM stick connected to an FPGA, and they showed it can be done cheaper per FLOP than on GPUs. And unlike GPUs, of course, there's a whole variety of RAM producers.

So either it's artificial, meant to keep GPU prices where they are, or someone has already started building their RAM-based AI datacenter.


Oh, that smell of molten keyboard plastic, those yellow spots burned into a display by its own heat exhaust, those laser-machined loudspeaker holes next to the keyboard, all filled with grime! How I miss that time on a Macbook, with all the chords you have to press whenever you need a Home or End key to edit a line! Not to mention the power button right next to Backspace.

It's so rewarding when its charger dies in a month, and you feel superior to your colleague, whose vintage six-month-old charging cable, with none of that extraneous rubber next to the connector, catches fire along with your office. What a time to be alive!

The best part is the motherboard, built to fail from moisture within a couple of years: all the uncoated copper, the 0.1mm-pitch debug ports that short-circuit from a single hair, and a whole Louis Rossmann YouTube channel's worth of other hardware features meant to remind you to buy a new Apple laptop every couple of years. How else would you be made to replace the whole laptop, if not for all the walls around repair manuals and parts? You just absolutely have to love the fact that even transplanting chips from other laptops won't help, thanks to all the overlapping hardware DRM.

I'll go plug the cable into the bottom of my wireless Apple mouse, and remind myself of all the best times I had with Apple's hardware. It really rocks.


> a whole Louis Rossmann YouTube channel's worth of other hardware features meant to remind you to buy a new Apple laptop every couple of years

Apple have a couple of extra mechanisms in place to remind us to buy a new device:

- On iOS, the updates are so large they don't fit on the device. This is because they purposely put a small hard drive in it. It serves a second purpose: people will buy Apple cloud storage because nothing fits locally.

- No longer providing updates to the device after just a few years when it's still perfectly fine. Then forcing the app developer ecosystem to target the newer iOS version and not support the older versions. But it's not planned obsolescence when it's Apple, because they're the good guys, right? They did that 1984 ad. Right guys?


> No longer providing updates to the device after just a few years when it's still perfectly fine.

This is a weird one to complain about, because Apple leads the industry in supporting older devices with software updates. iOS 26 supports devices going back to 2019. And just last month they released a security update for the iPhone 6S, a model released a full decade ago.

The oldest Samsung flagship you can get Android 16 for is their 2023 model (Galaxy S23), and for Google the oldest is the 2021 model (Pixel 6).


We’re moving away from hardware and into software and longevity in this discussion, but wrt “Apple leads the industry in supporting older devices with software updates”, I would point out that Red Hat is probably more of a beacon / industry leader here, as the main promise of RHEL is 10 years of support and updates. But again, we don’t ship hardware, so I see the narrower sense you’re making. Still, I’d push back on the idea that giant companies cannot keep complicated legacy code bases secure and functional, in most cases for about 2x longer than what Apple has done.


The main problem, and not just with Apple, is that as phone tech gets standardized and longer-lasting, the software support cycles have not gotten longer.

It is abysmal that Android phone makers still need to customize the OS so much for their hardware. Apple has no incentive for longer support cycles if Android does even worse on it.


It has always been like that since CP/M and commercial UNIX days.

Vertical integration is how everyone sells a product, a brand, a whole ecosystem experience.

If all OEMs sold the same CP/M, UNIX, MSX, MS-DOS, or Windows software stack, on what is basically the same hardware with a different name glued on the case, they wouldn't get any brand recognition, aka product differentiation.

Thus OEM-specific customisations get added: back in the day bundled software packages were part of the deal, nowadays they come preinstalled on the OS image, and so on.


"You cheated on me last night!"

"This is a weird one to complain about, look at Donnie, he cheated on his girlfriend 3 times last month!"


I don't get it. How long do you think is reasonable?


I tend to look at technology prices in terms of cost per unit time of useful life.

If Apple continues to supply updates for six-year-old phones, iPhone 17 prices range from $11/month (base model iPhone 17) to $28/month (iPhone 17 Pro Max w/2TB storage), meaning it's only about 20% more expensive to store data on a RAID 10 array of iPhone 17 Pro Maxes running current iOS versions than on standard-tier S3 (not a relevant comparison, obviously, but it amuses me).
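
For the amusement value, here's a rough sketch of that arithmetic, with my own assumptions filled in (a $1,999 2TB iPhone 17 Pro Max, a 72-month update window, and S3 Standard at roughly $0.023/GB-month; none of these figures are authoritative):

    // Rough sketch: iPhone RAID 10 vs. S3 Standard, per usable TB-month.
    const phonePrice = 1999;           // USD, 2 TB Pro Max (assumed)
    const supportMonths = 6 * 12;      // assumed 6-year update window
    const phonePerMonth = phonePrice / supportMonths;   // ~ $27.76/month

    // RAID 10 mirrors every drive, so usable capacity is half of raw:
    // each 2 TB phone effectively contributes 1 TB of usable storage.
    const usableTbPerPhone = 2 / 2;
    const phoneCostPerTbMonth = phonePerMonth / usableTbPerPhone;

    const s3CostPerTbMonth = 0.023 * 1024;              // ~ $23.55/TB-month (assumed rate)
    const premium = phoneCostPerTbMonth / s3CostPerTbMonth - 1;
    console.log(`iPhone RAID 10 is ~${Math.round(premium * 100)}% pricier than S3`);
    // prints roughly 18-20%, i.e. the "about 20% more expensive" above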

So I don't know what's reasonable, but Apple's policies certainly appear to be.

I'm still salty that Apple no longer offer battery service on my OG Apple Watch, however, so reason has its limits.


Suppose you always want to be running the latest iOS release, but you want to replace your phone as infrequently as possible. You would "only" have to have purchased 4 iPhones since 2007:

    | Model     | Launch date        | Obsoleted by | Price 
    |-----------|--------------------|--------------|------
    | iPhone    | June 29, 2007      | iOS 4        | $399 (*price cut)
    | iPhone 4  | June 24, 2010      | iOS 8        | $599
    | iPhone 6  | September 19, 2014 | iOS 13       | $649
    | iPhone 11 | September 13, 2019 | -            | $699
Adjusted for inflation, the total for these phones is $3,287 excluding carrier contracts. Assuming the iPhone 11 will be obsoleted by iOS 27 in September 2026, this costs you about $14.29/mo.
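
Back-of-the-envelope version of that last figure, taking the inflation-adjusted total above as given and assuming the September 2026 cut-off:

    // Inflation-adjusted total for the four phones in the table above.
    const totalUsd = 3287;

    // Ownership window: from the original iPhone's launch (June 29, 2007)
    // to the assumed obsolescence of the iPhone 11 in September 2026,
    // i.e. roughly 19 years and 2 months of always running a current iOS.
    const months = 19 * 12 + 2; // = 230

    console.log(`$${(totalUsd / months).toFixed(2)}/month`); // -> $14.29/month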


I was a long time Android user - but I realised I was getting through 2 or more phones in the time my wife had one. They'd either become obsolete or just die. I reluctantly bought an iPhone on this basis - it's actually going to work out cheaper if I get 5 or 6 years out of it.

However, I find the iPhone keyboard so bad and the settings concept so muddled that I'm going to return to Android when this experiment is over. Probably not for another 4 years though!


If we're talking anecdotes, my wife changes her iPhone every 4 years because it gets worse and worse. Daughter does the same. I change my Galaxy every 4 years because it gets worse and worse as well. Not sure how some people can say their <insert beloved brand> holds forever, unless they don't really use it of course. No brand really keeps up with the requirements, unless all you do is make phone calls - which is why my dad still has a Sony Ericsson.


You know you can sell and replace your phone if you don't like it. Recent Pixels have 7 years of support, and they don't die. That's what I'd recommend you get instead. You can even trade in your iPhone for up to $700 when you buy a Pixel. You really don't need to force yourself to use a phone you don't like, let alone for that long.


Technically not new updates, but if you hook up a PowerPC Mac with 10.4 Tiger on it, you can still get it updated to the latest version released, 10.4.11.


I demoed that exact feature (though on 10.5) not so long ago and people didn’t believe me…!


The part that really gets me is that the price per GB to go from a 256 to a 512 GB iPhone is $2.54 (since the next storage option up costs $200 total). Two and a half dollars!!! A 512 GB micro SD would run you $0.10/GB. They have been charging 25x the market rate for storage on a device with no expandable storage at all for years. Baffling that they aren't called on it more. It should be criminal.


I had the 2019 cheesegrater Mac Pro. 7TB (going from 1 to 8) would cost me $3,000.

So I bought a 4x M.2 PCIe card and four 2TB Samsung Pro SSDs for $1,100, and as a result got 6.5GB/s versus the onboard 1TB drive's 5GB/s.

Same with memory. 160GB (32 to 192GB) from Apple was also around $3K. OWC sold the exact same memory chips, manufacturer, spec, speed, for $1,000 for 192GB.


I recently found my ipad mini 2 (released in 2013) that had been boxed up when I moved a few years ago. After charging up the battery and booting it up, I checked for system updates. The latest system available for it was ios 12.5.7, released in 2023. It loaded fine, and I’ve been using the mini as an ereader ever since – the screen is fine, and wifi works.


A Macbook is the only Apple device I have in my entire array of computers and computer-related stuff, so I've got plenty of points of comparison. While Apple's hardware design isn't perfect, all of what you bring up seems wildly blown out of proportion to me. I can say I've never seen anyone with molten keyboards and displays. I've used the charger cable on my main charging brick for about five years now, and it's still going strong, despite being used for charging everything everywhere. And while Apple has committed many sins in terms of doing their absolute best at preventing anyone from touching their sacred hardware (we just need DRMed cables and enclosures to complete the set), this only affects repair. In terms of planned obsolescence, Macbooks statistically don't seem much less reliable than any other laptops on the market. They make up a majority of the used laptop market where I am.

And of course, you just had to bring up the whole mouse-charger thing. Apple updated their mouse once, replacing the AA compartment with a battery+port block in the same spot so they could reuse the old housing, and a decade later people still go on about the evil Apple designers personally spitting in your face, for whatever reason sounds the most outrageous.


Apple produced at least three mice that were very different and terrible in different ways. Their laptops are good, but don't waste your time defending their other peripherals.


Apple's unwillingness to admit that one button isn't enough is legendary. They added a fucking multi-touch area to the fucking mouse because that's apparently easier to use and more efficient. It's funny as hell.


I've barely ever tried them, but I've never liked the shaping of any that I have held, and I don't think that the touchpad addition justified the discomfort that it causes in all other use cases. That being said, the whole "Apple added the charging port on the bottom to be evil and prevent you from using the mouse" thing had become such an entrenched internet fable over the last decade that it's impossible for me to come by it and not comment on it. I'll clarify that no one but the designers themselves knows the original intention, but since it's the exact same design as the AA model, just with internal changes, it seems like an open-and-shut case.


We do know the intention though. Apple thinks a mouse with a cable looks messy and ugly, so they made the mouse charge fast and put the port on the bottom. That made it impossible to use whilst charging, but you could get 2 hours of use out of like 10 minutes of charging. The end result Apple hoped for was people always seeing the mouse on the desk, cableless, charged.

I'm surprised it came out during the Jobs era because he strongly believed in "form follows function".


Again, this is something that's often repeated all over the internet, but there is no source for this, it's just speculation - and fairly unconvincing speculation at that, since it has to go so far in assigning the designers these strong opinions and unwillingness to compromise just for it all to make sense. I feel like what I proposed is a far simpler and more straightforward explanation. Occam's razor and all. Just look at what the mouse looked like through its generations[1]. When redesigning it, they obviously just took out the single-use battery compartment and replaced it with a section containing the rechargeable battery and the charging port. In fact, they really couldn't have done it any other way, because the mouse is so flat that its top and bottom sides taper all the way to the desk, with no space for a charging port. So, when making the gen 2 model, just putting the port where it is was probably a far simpler drop-in solution that saved them from having to redesign the rest of the mouse.

[1] https://cdn.shopify.com/s/files/1/0572/5788/5832/files/magic...


> I'm surprised it came out during the Jobs era because he strongly believed in "form follows function".

The Jobs era of Apple had a ton of pretty but less functional designs. Jobs is quoted as saying that, but he was full of it. He didn't actually practice that philosophy at all.


I’ll admit to owning one and I use it.

The charging port location is weird and stupid, but I have never needed to charge it while I am using it. When it hits about 15%, I plug it in at the EOD and don’t have to charge it again for maybe a month. I am a neat freak and you have to look hard to see any cable on my desk rig.

The multi touch stuff works fine for me, but perhaps I am just used to it.

The only complaint I have is the shape, it’s not “comfortable” to use. Easily addressed by a stupid 3D printed humpback add on that changes the aesthetic but makes it comfortable for me to use. I shouldn’t have to buy a mouse accessory…but I did.

Here is the thing though…it’s just a mouse. I point, I click, then I move my hand back to the keyboard. It’s fine. While I’m sure there is a better designed one out there, is any mouse truly revolutionary?


>Apple added the charging port on the bottom to be evil

I don't think anyone does anything "to be evil".

But clearly they had a choice between what was good for the user (being able to use the mouse while charging) and what suited their aesthetic, and they chose the latter. Open-and-shut case, indeed.


That's Apple for you. Any time there's a conflict between aesthetics and user friendliness, aesthetics will always win out.


“I'll clarify that no one but the designers themselves knows the original intention, but since it's the exact same design as the AA model, just with internal changes, it seems like an open-and-shut case.”

“Legendary attention to detail”

Indeed, it is pretty open-and-shut.


Which is really funny, since the Microsoft mice (only a few are left) and keyboards (discontinued) are by far some of my favorite peripherals.

On the Apple mouse side, I got a white corded mouse with the tiny eraser-looking mousewheel back around 2003 or so, and it's still in use today with an M4 Mac mini. Works like a champ. The keyboard from that era is also still in use, daily, in our family.


I daily drive the Microsoft Touch Mouse, have for 10+ years. It is by far my favorite piece of hardware. I've never seen another one used in the wild, which might explain why they discontinued it.


The remote controls for Apple TV are among the all time worst peripherals I have ever used. Remotes aren’t hard. They reinvented the wheel by making it rectangular.


To be fair, since the Logitech Harmony One went EOL there hasn't been a decent remote available from anyone.


There was a third-party battery module[1] for the original AA Magic Mouse that would allow it to charge wirelessly, a feature that Apple somehow still has not managed to steal!

[1] https://techpp.com/2011/04/19/mobee-magic-charger-for-magic-...


> How I miss that time on a Macbook, with all the chords you have to press whenever you need a Home or End button to edit the line!

???? ctrl+a and ctrl+e? That works on most Linux setups, too. Only Microsoft screws that up. I love how in Mac Office apps, Microsoft also makes ctrl+a and ctrl+e do what they do in windows lol.


Can you be specific about your bad experiences with Apple hardware? I've gone through 5 MacBook Pros since 2008 and my only complaint was the old Intel models always got too hot. Nothing ever broke on them and I guess I kept them relatively clean?

I also have all of the adapters that came with the MBPs too, all perfectly functioning, the oldest still attached and powering my 2013 model with the dead battery (2008 model was sold, still working). The magsafe cable is pretty yellow now, and maybe a little wonky from the constant travelling, but no fraying/fire hazard yet.


Was there a Gateway that did better?


n=4 but my niece spilled a whole cup of milk and a whole cup of matcha on my M2 (twice on 1 device). I just flipped it up, dried it out with a hair dryer (apparently shouldn't do that) and it still works 2 years later.

Can't relate to what you're saying, had 4 MacBooks, and many PCs too.


Also they leak charge onto the case.


Any properly grounded device will do that with specifically incorrect electrical wiring and/or a shoddy charger. Did this happen with a properly wired outlet, and an undamaged Apple charger?

I have doubts that it did, as that would warrant a safety recall.


Can confirm it does happen. UK, both on my ThinkPad and a friend's MacBook when plugged in. It's a somewhat unavoidable side effect of the switching AC adapter designs - the output is isolated from the mains, but there is a tiny leakage current that can sometimes be felt as a "vibration". This is completely safe (far below the currents needed to cause harm) and no recall is needed.


Thank you. I always felt this vibration and wondered what it was.


If you replace the two prong plug on the AC adapter for a three prong cable, your MacBook case will be properly grounded and you won’t feel any vibration.


Cast aside your doubts, I've been to different parts of Europe a few times with different, healthy MBPs (I buy a new one every 4-5 years) with healthy adapters.

Plugging into the wide EU outlet with the Apple-manufactured plug from the "World Travel Adapter Kit" can lead to an uncomfortable "vibration" that you feel when you touch the top case, depending on the hotel/airbnb. Whenever I visit, I expect to have to charge while I'm not using the device.


In researching why it was happening to me, I found sufficient forum posts complaining about it that it seems to be commonplace.

I have my doubts that Apple would admit enough to perform a safety recall given the issues they've had with their garbage chargers in the past. Other companies have no problems with building hardware that lasts. Apple seem to prefer their customers pay for replacements.


Found this out one time when I went to touch my MBA and it was like I stuck my finger in a light socket.


>Oh, that smell of molten keyboard plastic, those yellow spots burned into a display by its own heat exhaust, those laser-machined loudspeaker holes next to the keyboard, all filled with grime! How I miss that time on a Macbook, with all the chords you have to press whenever you need a Home or End key to edit a line! Not to mention the power button right next to Backspace. It's so rewarding when its charger dies in a month, and you feel superior to your colleague, whose vintage six-month-old charging cable, with none of that extraneous rubber next to the connector, catches fire along with your office. What a time to be alive!

None of the above sounds like anybody's actual experience. Which is also why Macs have the biggest resale value retention compared to PC laptops, and the biggest reported user satisfaction.

Now, if you were about the lack of ports (at least for a period) or the crappy "butterfly" keyboard (for a period), you'd have an actual point.

Home/End is just Control-A/E.

Never seen "molten keyboard plastic". I'm sure you can find some person who has that somewhere on the internet. I doubt it's a problem beyond some 0.0001% rare battery failures or something like that.

"yellow spots burned into a display with its own heat exhaust". Not sure what this even means. Especially AS Macs don't even get hot. I haven't heard the fan ever, and I use a M1 MBP of 5+ years with vms and heavy audio/video apps.

"when its charger dies in a month" is just bs.


Staingate?

I had a GPU issue that was the subject of a recall matching my symptoms precisely (and I could make the MBP core dump on demand at the Genius Bar), but: "recall declined, does not fail diagnostics".

Damaged charging circuit on an MBA. Laptop worked perfectly. Battery health check fine. Just could not charge it. "That will be a $900 repair. Maybe we can look at getting you into a new Mac?" (for one brief moment I thought they were going to exchange mine... no, they wanted me to buy one. And of course, my MBA couldn't be traded in because it was damaged...).

I've also had multiple Magsafe connectors fray to the point of becoming like a paper lantern with all the bare wire visible, despite the cable being attached to a desk with cable connectors so there was near zero cable stress (and often only plugged/unplugged once a week).


While I was in law school, every student who had an Apple laptop had to get their laptop replaced at least once (some multiple times) over the course of our program. The biggest problem was the bulging keyboard, due to the bulging battery, but there were also numerous issues with displays and with chargers not lasting very long. Most chargers lasted at least a semester, but few of the Apple chargers lasted an entire school year. They simply weren't designed with durability in mind. Quite humorously, after one student's laptop keyboard began bulging during Torts, the professor began an impromptu lecture on product liability law.

The only PC laptops that were replaced were the ones that got damaged in accidents (car accidents, dropped off a balcony, used as a shield in self defense during a robbery, etc.). Dell Latitudes of that era were sturdy, and not noticeably heavier than their fragile Apple counterparts.


Oh, it's not a problem, until you're a Kuznetsov-father trying to get on a plane with your Kuznetsova-daughter, and you become a kidnapping suspect until you prove it's actually your daughter. After you get your mortally scared daughter back from the authorities, you probably start thinking that having the same surname is something you actually need.

And no, US authorities won't make it easy for you or her.


I should remind everyone that, in a similarly cheerful mood, FB dropped support for Jest and a bunch of other libraries. They have a long history of killing successful projects.

Worse, Vercel is involved, and I literally don't remember anything good about that company.

I'd recommend being very cautious with such news, and using older versions of React for the next couple of years.


Vercel is already heavily involved, take a look at the core team:

https://react.dev/community/team

This announcement mentions they are separating business and technical governance, I suspect they are trying to limit Vercel's influence, and prevent them from taking it in a direction that only benefits them.


Jest is the most popular JS testing framework. It's wildly inaccurate to say it was killed.


Looks to me like it will be dead soon if they don't figure out how to handle ESM imports. More and more libraries have stopped packaging CommonJS for their new versions. I've been bitten first by d3, then by graphql-request (now graffle), then by msw, then by faker-js. Faker-js, for god's sake! They write in their docs that since version 10 they are incompatible with Jest [0]. Jest seems to be going the way of Enzyme and the dodo.

The maintainer of MSW has been screaming for years for people to drop Jest [1].

[0] - https://github.com/faker-js/faker/blob/428ff3328b4c4b13ec29d...

[1] - https://x.com/kettanaito/status/1746165619731382351#m


Man, we started with Jest tests for our React Native app half a year ago and now we should already drop it? What should we use instead?? Vitest? How's the compatibility? I'm so exhausted man, glad I'm quitting JS dev soon hopefully.


Vitest, yes. Compatibility with jest is great.

The ultimate win, of course, would be to use the native Node test runner. See the source of the Node.js website; I think they have pulled it off despite running a real Node.js app.
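
If you haven't looked at it, here is roughly what a minimal test file looks like with the built-in runner (run with `node --test`; the file name and test cases are just an illustration):

    // math.test.mjs - executed by: node --test
    import { test } from 'node:test';
    import assert from 'node:assert/strict';

    test('adds numbers', () => {
      assert.equal(1 + 1, 2);
    });

    test('rejects on bad input', async () => {
      // assert.rejects awaits the promise and checks the error message.
      await assert.rejects(async () => {
        throw new Error('bad input');
      }, /bad input/);
    });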


The problem with Vitest is that there's no first-class caching, if I recall correctly.


I think keeping it unsupported for a couple of years, and reluctantly pushing it off to volunteers who barely have enough technical experience to support it is quite close to "was killed".

Until recently Jest had a bug that made it crash due to sl (yes, the famous steam locomotive) running under the hood. This gives a hint at, ahem, the sophistication of its architecture.

The project is long past its EOL, and the only reason it's still used is inertia, the jQuery kind of it.


Any idea what people have generally moved on to? Currently using Jest, but it's definitely showing its warts often and is pretty slow. Curious if there is an obvious successor.


Vitest is the incumbent I would say, but there seems to be a lot of momentum behind the runtime-builtin test runners recently. Bun is gaining traction like nothing else, and node has put a lot of work into the test builtins lately.


Node's test runner is a non-contender, at least right now.

If you've ever used any other test runner, you'll find Node's is woefully inferior. I'd say "but maybe it will get better", except I've seen the maintainer responses to several issues, and it seems they are wedded to bad architectural decisions that keep it that way.


The node test runner is perfect for small libraries without build where you pretty much ship the source code. The assertion library is actually superior to vitest’s if you don’t use spies etc. because unlike vitest’s assertions, the node ones do type narrowing correctly.


I moved back to Mocha with Chai for a while (both have great ESM support, quietly still well maintained, despite predating Jest) and then to Node's built-in test harness (and Deno's), sometimes still using Chai rather than `node:assert/strict` or `jsr:@std/assert`.

But I wasn't using a lot of Jest features anyway, generally preferred Mocha even during the height of Jest's popularity, and Node's test runner is sufficient for most of my needs (and Deno's starts to seem more and more the path forward as I come to prefer deno.json in a lot more types of projects than package.json).


I've had a really good experience with vitest


There's always Mocha.


Vitest is what people have been suggesting to me.


They made it possible for Rich Harris to get paid while working on Svelte. Not sure what will happen in the future though.


It’s time to start moving away from React in general.

I cannot for the life of me understand why anyone would intentionally pick it in 2025 unless there were serious constraints that forced them to.


Being generally employable and paying your bills? I don't like Angular one iota but I'd assume people who still choose it have legit reasons. What do you think we should all be using?


After doing this for so many years, one thing I’ve really come to see as a fairly fundamental truth at this point is that the more you can align with the underlying platform you’re working with, generally the better.

And so, in that sense, the answer is web components. I know everyone hates the API, but it was intentionally designed as a low-level thing to build a developer experience on top of, and the best implementation of that right now is Lit. It’s also a pleasure to use and incredibly lightweight, and designed to become even more lightweight over time as new capabilities come to the web, like signals in JavaScript or the talk of native templating.
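
For a flavour of it, here is roughly what a minimal Lit component looks like in TypeScript (the `hello-badge` element name is made up for the example):

    import { LitElement, html, css } from 'lit';
    import { customElement, property } from 'lit/decorators.js';

    // A tiny custom element, usable as <hello-badge name="HN"></hello-badge>
    // in plain HTML or inside any framework, because it's just a web component.
    @customElement('hello-badge')
    export class HelloBadge extends LitElement {
      static styles = css`
        :host { display: inline-block; padding: 0.25rem 0.5rem; background: #eee; }
      `;

      // Reflected from the `name` attribute; changing it re-renders the element.
      @property() name = 'world';

      render() {
        return html`Hello, ${this.name}!`;
      }
    }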

There is another option that I really like for certain kinds of applications which is Flutter which might sound contradictory to my original point because it skips the idea of the DOM entirely and brings its own rendering engine to a canvas element.

But it’s set to be the first serious Wasm- and WebGPU-based UI library on the web, and it has no problem spitting out 120fps (and that’s before they have even added WebGPU support and their newest rendering engine to the mix, by the way), while keeping its rendering engine to, I think, about 1.5-2 MB in size. It also gets you a single codebase that will run literally anywhere as an AOT-compiled app.

That’s to say nothing of Dart itself; I cannot even begin to describe what a huge improvement it is over JavaScript and TypeScript. If you haven’t tried it yet, do yourself a favour.

I’m just making the point that JavaScript and the DOM are no longer the only players in town, and when you’re not trying to mix the abstractions of a document markup language with those of an application, a whole lot of problems just disappear, like “does this look different in different browsers” or “can I use this new feature”. It just works… everywhere… on the web, on a desktop, on a phone, or on an IoT device.


I thought Google had already killed Dart once? Or was it Flutter? I'm not really touching any of their “open-source” projects; fool me once, you know.

But my point stands. I haven't seen a single job posting in 2 years that isn't using React, Vue, Angular, or (a tiny bit) Svelte. So Web Components just aren't being widely used, it seems.


I’m not trying to be rude in any way here but literally everything you have posted here is incorrect. I don’t know where you’re getting your information from.

The only exception is the bit about job postings and React. I’m not making the argument, in any way, shape, or form, that that isn’t the overwhelming majority of jobs out there; but that wasn’t what I was talking about. I was saying it’s an actively bad choice for a project in 2025.


> I’m not trying to be rude in any way here but literally everything you have posted here is incorrect

I'm also not trying to be rude, but you said a lot about React and offered a myriad of Google solutions to fix it, most of which got downgraded by Google itself and are on their way to the famous Google graveyard. Telling us about a problem and that Google knows how to fix it is indeed rude.


I stand corrected if I was wrong. Following the news over the years, I had the impression that Dart (or Flutter) and Angular had been axed, with the decision later reversed.


You are not wrong: staff from the Dart, Flutter, and Lit teams got fired. What happened afterwards did not make the news.

  A Flutter and Firebase developer who goes by the name xeladu wrote that after a discussion with an unnamed “Google developer expert,” the developer wouldn’t recommend learning the language, but added it’s not time to port Flutter projects just yet.

  “I’d rather say no to be totally honest. Only do it if you just want to play around. Then it will be fine,” xeladu wrote. “But becoming a serious professional Flutter developer could probably be a waste of time.”
https://thenewstack.io/whats-next-for-flutter-after-layoffs-...

https://techcrunch.com/2024/05/01/google-lays-off-staff-from...


Yeah, just to be clear, none of that ever happened in real life.

I think you might be mixing some things up here: many years ago there was a version of Angular written in Dart, which still exists internally and powers AdWords but no longer exists publicly. Perhaps that is what you’re thinking of?


[flagged]


Until they dump your startup for no reason, and you have to migrate off their undocumented proprietary infrastructure. Please read the stories on HN. You're risking your business if you work with them.


"Monopolies aren't so bad, the company is treating me quite well" is not a good counter argument.


do you consider Vercel a monopoly?


lol this has got to be satire


Oh sweet summer child, you have no idea what is coming.

The second "happy to pay" becomes "we could hire a team of devs for what we pay vercel" you will realize you made a giant mistake.

Hopefully you will be suffering from success and have the runway to migrate away or just eat the costs, but many, many startups die due to impossible unit economics driven by the ridiculous hosting costs that platforms like Vercel lock you into.


I've been using Vercel for > 5 years. Suppose it's possible but that's true for any platform.


...no it isn't

If you host your non-nextjs SPA on Cloudflare and they do something you hate.. you move

Vercel has every incentive to lock everyone in as hard as possible and raise prices all they can. Not something I'd want to tie my business to


> "we could hire a team of devs for what we pay vercel"

It will be even worse when they see they need one, two, three more engineers to deal with the Next.js mess they created.


I've been using Next.js since 2017 (before dynamic routes were even a thing!).

The app router kinda sucks so we're staying with /pages. No complaints thus far with a team of 6 engineers.


this has got to be a shill


> Please don't post insinuations about astroturfing, shilling, brigading, foreign agents, and the like. It degrades discussion and is usually mistaken.


Also: please don't shill!


This is just silly, Don. HN has gotten so mired in negativity that someone who's a genuine fan of a product and dares to go against the narrative is now a "shill." Also: see the definition of shill. I'm just a happy user who runs a startup, and my life is easier because of their existence.


So there are:

- sloth on a tree
- UFO in a forest
- playground
- alien on a beach

What's the 5th thing you have to find?


There's one of the houses with a platform...


On top of other criticism here, I'd like to add that the article optimistically assumes that actors are completely honest with their benchmarks when billions of dollars and national security are at stake.

I'm only an "expert" in computer science and software engineering, and can say that - neither of widely available LLMs can produce answers at the level of first year CS student; - students using LLMs can easily be distingished by being wrong in all the ways a human would otherwise never be.

So to me it's not really a question of whether CS-related benchmarks are false; it's a question of how exactly this BS even flew.

Obviously, LLMs show a similar lack of performance in other disciplines, but I can't call myself an "expert" there, and someone might argue I tend to use the wrong prompts.

Until we see a website where we can submit an intermediate-level problem and get a working solution, "benchmarks show that our AI solves problems at gold-medalist level" will remain obvious BS.


Loved the art style, but the controls... uhh... let's say they were on par with a bad PC port of Sonic Adventure. I can barely control where I'm going, because the camera just rotates randomly. Something I'll never miss from the old days.


The controls are the only gripe I have with this amazing work.

If this were in Unity, I would address the issue by manually placing a bunch of virtual cameras in the world and using Cinemachine to blend between them. The size of this world is small enough to justify manual placement and configuration of each. You could also just focus on the complex areas and let the default follow cam handle the rest.


This really wants to be controlled by twin thumbsticks.


...or mouse for the camera.

But I kind of understand it. I did a somewhat similar project before, and for people who are not trained in video-game-style controls it is quite hard to get used to them ad hoc.

Assuming this project is at least partially aimed at art directors, project leads, and such, aka people who aren't necessarily gamers, detached movement/camera controls are a bit risky.


Mouse look was actually something I found myself wanting while wandering around.

There are a lot of cool scenic locations that almost beg for the ability to just stand somewhere and look around, yet you can't really look down or up very conveniently.

Also, when walking in locations where you might fall, it would be nice to be able to look at where you're aiming. A minor nit mostly; it would just fit the explore-a-scenic-island theme.


There is some level of mouse look. I suspect that, together with not locking the cursor (so the cursor leaves the window), this is part of why people report issues.


If there are errors in the implementation of general constructs, they tend to be visible at every use, and they get fixed rapidly.

Some general constructs are better than others because they have an algebraic theory behind them, and sometimes that theory has already been studied for a few hundred years.

For example, the product/coproduct types mentioned in the article are quite close to the addition and multiplication we all learned in school, and they obey the same laws.
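
To sketch the analogy in TypeScript (just counting possible values; nothing here beyond plain union and tuple types):

    // Coproduct ("sum" type): a value is either an A or a B,
    // so the counts add up: |A + B| = |A| + |B|.
    type Coin = 'heads' | 'tails';       // 2 possible values
    type Die  = 1 | 2 | 3 | 4 | 5 | 6;   // 6 possible values
    type CoinOrDie = Coin | Die;         // 2 + 6 = 8 possible values

    // Product type: a value holds an A and a B at once,
    // so the counts multiply: |A x B| = |A| * |B|.
    type CoinAndDie = [Coin, Die];       // 2 * 6 = 12 possible values

    // The school-algebra laws carry over, e.g. distributivity:
    //   A x (B + C)  is isomorphic to  (A x B) + (A x C)
    type Left  = [Coin, Die | boolean];
    type Right = [Coin, Die] | [Coin, boolean];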

So there are several levels where the choice of ad-hoc constructs is wrong, and in the end the only valid reason to choose them is time constraints.

If they had 24 years to figure out how to do it properly and didn't, the technology is just dead.


Hm, that's idealistic...

I've certainly run into cases where small changes in general systems led to hard-to-detect bugs, which took a great deal of investigation to figure out. Not all failures are catastrophic.

The technology is quite alive, which is why it hasn't been 'fixed' - changing the wheels on a moving car, and all that. The actual disappointment is that a better alternative hasn't taken off in the six years since this post was written... If it's so easy, where are the alternatives?


That's not idealistic; that's how the arithmetic works. The more often you use the same generic thing, the higher the chance of discovering it's broken. The fact that you've run into such cases only means that chance is never zero, which is irrelevant to the discussion.

As was already mentioned in the article, PB solves a problem that likely only Google has, if even that. The state of the art nowadays is JSON/JSONL. If it grows too large, gzip it.

When someone is using third-party closed proprietary technologies to be "not like the rest", it usually doesn't work that well for their business.

The technology is "alive" only until it follows the path of Closure, GWT, and the rest of the "we use it on the most loaded page in the world" technologies. PB will be in the same graveyard soon.


> might be my Linux setup being inefficient

Given that videos spin up those coolers, there is actually a problem with your GPU setup on Linux, and I expect there'd be an improvement if you managed to fix it.

Another thing is that Chrome on Linux tends to consume an exorbitant amount of power with all the background processes, inefficient rendering, and disk IO, so updating it to one of the latest versions and enabling "memory saving" might help a lot.

Switching to another scheduler, reducing the interrupt rate, etc. probably help too.

Linux on my current laptop reduced battery life by a factor of 12 compared to Windows, and a bunch of optimizations like these improved the situation to something like a factor of 6, i.e. it's still very bad.

> Is x86 just not able to keep up with the ARM architecture?

Yes and no. x86 is inherently inefficient, and most of the progress over the last two decades has been about offloading computations to more advanced and efficient coprocessors. That's how we got GPUs, and DMA on M.2 and Ethernet controllers.

That said, it's unlikely that x86 specifically is what wastes your battery. I would rather blame Linux; I suspect its CPU frequency/power drivers are misbehaving on some CPUs, and unfortunately I have no idea how to fix it.


> x86 is inherently inefficient

Nothing in x86 prohibits you from making an implementation as efficient as what you could do with ARM instead.

x86 and ARM have historically served very different markets. I think the pattern of efficiency differences of past implementations is better explained by market forces rather than ISA specifics.


x12 and x6 do not seem plausible. Something is very wrong.


These figures are very plausible. Most Linux distros are terribly inefficient by default.

Linux can actually meet or even exceed Windows's power efficiency, at least at some tasks, but it takes a lot of work to get there. I'd start with powertop and TLP.

As usual, the Arch wiki is a good place to find more information: https://wiki.archlinux.org/title/Power_management
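
As a rough sketch of the kind of knobs involved (the exact values here are guesses to experiment with on your own machine, not a recommendation):

    # One-off: apply powertop's suggested tunables for the current boot.
    sudo powertop --auto-tune

    # /etc/tlp.conf (excerpt) - persistent settings TLP applies on battery.
    CPU_SCALING_GOVERNOR_ON_BAT=powersave
    CPU_ENERGY_PERF_POLICY_ON_BAT=power
    PLATFORM_PROFILE_ON_BAT=low-power
    RUNTIME_PM_ON_BAT=auto
    WIFI_PWR_ON_BAT=on

    # Reapply after editing the config.
    sudo tlp start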


Those numbers would imply <1h runtime, or a >50W consumption at idle (for typical battery capacities). That's insane.

I've used Linux laptops since ~2007, and am well aware of the issues. 12x is well beyond normal.


At least on Thinkpads over the years, I've never seen anything remotely close to that either. I've had my Thinkpad x260 power draw down to 2.5 watts at idle, and around 4 or 5 watts with a browser and a few terminals open. That was back in 2018! With the hot-swappable battery on the back, I could go for 24 hours of active use without concern.


I get below 5W at idle (ff and emacs open, screen at indoor brightness, wifi on) on my gen11 framework. Going from 8 to 5 required some tinkering.

I don't think I ever saw 50W at all, even under load; they probably run an Ultra U1xxH, permanently turbo-boosted for some reason.

Given the level of tinkering (with schedulers and interrupt frequencies), it's likely self-imposed at this point, but you never know.


> would imply <1h runtime, or a >50W consumption at idle

That's the case.


My CPU is at over 5GHz, 1% load and 70C at the moment. That's in a "power-saving mode".

If nothing were wrong, it'd be at something like 1.5GHz with most of the cores unpowered.


Something is wrong with the power governor then. I have the opposite experience: I was able to tune Linux on a Core Ultra 155H laptop so that it lasts longer than the Windows install. I needed kernel 6.11+ and TLP [0] with pretty aggressive energy-saving settings. I also played a bit with Intel LPMD [1] but did not notice much improvement.

[0] https://github.com/linrunner/TLP

[1] https://github.com/intel/intel-lpmd


I also own a 155H laptop running Linux Mint! Would you share your settings for TLP and LPMD? I am not getting much longer battery life than Windows 11 on it after some tinkering, so seeing somebody else's setup may help a lot. Thanks!


I won't say I got much longer battery life, and even what I got may just as well be explained as "TLP made energy profile management almost as good as on Windows, and then Windows's tendency to accumulate a bunch of junk processes sipping on your battery tipped the scales in favor of Linux". Also, I ended up switching back to Windows because of never-ending hardware issues with Linux; installing it on the 155H back in February 2024 was especially rough, but even 6 months later I randomly got Bluetooth not working anymore after an Ubuntu update.

My TLP and LPMD configs: https://gist.github.com/vient/f8448d56c1191bf6280122e7389fc1...

TLP: I don't remember the details now; as I recall, the scaling governor does not do anything on modern CPUs when the energy perf policy is used. CPU_MAX_PERF_ON_BAT=30 seems to be crucial for battery savings, sacrificing performance (not too much for everyday use, really) for joules in the battery. CPU_HWP_DYN_BOOST_ON_BAT=0 further prohibits using turbo on battery, just in case.

LPMD: again, I did not use it much in the end, so I'm not sure what is even written in this config. It may need additional care to run alongside TLP.

Also, I used these boot parameters. For performance, I think the beneficial ones are *mitigations, nohz_full, and rcu*:

    quiet splash sysrq_always_enabled=1 mitigations=off i915.mitigations=off transparent_hugepage=always iommu=pt intel_iommu=on nohz_full=all rcu_nocbs=all rcutree.enable_rcu_lazy=1 rcupdate.rcu_expedited=1 cryptomgr.notests no_timer_check noreplace-smp page_alloc.shuffle=1 tsc=reliable


What is the laptop, and what's it doing?


What p-state driver are you using?

