Really sucks to see this happen! Been using Tailwind for the past few years now.

All the more reason to go closed source. Except for a few really vital components that have national security implications (OS/kernel, drivers, programming languages), which can be funded and supported by universities, governments, etc., I am of the strong opinion that everything else should go closed source.

Enough with this BS. Stop feeding the slop.


People with that perspective shouldn't have been doing open source in the first place. AI isn't hurting people sharing things, only people who are pretending to share but actually indirectly selling things.

There is no one in this world who will do things purely for altruistic purposes. Even if not for money, it would be for something intangible that gratifies the Self (fame, for example).

I can't find a single example of a software developer who has put out software purely for some altruistic purpose without any returns on that investment (direct or indirect).

Building a sustainable business model was a great way to justify open source. Not anymore.


> There is no one in this world who will do things purely for altruistic purposes. Even if not for money, it would be for something intangible that gratifies the Self (fame, for example).

And pretty much none of that is threatened by AI. LLMs learning from code or articles found online are at worst neutral to this, and in many ways beneficial. They're only negative if you're using your contribution as bait to hold your audience, in lieu of a more honest approach of openly selling it to them.

> Building a sustainable business model was a great way to justify open source. Not anymore.

Or poison it. Open Source as a business model was always an anti-competitive hack, similar to "free with ads": drive price down to 0, destroying any other way of making money in this space, including "honest exchange of money for value".


> this looks too close to money laundering, just like buying art

Yep. Concur with this conclusion. It is getting really ridiculous now. No way most of these companies are worth the valuations they're at.

Or the investors are just plain stupid.


> Roll up your sleeves to not fall behind

This confirms the AI bubble for me, and that it is now entirely FUD-driven. "Not fall behind" should only apply to technologies where you have to put in active effort to learn, as it requires years to hone and master the craft. AI is supposed to remove this "active effort" part so as to get you up to speed with the latest and bridge the gap between those "who know" and those "who do not". The fact that you need to say "roll up your sleeves to not fall behind" confirms we are not in that situation yet.

In other words, it is the same old learning curve that everyone has to cross, EXCEPT this time it is probabilistic instead of linear/exponential. It is quite literally a slightly-better-than-a-coin-toss situation when it comes to whether you learn the right way or not.

For me personally, we are truly in that zone of zero active effort and total replacement when AI can hit 100% on ALL METRICS consistently, every single time, even on fresh datasets with challenging questions NOT SEEN/TRAINED ON by the model. Even better if it can come up with novel discoveries to remove any doubts. The chances of achieving that with current tech are 0%.


Game development is STILL a highly underrated field. Plenty of advancements/optimizations (in both software and hardware) can be directly traced back to game development. Hopefully, with RAM prices shooting up the way they are, we go back to keeping optimization front and center and reduce all the bloat that has accumulated industry-wide.


A number of my tricks are stolen from game devs and applied to boring software. Most notably, resource budgets for each task. You can’t make a whole system fast if you’re spending 20% of your reasonable execution time on one moderately useful aspect of the overall operation.
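A minimal sketch of the budget idea in Python (the stage names and millisecond numbers are made up for illustration, not from any particular system): each stage gets a time budget, and you hear about it the moment one stage eats more than its share.

    import time
    from contextlib import contextmanager

    # Hypothetical per-stage budgets in milliseconds -- illustrative numbers only.
    BUDGETS_MS = {
        "parse_input": 2.0,
        "query_db": 10.0,
        "render_response": 5.0,
    }

    @contextmanager
    def budgeted(stage):
        """Time a stage and complain when it blows its budget."""
        start = time.perf_counter()
        try:
            yield
        finally:
            elapsed_ms = (time.perf_counter() - start) * 1000
            budget = BUDGETS_MS.get(stage)
            if budget is not None and elapsed_ms > budget:
                print(f"[budget] {stage}: {elapsed_ms:.2f} ms (budget {budget} ms)")

    # Usage: this stand-in for real work trips the 10 ms budget.
    with budgeted("query_db"):
        time.sleep(0.02)

The point is that each moderately useful aspect has to justify its slice up front, instead of the whole system discovering it is slow after the fact.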


I think one could even say gaming as a sector single-handedly moved most of the personal computing platform forward since the '80s and '90s. Before that it was probably military and corporate. From the DOS era: overclocking CPUs to push benchmarks, DOOM, 3D graphics APIs from 3dfx Glide to DirectX, faster HDDs for faster game load times. And for 10-15 years it was gaming that carried CUDA forward.


Yes please! Stop making me download 100+ GB patches!


The large file sizes are not because of bloat per se...

It's a technique which supposedly helped at one point in time to reduce loading times, Helldivers being the most notable example of removing this "optimization".

However, this is by design - specifically as an optimization. Can't really call that bloat in the parent's context of inefficient resource usage.


This was the reason in Helldivers; other games have different reasons, like uncompressed audio (which IIRC was the reason for the CoD install-size drama a couple of years back). The underlying reason is always the same though: the dev team not caring about asset size (or more likely: they would like to take care of it but are drowned in higher-priority tasks).


We aren't talking about the initial downloads though. We are talking about updates. I am like 80% sure you should be able to send what changed without sending the whole game as if you were downloading it for the first time.


Helldivers' engine does have that capability, where bundle patches only include modified files and markers for deleted files. However, the problem with that, and likely the reason Arrowhead doesn't use it, is the lack of a process on the target device to stitch them together. Instead, patch files just sit next to the original file. So the trade-off for smaller downloads is a continuously increasing size on disk.
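For what it's worth, the missing "stitching" step is conceptually simple. A rough sketch (the patch layout with a deleted.txt manifest is invented for illustration, not the engine's actual format):

    import shutil
    from pathlib import Path

    def stitch_patch(base_dir, patch_dir):
        """Merge a patch into the installed files in place.

        Assumed (illustrative) patch layout:
          - modified/new files mirror the install tree under patch_dir
          - a 'deleted.txt' manifest lists relative paths to remove
        """
        base, patch = Path(base_dir), Path(patch_dir)

        # Remove files the patch marks as deleted.
        manifest = patch / "deleted.txt"
        if manifest.exists():
            for rel in manifest.read_text().splitlines():
                rel = rel.strip()
                if not rel:
                    continue
                target = base / rel
                if target.is_file():
                    target.unlink()

        # Overwrite (or add) every file shipped in the patch.
        for src in patch.rglob("*"):
            if src.is_file() and src != manifest:
                dest = base / src.relative_to(patch)
                dest.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dest)

Doing this at patch time keeps on-disk size flat, at the cost of rewriting the touched files during the update instead of just dropping patch files next to them.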


Generally "small patches" and "well-compressed assets" are on either end of a trade-off spectrum.

More compression means more change amplification and less delta-friendly changes.

More delta-friendly asset storage means storing assets in smaller units with less compression potential.

In theory, you could have the devs ship unpacked assets, then make the Steam client be responsible for packing after install, unpacking pre-patch, and then repacking game assets post-patch, but this basically gets you the worst of all worlds in terms of actual wall clock time to patch, and it'd be heavily constraining for developers.
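A toy demonstration of the change-amplification end of that spectrum (fake repetitive "assets", with zlib standing in for whatever packer a real engine uses): flip one byte in one asset and compare how much of the compressed output survives, monolithic pack vs per-asset compression.

    import os
    import zlib

    # Fake "assets": repetitive blobs so they actually compress (illustrative only).
    assets = [os.urandom(64) * 1024 for _ in range(4)]
    patched = list(assets)
    patched[0] = bytes([assets[0][0] ^ 0xFF]) + assets[0][1:]  # flip one byte

    def common_prefix(a, b):
        n = 0
        for x, y in zip(a, b):
            if x != y:
                break
            n += 1
        return n

    # One monolithic compressed pack: a 1-byte change near the start means the
    # compressed stream diverges almost immediately, so a naive delta is huge.
    pack_old = zlib.compress(b"".join(assets), 9)
    pack_new = zlib.compress(b"".join(patched), 9)
    print("monolithic: unchanged prefix", common_prefix(pack_old, pack_new),
          "of", len(pack_new), "bytes")

    # Per-asset compression: only the touched asset's blob changes, at the cost
    # of a (usually) worse overall ratio -- the trade-off described above.
    per_old = [zlib.compress(a, 9) for a in assets]
    per_new = [zlib.compress(a, 9) for a in patched]
    print("per-asset: changed blobs",
          [i for i, (o, n) in enumerate(zip(per_old, per_new)) if o != n])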


From my understanding of the technique, you're wrong despite being 80% sure ;)

Any changes to the code or textures will need the same preprocessing done. The large patch size is basically 1% changes + 99% of the preprocessed data for this optimization.


How about incorporating postprocessing into the update procedure instead of preprocessing?


Do you have some resource for people outside this field to understand what it's about?


It goes all the way back to tapes, was still important for CDs, and is still thought relevant for HDDs.

Basically you can get much better read performance if you can read everything sequentially, and you want to avoid random access at all costs. So you can basically "hydrate" the loading patterns for each state, storing the bytes in the order they're loaded by the game. The only point where it makes things slower is once, at download/install.

Of course the whole exercise is pointless if the game is installed to an HDD only because of its bigger size and would otherwise be on an NVMe SSD... And with 2 TB NVMe drives still being affordable, it doesn't make as much sense anymore.
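Roughly, the install-time step looks like the sketch below (the trace format and function name are made up; real pipelines do this offline or in the installer): lay out each game state's assets contiguously in recorded load order, duplicating anything shared between states, so reads stay sequential.

    from pathlib import Path

    def pack_for_sequential_reads(asset_dir, load_traces, out_path):
        """Write a pack file laying assets out in recorded load order.

        load_traces maps a game state (e.g. "level_01") to the asset files it
        loads, in order. Assets used by several states are duplicated so each
        state's reads stay sequential -- the disk-space-for-seek-time trade-off
        described above. Returns an index of (asset, offset, size) per state.
        """
        asset_dir = Path(asset_dir)
        index = {}
        offset = 0
        with open(out_path, "wb") as pack:
            for state, names in load_traces.items():
                entries = []
                for name in names:
                    data = (asset_dir / name).read_bytes()
                    pack.write(data)
                    entries.append((name, offset, len(data)))
                    offset += len(data)
                index[state] = entries
        return index

    # Hypothetical usage -- note "ui.png" and "music.ogg" get duplicated:
    # pack_for_sequential_reads(
    #     "assets/",
    #     {"menu": ["ui.png", "music.ogg"],
    #      "level_01": ["ui.png", "terrain.bin", "music.ogg"]},
    #     "game.pack")

On patch day this is exactly what bites you: touch one shared asset and every state's contiguous copy of it has to be regenerated and re-shipped.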


So this basically leads to duplicating data for each state it's needed in? If that's the case, I wonder why this isn't solvable by compressing the update download data (potentially with knowledge of the data already installed, in case the update really only reshuffles it around).
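A sketch of the reuse idea described above, rsync/casync-style (fixed-size chunks for simplicity; real tools use rolling or content-defined chunking so insertions don't shift every chunk boundary):

    import hashlib

    CHUNK = 1 << 20  # 1 MiB chunks, fixed-size for simplicity

    def chunks(data):
        for i in range(0, len(data), CHUNK):
            yield data[i:i + CHUNK]

    def make_patch(installed, new_build):
        """Patch = only the chunks of the new build not already on disk."""
        have = {}  # content hash -> offset in the installed data
        for i, c in enumerate(chunks(installed)):
            have.setdefault(hashlib.sha256(c).hexdigest(), i * CHUNK)

        recipe, blobs = [], {}
        for c in chunks(new_build):
            h = hashlib.sha256(c).hexdigest()
            if h in have:
                recipe.append(("copy", have[h], len(c)))  # reuse installed bytes
            else:
                recipe.append(("send", h))                # must be downloaded
                blobs.setdefault(h, c)
        return recipe, blobs  # only 'blobs' plus the small recipe go over the wire

    def apply_patch(installed, recipe, blobs):
        out = bytearray()
        for op in recipe:
            if op[0] == "copy":
                _, off, length = op
                out += installed[off:off + length]
            else:
                out += blobs[op[1]]
        return bytes(out)

If the update really only reshuffles data, almost every chunk resolves to a "copy" and the download shrinks to roughly the recipe. The catch, as the comments above note, is that heavily compressed monolithic packs defeat this, because a small change leaves few matching chunks to reuse.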


It's also a valid consideration in the context of streaming games -- making sure that all resources for the first scene/chapter are downloaded first allows the player to begin playing while the rest of the resources are still downloading.


True!


Interesting, today I learned!




Another issue with Google Maps is that it doesn't show Plus Codes for some locations where it highlights the entire area. If you, however, place a pin on that location, it provides a Plus Code. Pretty stupid IMHO.

Also it is really, really hard to search for "Nearby" places. You have to do it through "Directions". Really bad UX.


You can put "... near <location>" at the end of your query to get nearby places. "... near me" also works


> And so everyone and their mother is building big error types. Well, not Everyone. A small handful of indomitable nerds still holds out against the standard.

The author is a fan of Asterix I see :)


Technology was supposed to get rid of most bureaucracy and move the world towards automation. These FAANG companies have instead successfully integrated bureaucracy with technology and have made bureaucracy permanent. Instead of automating away bureaucracy, these companies have automated away customer service.


It is a serious mistake to think that technology can remove bureaucracy. Indeed, technology by its nature makes bureaucracy a lot more rigid. Bureaucracy is about homogenising processes and erasing individual differences, and software reinforces these properties because it allows even less human input or deviation from the process. (That isn't true of all software, just software that is intended to somehow deal with large numbers of people uniformly.)


When I said remove bureaucracy, I meant remove bureaucracy from people's lives. Obviously it will exist behind the curtains. I agree with you that software reinforces bureaucratic properties, and it should. That is what it was supposed to do. But technology has failed when it comes to rectifying deviations in bureaucratic processes.

For example, assume you are submitting a form and the address is incorrect or doesn't exactly match what is stored in the database; software should (rightly) flag it and have a human review it and make the necessary correction. Instead we have the worst of both worlds, where the software flags the problem but there is no human in the loop anymore. Even the human is automated out. So the problem is never fixed. Instead, the customer/client interacting with the software is indirectly made aware of the internal bureaucratic process but has no recourse.


The lazy response to any new risk or problem is to just layer on new rules and processes. Large organizations always end up with those things defining their workplace culture (risk aversion, checkbox culture) and that worldview filters down to the decisions which impact customers.


They do these things in response to governmental pressure.


"Never be deceived that the rich will permit you to vote away their wealth." - Lucy Parsons


> How can sole maintainers work with multi-billion corporations without being taken advantage of?

Use AGPL, Fair Source or BSL. That's the only way forward. I for one will be using AGPL in everything. If a trillion dollar company cannot pay for services it is a fucking shame. Absolutely disgusting. Microsoft should be ashamed.


It is called the Gell-Mann Amnesia Effect.

Coined by author Michael Crichton:

“Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray's case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the "wet streets cause rain" stories. Paper's full of them.

In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.

That is the Gell-Mann Amnesia effect. I'd point out it does not operate in other arenas of life. In ordinary life, if somebody consistently exaggerates or lies to you, you soon discount everything they say. In court, there is the legal doctrine of falsus in uno, falsus in omnibus, which means untruthful in one part, untruthful in all. But when it comes to the media, we believe against evidence that it is probably worth our time to read other parts of the paper. When, in fact, it almost certainly isn't. The only possible explanation for our behavior is amnesia.”


I read the news to judge what the propagandists want me to know, not to actually learn things.

I find skimming NYT’s and FOX’s headlines perfectly satisfactory. If anything piques my interest, I do independent analysis, putting weight on primary sources. I often try to listen in on experts discussing the matter among themselves.


When do you have time to work?


It maybe consumes 1-3 hours a week to skim headlines. More research as needed, mostly focused on matters I have a direct interest in (interest as in money on the line), so it counts as work.

It’s small potatoes time-wise; most people spend way more on television and social media.


Fwiw, I listen to sports talk radio whenever I’m not in meetings or otherwise physically interacting with someone else at work.

The actual “news” stopped being productive to read, watch, or listen to when the term “pregnant chads” was coined. Been downhill ever since.

