pipo234's comments | Hacker News

Fair enough, but they also have a deeply embedded New Public Management culture.

> On top of that Patreon is a closed centralized platform that's bound to have issues like this and that's where I very much prefer using protocols (vs platforms) that enable the same. There are very similar solutions to Patreon, but based on nostr and related protocols.

The problem here isn't that Patreon is centralized, but that the app store is. Apple could easily require a cut from any app using nostr and related protocols. Or simply ban them altogether.

Not saying government mandates are ideal, but I don't see any other way to force some sense into Apple (or Google). App stores should be some sort of independent institutions (non-profits), but companies have no incentive to cede that revenue. Until that happens, best not to download from app stores unless absolutely necessary.


Many users seem to perceive apps as the blessed way to access the web. On a mobile device, though, they are mostly a way to organize symlinks or bookmarks. Except, of course, that a web browser does its best to protect the user while most apps don't.

Meanwhile I continue doing the Lord's work by telling kids that apps are not the internet. Hopefully, that 95% figure will eventually decrease.


It's not users who are pushing this. It started off with superfluous but optional apps duplicating websites. Now every year I find something I used to be able to do that now requires owning a smartphone. And it's not just getting discounts at coffee chains; it's increasingly things like accessing healthcare plan benefits or verifying my identity for banking.

A few sites throw up a blocking screen telling you to download the app, which disappears once you spoof a desktop UA. But the bigger problem is businesses having no web interface at all.


Very good point, though I believe it's both market push and consumer expectation.

Because we have such limited control over our devices, they effectively provide the security of a jail locking down what users can do. That is appealing from a healthcare or banking perspective because it obfuscates the client-server API and gives exact control over the UI. As a bonus, the coffee chain gets to glean lots of details from your phone that would be unavailable in a browser.

As individuals we can do little more than push back: don't let yourself be trapped by coffee chains (go to a different one) and bother your bank's service line about having to use their app. The rest is up to government intervention, I fear.


>Many users seem to perceive apps as the blessed way to access the web. On a mobile device, though, they are mostly a way to organize symlinks or bookmarks. Except, of course, that a web browser does its best to protect the user while most apps don't.

That is an education problem. What do school computer courses teach these days? Do schools even have computer literacy classes anymore? Do they still teach students about the internet?


The OS is what protects the user. Have you ever seen the prompts asking the user if they want to share their location?

This made me realize: Firefox should build a launcher that just creates PWAs out of bookmarks (or vice versa). That way, people get the "app feel" without needing to download every single app.

> However, if I were closer to the front end of my career, I'd certainly be looking to change, perhaps to technical writing.

It's tough for juniors. I recommend specializing in one or two systems aspects, like performance, reliability, or security: understanding its design, how to measure the aspect, how to reason about it, and knowing which levers to pull.


I'm hoping plain old software development will remain for the next decade or so; then I'll retire.

I understand some of the appeal of grpc, but resumable uploads and download offsets have long been part of plain HTTP (e.g. range requests, RFC 7233).
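For illustration, a minimal sketch of resuming an interrupted download with a Range request (the URL and file name are hypothetical, and it assumes the server honors byte ranges):

    # Resume a partial download using an HTTP Range request (RFC 7233).
    # URL and file name are hypothetical; assumes the server supports byte ranges.
    import os
    import requests

    url = "https://example.com/big-file.bin"
    path = "big-file.bin"

    offset = os.path.getsize(path) if os.path.exists(path) else 0
    headers = {"Range": f"bytes={offset}-"} if offset else {}

    with requests.get(url, headers=headers, stream=True) as resp:
        resp.raise_for_status()
        # 206 Partial Content means the server honored the range;
        # a plain 200 means it ignored it, so start the file over.
        mode = "ab" if resp.status_code == 206 else "wb"
        with open(path, mode) as f:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                f.write(chunk)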

Relying on HTTP has the advantage that you can leverage commodity infrastructure like caching proxies and CDNs.

Why push protobuf over http when all you need is present in http already?


Because you may already have robust and sensible gRPC infrastructure set up and working, and setting up the correct HTTP infrastructure to take advantage of all the benefits that plain old HTTP provides may not be worth it.

If moving big files around is a major part of the system you're building, then it's worth the effort. But if you're only occasionally moving big files around, then reusing your existing gRPC infrastructure is likely preferable. It keeps your systems nice and uniform, which makes them easier to understand later once you've forgotten what you originally implemented.


Simplicity makes sense, of course. I just hadn't considered a grpc-only world. But I guess that makes sense in today's Kubernetes/node/python/llm world where grpc is the glue that once was SOAP (or even CORBA).

Still, stateful protocols have a tendency to bite when you scale up. HTTP is specifically designed to be stateless, and you get scalability essentially for free as long as you stick with plain GET requests...


gRPC runs over http. What infra would be missing?

If you happen to be on ASP.NET or Spring Boot, it's some boilerplate to stand up plain HTTP and gRPC endpoints side by side, but I guess you could be running something more exotic than that.
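As a rough sketch of what "side by side" can look like outside those stacks (Python with grpcio and the stdlib here; the service and method names are made up, and no generated protobuf stubs are used), both can be served from one process:

    # Minimal sketch: a gRPC endpoint and a plain HTTP endpoint in one process.
    import threading
    from concurrent import futures
    from http.server import BaseHTTPRequestHandler, HTTPServer

    import grpc

    def echo(request, context):
        # Hypothetical unary handler: echoes the raw request bytes back.
        return request

    grpc_server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
    grpc_server.add_generic_rpc_handlers((
        grpc.method_handlers_generic_handler(
            "demo.FileService",  # made-up service name
            {"Echo": grpc.unary_unary_rpc_method_handler(
                echo,
                request_deserializer=lambda b: b,  # pass raw bytes through
                response_serializer=lambda b: b,
            )},
        ),
    ))
    grpc_server.add_insecure_port("[::]:50051")

    class PlainHttp(BaseHTTPRequestHandler):
        def do_GET(self):
            # Plain HTTP endpoint served alongside the gRPC port.
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok\n")

    grpc_server.start()
    threading.Thread(
        target=HTTPServer(("", 8080), PlainHttp).serve_forever, daemon=True
    ).start()
    grpc_server.wait_for_termination()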


http/2 is nothing like http/1

feel free to put them both behind load balancers and see how you go


this.

also, http/s compatibility falls off in the long tail of functionality. i've seen cache layers fail to properly implement restartable http.

that said, making long transfers actually restartable, robust and reliable is a lot more work than is presented here.


I see that QUIC file transfer protocols are available, including Microsoft's SMB over QUIC implementation.

These would be the ultimate in resumability and mobility between networks, assuming that they exploit the protocol to the fullest.


The evolving schema is much more attractive than a bunch of plain text HTTP headers when you want to communicate additional metadata with the file download/upload.

For example, there is common metadata such as the digest (hash) of the blob, the compression algorithm, the base compression dictionary, whether Reed-Solomon coding is applicable, etc.
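For comparison, a sketch of how roughly the same metadata might be crammed into headers on a plain HTTP upload (Python; Content-Digest and Content-Encoding are real headers, while the X-* names are made up ad-hoc inventions, which is sort of the point):

    # The same metadata as plain HTTP headers (illustrative sketch only).
    # Content-Digest (RFC 9530) and Content-Encoding are standard headers;
    # the X-* names below are hypothetical.
    import requests

    headers = {
        "Content-Digest": "sha-256=:<base64-of-the-hash>:",
        "Content-Encoding": "zstd",
        "X-Compression-Dictionary": "dict-v3",  # made-up header
        "X-Reed-Solomon": "10+4",               # made-up header
    }

    with open("big-file.bin", "rb") as f:
        requests.put("https://example.com/blobs/big-file.bin",
                     data=f, headers=headers)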

And like others have pointed out, having existing gRPC infrastructure in place definitely makes using it a lot easier.

But yeah, it's a tradeoff.


Would like to point out that professional translation has been under pressure for much longer than AI has existed.

I have friends who made a decent buck 20-30 years ago translating technical documents like car manuals. Over the years, prices fell from quarters per word to fractions of a cent.

And even though machine translation barely existed, tools were used to argue for higher productivity and therefore lower prices.


This. Google's move to a neural translation system about a decade ago, along with DeepL, put a serious dent in translation work - particularly gigs that don't require the extremely rigorous accuracy of legal or medical material.

Several of my friends who used to work full-time as translators are now supplementing their income with side jobs like foreign language teaching, proofreading, and similar work.


It is a bit of a race to the bottom for library components: either you open-source it and it gets snatched up by LLM parties, or you keep it closed and good luck selling your wares.

On top of that, the open source market will increasingly be flooded with (well-intended) AI slop built by junior devs.


Pedantic: I think you meant to say the Open Whisper protocol, the end-to-end encryption protocol that WhatsApp copied from Signal.

The name of the protocol is "Signal".

I don't think his rant is against social networks or instant messaging per se, but about vendor lock-in.

The way I read it is along the lines of Mike Masnick's protocols not platforms.

https://knightcolumbia.org/content/protocols-not-platforms-a...


I understand, but in this specific arena, because of the network effect, interoperability is important if you hope to make a competitive product.

More generally, standard protocols are important, but they don't necessarily avoid lock-in.

For example, imagine a Dropbox equivalent with a public API specification.

At some point you want to leave. You are ready to use Postman or even curl to download everything and upload it somewhere else... but downloads are capped at 10 files/day per user. And you uploaded 100,000 files over the years.

The API is public but good luck leaving with all your files!

In other words, standard protocols help avoid client-side lock-in, but when the value is on the server side (the data, ...), they are not enough.

