Hacker News | echopom's comments

> Even if there was humiliation, it was humiliation with a $400M+ payout to Adam Neumann. Softbank set up a really bad example in the industry:

AFAIK it has been this way for at least a decade.

An example would be Docker: it has raised more than $200M yet never had a decent revenue stream. It's literally a dead man walking, yet the CEO left the boat years ago with tens of millions...

Startups that failed and had their founders go "bankrupt" are startups you never, ever hear about... The rest, the ones you'll find on HN or that have raised $100M+, often have founders who pocket millions when they raise a very large round ($50M+)...

There's no surprise here: once you manage to get a business to a certain valuation / run-rate, it's worth a lot, so you can trade that for cash regardless of the "humiliation".


Hey Patrick,

> As a philosophical matter, we consider ourselves to serve the business, which means that limiting access to what we consider to be the business's own information feels a bit strange.

Maybe I'm wrong, but once a customer uploads documents to Stripe Identity, they are supposed to be YOUR documents.

I worked in Banking as a Service. Fundamentally, when a customer goes through a verification process, the uploaded documents are not owned by the partner using our APIs. They are owned by us, the Bank.

The same should apply to Stripe Identity. The goal here is not to "lock in the partner" but rather to protect them.

Now that Discord has access to my passport, in case of identity theft could you tell me EXACTLY who is liable for the leak under the law?

With BaaS it's pretty clear: the Bank carries the responsibility to keep those documents safe, so it's safer not to give an ordinary business access to the raw details.

With the current API design you are offering, it's more ambiguous and more prone to a very large leak from a business's information systems, like Discord's or Uber's.

Those leaks will happen.


> Now that Discord has access to my passport, in case of identity theft could you tell me EXACTLY who is liable for the leak under the law?

Discord only has access to your passport if you upload it to them. They don't have access to it by virtue of them being a Stripe customer.


It's unfortunate. I'm an Enterprise Architect in banking, and honestly I wouldn't have let that feature go into production.

Businesses that do not have a legitimate reason to view my sensitive documents, like a passport, should not be allowed to do so.

Only authorized institutions (licensed payment institutions, banks, insurers, etc.) should be allowed to do so, and only AFTER they've been approved.

It's sad because you can tell right away that this will be abused, even inadvertently, by Stripe's customers. Just like Uber's "God View" that let you view any customer's ride...

Pretty sure the number of identity-theft and privacy scandals is going to explode with such technology available to everyone.

I don't know how a product manager at Stripe could tell himself "yes, it makes sense to give access to sensitive documents" in an age when people are seeking more privacy.


> Businesses that do not have a legitimate reason to view my sensitive documents, like a passport, should not be allowed to do so.

I get the parent comment's totally legitimate security concerns. And businesses that have no business having my identity should surely not be asking for it. But I honestly don't understand how this has anything to do with Stripe. These businesses (which for whatever reason are asking for ID verification before doing business with you) are just using Stripe's API to verify identity instead of just taking your info themselves.

Any customer giving their information presumably knows they are giving said business their identity documents; the customers might not even know that the business is using Stripe's API.

Furthermore, Stripe is ostensibly coming in here to streamline the process for businesses taking identity info from customers. Why, in your opinion, is it worse for consumers when these types of businesses (which ask for identity) use their own rolled ID verification rather than Stripe's?


> Why, in your opinion, is it worse for consumers when these types of businesses (which ask for identity) use their own rolled ID verification rather than Stripe's?

The point isn't so much using a third party; we use a third party on-prem ourselves.

My point is very simple: why on earth would you let Discord view my passport? JUST WHY?!

Those documents are very sensitive and no one should have access to them unless they have a VERY good reason to do so. PCI DSS treats card information like hot lava; the same model should have applied here.

Stripe should have acted as a "trusted party," securely storing those documents without giving access to them, and instead just letting you extract information from them.

That way you would have been able to have a uniquely identified user, backed by government ID, but without access to the documents themselves, and sensitive data would have been redacted... just like card numbers...

Again, unless you are a fintech / financial institution with a VALID, in-effect license, you should not have access to those documents.
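A rough sketch of the "trusted party" model described above: the verifier hands back derived attributes only, never the raw document. Every field name here is hypothetical, invented for illustration; none of them come from Stripe's actual API.

```javascript
// Hypothetical verification result in the "trusted party" model.
// The shape is the point: derived facts only, no raw document data.
const verificationResult = {
  sessionId: 'vs_123',
  status: 'verified',        // the outcome is all most businesses need
  documentType: 'passport',
  countryOfIssue: 'FR',
  ageOver18: true,           // derived attribute instead of date of birth
  // deliberately absent: document images, document number, MRZ data
};

// The consuming business only checks the outcome.
const canOnboard = verificationResult.status === 'verified';
console.log(canOnboard); // true
```

This mirrors how card data works under PCI DSS: the merchant sees an authorization result and a token, not the PAN.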


I totally agree. Businesses should not legally be allowed to access more information than they need. Like why do hospitals ask for my Social Security number? I know I can refuse it, but if they really don't need it shouldn't it be illegal for them to needlessly probe my identity?

And the list goes on...


If you've ever been carded at a bar/liquor store in a foreign country, then that random small business has seen your passport, no? How do you feel about that?


Being human to human, unless they're wearing tech that would allow them to scan/archive it, normally they just verify (eyeball it) and you get it back.

Here, with this system, they could verify and keep the data regardless of what I think is going on.


If you can't assume that a website you upload a scan of your ID to isn't capturing details about it, then you can't assume that a bouncer checking your ID isn't wearing a surreptitious HMD, no? In both cases, you're submitting your PII to an unknown process that seems like it should be safe, but with no previous experience or brand-image there to tell you whether there's actually any proof that it's safe.


That's a silly stretch. It's vastly more likely that a website fetching copies of a passport image is leaking copies or leaving the files where it shouldn't by accident and has the data exfiltrated by third party identity thieves, compared with a bouncer having a secret scan-quality camera installed by identity thieves without the bouncer noticing.


Who said anything about the bouncer not noticing? I'm presuming that the bouncer is the identity thief. If you're looking to make money as an identity thief, being a bouncer is the perfect job!

There was a story on Reddit a few months back, about a bouncer who, when handed real ID cards, claimed they were fakes, and proceeded to immediately "cut them up" (so that people didn't feel any need to demand them back, since what are you going to do with scraps of an ID card?) The bouncer was actually palming the real ID and cutting up a random piece of plastic instead, and then later handing the real ID card off to the owner, who sold them on the black market. One victim of this scheme figured it out after being a victim of identity theft, as they traced back a submitted capture of the photo ID that some third-party had retained, to the one that got "cut up." The police raided the establishment, and a whole ring of people were caught up in it. It was a whole thing.

There's nothing that leads me to believe that this isn't a simple, obvious, repeatable, low-stakes, high-margin criminal business model. As such, it probably happens a lot.


Wow, that's impressive.

I would still assume identity theft via websites being hacked is a lot more common, and likelihood is an appropriate factor when evaluating protective actions. But you make a good point about the bouncer.


Presumably they aren’t taking photographs of the passport and viewing them at some later date from personal computers.


In the EU, you don't hand over your ID/passport like a credit card in the US. You show it while keeping it in your hand. The second party can verify your age while being unable to copy things like the machine-readable zone.


You seem to be contradicting yourself. Businesses are asking for Stripe to verify identity. These businesses just need verification, not copies of documents, but Stripe makes them available anyway. That's the whole contention.

As a consumer, I would expect Stripe would do the verification and give the business partner the result, but not all the data they used to get the results themselves.


I actually disagree with this as well. The Hacker News user is not the average user. The average user has no idea what Stripe is; they assume that the business requesting a verification will have access to anything they submit.

I know this because we use Stripe Identity ourselves (in beta) and users have no idea that we and Stripe are different companies.


> users have no idea that we and Stripe are different companies.

Doesn't that imply that if there's a security breach at Stripe, your users will blame you [too]?


That seems right. Businesses aren't islands; they work with other businesses to provide their services. But if you as a business have an issue with a vendor/supplier, that's still on you. If McDonald's can't get fries, I don't blame farmer X for a failed harvest; I blame McDonald's for a fragile supply chain.


We should figure out who McDonalds' ice cream machine maker is and ask them why their product keeps breaking down.



As a person who is still trying to recover from identity fraud that happened many years ago, I am always very wary of companies that demand ID papers. Most of the time I will avoid them.

Most companies aren't even supposed to ask for identity papers. Is Stripe verifying with the passport issuer whether the country even allows giving their passport to some entity?

I think there should be some sort of consent system built in, where when the API consumer wants to download a passport, the customer gets an email asking whether they consent to them fetching a copy.


But, also as an Enterprise Architect in Banking: if you were considering Stripe Identity, wouldn't you rely on it for KYC compliance? You can't just say "oh, we outsource that to a third party called Stripe," can you?


That's not my point; my point here is very clear and straightforward.

Some people at Discord now have access to the pictures of my passport that I uploaded during the verification process, because they use Stripe Identity.

The FAQ is very clear: Stripe gives you full access to those documents. It should NEVER do so.

Now that the very smart people at Discord have access to my passport, they can take out a $50K loan using my documents, face-check video, social security number, and some fake income documents.

They could also destroy my entire life because I maintain a political blog with views they don't really like and consider "hate speech". These are exaggerated examples, but you get the idea.

I'm concerned by this because more and more startups are going to use it to increase the value of their user base, reduce fraud, and look more attractive for their planned exit.

In the meantime, the number of people with access to my personal documents is going to grow exponentially...

Again, I'm an architect in banking. We have 500+ partners selling loans for us, and they NEVER have access to your documents / personal data. They can only tell whether the documents have been approved, see an income range, and get some basic information. You don't know what they are going to do with those sensitive documents / info, even if you have a contractual agreement with them.

The banking industry has had a very simple rule that everyone has been following for decades: DON'T TRUST THIRD PARTY. Stripe has decided to do otherwise, I guess, and I'm pretty scared about it.

Stripe Identity seems like Identity Theft as a Service.


> DON'T TRUST THIRD PARTY

This is a good policy when ALL first parties meet a certain (regulatory) bar. For banks, I assume that bar is "don't become insolvent" and more recently "don't lend money to terrorists."

The problem is that, as we've seen from the countless hacks in recent years, the first parties are NOT all meeting the bar when it comes to security, namely "don't leak (or abuse) users' private personal info."

And that's unfortunate, because a lot of the time, all a company really needs to know is a "does the registered account correspond (uniquely) to a real human (with certain legal characteristics)." Sometimes they need to know for compliance reasons ("our users are adults" or "aren't terrorists") and other times for uniqueness/fraud reasons ("We want to reduce spam accounts" or "we're paying users $10 to sign up and so need to make sure users aren't signing up multiple times.") It'd be great to be able to answer those questions without having to protect all that personal data that goes into answering it, similar to credit cards.

But your main point stands: if Stripe is allowing companies access to the collected data, then from a security point of view it's little better than having the companies collect and store it themselves. Hopefully Stripe explains their reasoning, or even better, course-corrects early in this launch.


I know it's not your point, but it's mine.

Why would you upload a copy of your passport to Discord, via a third-party or not? The issue here is just trusting people you shouldn't be trusting with things you shouldn't be trusting them with.

The alternative isn't that WhizzBangApp doesn't request you upload documents; the alternative is that they roll their own WhizzBang ID service, or use a Stripe Identity competitor.

I know my bank needs to verify my driving licence or whatever, and I tr.. well banks are heavily regulated anyway, so I'm happy to upload it without caring whether they use Stripe Identity or their own or whatever.

I know Discord has no business with my passport or whatever, so they're not getting it whatever they use under the hood.


It is entirely fair to have to provide KYC documents for a service you need or desire to use, but the usage of those digital artifacts should be governed and access to them limited.

I let my Congressperson know policy is needed about online identity service providers needing better governance over identity data, as businesses aren’t going to do it voluntarily unless the law requires. This should probably be overseen by the CFPB, even though identity is a bit of a walk from finance (while Stripe is still primarily a financial services provider).


My take is that if you need it, Stripe will be better and more secure than rolling your own


More data concentration makes for a more worthwhile target, thus wiping out at least some of the potential upside. The net effect may very well be negative.

Given the regular stream of extremely large data leaks, even from providers who should have the size, motivation and competency to protect that data, I find it incredibly hard to believe anyone who tries to assure me that they won't be breached.


> is lobbying the US govt to spend 250 million to make more software engineers

It costs money to "train" engineers; better to use taxpayer money to train those damn "human resources" rather than those $300 billion in profits.


Still not tempted by Deno to be honest.

All the problems that currently exist in Node are being ported to Deno straight up.

The built-in APIs provided by Node.js are almost non-existent... and most of those that do exist are buggy and quite tedious to use, which is why the community has created thousands of packages to resolve those issues.

I don't see how Deno is solving this; all the APIs again seem so barebones... instead of having 1000+ dependencies from NPM, you'll have 1000+ dependencies from remote URLs, with everything set to "read/write" because they need to read one file from your ".env" folder or perform an arbitrary post-install process...

The way Ryan managed Node, and how the ecosystem turned to chaos because of his lack of vision and strategy, makes me not want to try any of his tech again. Ryan is the kind of guy who gets obsessed with ONE THING and goes berserk on that topic for 5 years until he overdoses and quits abruptly.

I don't think that's how you manage a language. When I look at Zig, I'm way more confident in what's being done than in the current state of Deno...

Node.js is one of my main languages, but the ecosystem around it is an absolute disaster.


It might catch on, it might not. Not everyone has to like it.

I personally do like it a lot. I think of it as Node.JS with a better-organized core (with the benefit of hindsight), use of browser APIs whenever possible, and built-in TypeScript. I think it might catch on once we have some mature MySQL, Express.js, etc. libraries.

I know seeing popular tools be rewritten from scratch is tiresome, but I don't think it's unreasonable in this case given that Node.JS and Deno mostly get their JS implementation from a separate program: V8. In that sense, Deno isn't throwing all of Node.JS away. It's just a different attempt to make V8 a command line tool.

And of course, competition is good. Maybe Typescript will become more convenient in Node because of Deno.

Additionally: as someone who uses Linux in their day to day job, I think it's a phenomenal scripting tool and replacement / supplement for Perl / Python. I mentioned this in a comment here the other day, but with this short wrapper, you can execute a bunch of SSH commands simultaneously using the Promise.all JS function (familiar to web devs). Just an example of a cool thing you can do with Deno scripting. https://github.com/gpasq/deno-exec


> use of browser APIs whenever possible

Correct me if I’m wrong, but isn’t Node.js aligning more and more with the browser APIs? For example, if you `import { URL } from 'url'` you get the WHATWG-standard URL object (it is also available as a global object). Node.js now has EventTarget and event listeners aligned with the DOM Event API. `crypto` is now a global object with the same API as the Web Crypto API. You have ArrayBuffer and Blob in Node.js just like in the browser.
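For what it's worth, the URL alignment is easy to check in any recent Node.js (the WHATWG URL class has been a global since Node 10, so no import is needed):

```javascript
// The WHATWG URL class behaves identically in Node and in the browser.
const u = new URL('https://example.com/docs?page=2#intro');

console.log(u.hostname);                 // "example.com"
console.log(u.pathname);                 // "/docs"
console.log(u.searchParams.get('page')); // "2"
console.log(u.hash);                     // "#intro"
```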

What is it that Deno is doing differently than Node here?


fetch() is a notable web feature missing from Node.js core


add WebSockets to that list.


Like I said, it is aligning more and more:

* https://github.com/nodejs/node/issues/19393

* https://github.com/nodejs/node/issues/19308

Web APIs are not something that Deno is doing and Node isn’t. It’s more that Deno has done a few of them that Node hasn’t yet.


It also looks like Node.js is going to get import maps.


> I think it might catch on once we have some mature MySQL, Express.js, etc. libraries.

This resonates a lot with me. I first wrote nodejs apps not because of nodejs but rather because of Express. I could build a simple app very quickly, wire it up to a database, use Passport to secure it and call it a day. It was the libraries that drew me in.


I don't get the "built in Typescript" argument. Everything is running on V8 at the end of the day. With Node you run a one-line compilation step; Deno does the same thing, just under the covers.


Maintainability I suppose.


Totally this, and not just for basic shell tasks but also as a scripting tool for CI/CD pipelines.


> you can execute a bunch of SSH commands simultaneously using the Promise.all JS function

It's just as easy in Python 3.5.

Plus async semaphore for resource limiting

And cancel remaining operations when others fail.

Python continues to be the ideal language for scripts.


You can use ZX to have Node in your shell: https://github.com/google/zx


> Ryan is the kind of guy that gets obsessed over ONE THING and goes berserk for 5 years on that topic until he overdose and quit abruptly.

My problem with that statement is: if you knew him personally it's unlikely you'd have said it. And if you didn't, do you have enough samples to be sure he's 'that kind of guy'?

Or are you extrapolating from N=1?


What about Deno's API is still so barebones?

Are you comparing Deno to PHP? Should Deno have all the various database drivers baked in? SDKs to call out to Salesforce/Stripe/Zoom? Should I pay a performance penalty for my app because you need to be able to read XML and make SOAP calls and want a tool that does that out of the box? Would you be happy to take a hit if the roles were flipped and I wanted Excel read/write?

I won't say Deno is perfect. I am still concerned/uneasy enough about how the whole dependency tree resolves and is managed for complex dependencies to not want to try it in production right now. But w.r.t. the ecosystem, I'd say it's as good as if not better than Node's, albeit smaller, given the focus on building a stdlib within this ecosystem for common use cases (HTTP middleware, database drivers, common file formats, etc.).

> everything set at "read/write" because they need to read one file from your ".env" folder or perform an arbitrary post install process...

The CLI args allow for tighter scoping than that, and at least the topic of sandbox-by-default is being discussed and the pain points/edge cases are coming to light, rather than the Node/npm model of "execute as the user and done."


> I don't see how Deno is solving this; all the APIs again seem so barebones... instead of having 1000+ dependencies from NPM, you'll have 1000+ dependencies from remote URLs

With Deno you can already do a lot with only what is provided by the main executable. Here's a subset of the available subcommands:

    bundle: Bundles JS. While it doesn't do everything that webpack does, it already provides enough to deploy SPAs. 
    coverage/test: Builtin test/coverage framework.
    fmt/lint: Builtin lint/formatter.

IMO these provide basic tools that are likely necessary for any JS project, yet with Node.js you need a few hundred NPM deps to achieve the same functionality.

Not to mention the builtin TypeScript compiler. Starting a few years ago, I stopped even considering the possibility of creating a JavaScript project without using TypeScript as the main language. With Deno you have it builtin.


Oh yeah forgot about built in TS, didn't even know about the build tools. Interesting, might have to give it another run


> Starting a few years ago, I stopped even considering the possibility of creating a JavaScript project without using TypeScript as the main ...

The truth value of this statement is suspect but..

In any case, why should typescript be the default when it compiles to JS? Why shouldn't js be the default in a js framework?


I don't understand what you mean by "default". Deno simply provides a builtin compiler which allows it to run TypeScript transparently. Clearly Deno also supports JS out of the box.


I personally like having few dependencies in my projects and a simple runtime, versus having a monster of a language with a huge amount of unrelated and specialised APIs (Java) and still needing to install some additional dependencies.

I think the barebones nature of Node.js and JavaScript is what makes it great. If you don't like it, don't use it; there are other languages and runtimes out there, and Node is a really good fit for a lot of people.


I don't. I'd rather trust the tens of thousands of developers working on the core language and core stdlib than a dependency some random guy in Albania maintains in his spare time.

Or in the case of the JS ecosystem, you might only use established dependencies, which in turn use dozens more which in turn use dozens more, and the probability that there's a dependency some random guy in Albania maintains very quickly approaches 100%.

Have we already forgotten left-pad?


If you have a problem with a library written and maintained by a random guy in Albania, you have the option of not using it. If a functionality is so niche that you can only find one fit in the entire npm ecosystem, I doubt this functionality will ever make it into the standard lib of a non-Node runtime.

Personally, I like dependencies written and maintained by a random guy in Albania in his spare time. And I would use them when making fun stuff at home. I might even open an issue or a pull request. Dealing with a random guy in Albania sounds way more fun than dealing with a language committee in Silicon Valley.


How many people audit their dependency authors more than 1 level deep? That's the problem: I know who wrote all of my first-level dependencies (react, react-router, redux, reactstrap, etc). I don't know who wrote _their_ dependencies, or the 3rd and 4th levels. And I don't think anyone has the time to adequately re-evaluate that every time a dependency's version gets bumped, given how deep the graph goes.


I'm like you. I want as few dependencies as possible. I liked this about PHP: You could get very far without any dependencies at all.

For a toy project, pull in some dependencies. For a more serious project that requires you to do due diligence on every dependency you pull in, it gets annoying very quickly.

Also, I kinda dread all the fast-changing version numbers in the JS ecosystem.


There is nothing stopping you from looking at their code and, after vetting it, copying the code and pasting it into your own local JS files. Now you don’t have to worry about anyone tampering with it after you have vetted it.


Sure, I'll do that next time I'm at work, I'll tell the frontend dev running `npm install next` to spend the next 6 months doing a code review of the 258 dependencies in the tree. Boss will have to wait.

https://npm.anvaka.com/#/view/2d/next

There's dependencies like webpack, and "dependencies" like lodash-sortby, is-number, isarray, diffie-hellman, encoding, is-negative-zero or assert. Who in good faith can argue that those are better served as standalone dependencies maintained by who-knows-who instead of being in a standard library?

I so wish someone had the balls (and good enough OpSec) to inject malware into one of those 5 lines long dependencies, causing hundreds of billions of dollars in damages, and then we'll perhaps do something about it.
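To make the triviality concrete: a micro-dependency like is-negative-zero amounts to a single comparison. This is an illustrative one-liner, not the package's literal source:

```javascript
// -0 === 0 is true, so the sign of zero has to be recovered by division:
// 1 / -0 is -Infinity, while 1 / 0 is Infinity.
const isNegativeZero = (n) => n === 0 && 1 / n === -Infinity;

console.log(isNegativeZero(-0)); // true
console.log(isNegativeZero(0));  // false
console.log(isNegativeZero(-1)); // false
```

Logic this small arguably belongs in a standard library, or inlined in the consumer, rather than behind a dependency edge maintained by who-knows-who.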


In practice, you rely on libraries that are popular and/or written by someone trustworthy. "Vetting" libraries amounts to thorough tests of the complete application.


Java is not really a good benchmark for having lean build artifacts. It is a monster but it has other desirable qualities.

In .Net land, the dotnet core runtime weighs in at about 30MB (what you'd need to run a production server); the SDK is about 140MB (for development machines and build servers, a once-off setup). If you compile a project that depends on 5 other packages, it will only include those 5 packages and a package for your actual project (assuming 1 project per solution, else n packages for n projects per solution). It boils down to having build artifacts that are super lean, provided you are using the installed runtime on the target machine. You also have the option to package the framework along with your own package; then you don't have to install the runtime on the target machine, and these typically compile down to less than 100MB. It is probably less by now, but I don't use it. You also have the option to bake everything into a single file, much like how Rust does it.

So yeah, I wish more people would play with .Net Core and its tooling a bit; it's bloody great at the moment. Java and its tooling feel like a behemoth once you get used to the new dotnet tooling.

Take it from somebody who builds build servers and custom tooling (cloning git repos, building prod binaries, packing them if needed and then moving them around (deployment, NuGet server, etc.), all on Linux, with C# code): it is a dream. Calling the dotnet build tools from my own console apps is a no-brainer. My build tools can then be called by other processes in Linux like any other CLI app, or if I build them as ASP.NET projects, a simple middleware intercepts calls from nginx to trigger workflows remotely... easy peasy. All while the build tools can talk to Digital Ocean, Azure, AWS via their APIs...


Simple and lightweight sound great in theory, but some problems are complex enough that they require a lot of logic, and that complexity needs to live somewhere, whether it's your code base or a dependency. Node.JS could provide its libraries in a modular way so that you only install what you need. Essentially, that's how npm works now, except that the packages are provided by random people, and oftentimes the work of getting them to work together to form a complete solution to a problem is left as an exercise to the reader.

When everything is broken down to very simple packages, you often end up in a situation where your dependency tree is very deep and now keeping track of which packages you use and vetting them becomes a complex task. Many devs are too trusting of the packages they take a dependency on. Remember that npm package that everyone used but was buried like 3 levels deep in people’s dependency trees, and then maintainer got tired of working on it so he handed it over to someone else who then purposefully injected a vulnerability into it which affected a lot of projects?

> If you don’t like it, don’t use it

Again, simpler said than done. My guess is most of us are working on projects where we don’t/didn’t get to choose the tech stack.


I don't understand all the hate towards .Net. The included libraries are amazing.. about 80% of what you will ever need is provided by the framework. It basically provides you with a massive selection of tools, ready to go. The rest you can either build yourself or pull in a (precompiled) nuget package.

It is my main gripe with JavaScript (and with TypeScript): the lack of a standard library that everyone uses and trusts. Something that should be predictable and boring is absolute chaos in the JS world. I think it's half the reason things like jQuery, Moment.js, Lodash and others exist: people got frustrated with the lack of built-in functionality. npm has just made everything worse. Can't we have a .Net-type framework for JavaScript? Minus the CLR and compilers of course, just the framework bits. Or if we need some kind of CLR-type layer, why not build it with WebAssembly? Then all flavours of JavaScript could call into that? There has to be a clean way forward.


> It is my main gripe with JavaScript (and with typescript): a lack of a standard library that everyone uses and trusts. Something that should be predictable and boring, is absolute chaos in the js world

This is something that Deno is attempting to build with its stdlib: https://deno.land/std

While the stdlib is not shipped with Deno (it is downloaded like any other third party dep), the code there is reviewed and audited by the core team.


> I don't understand all the hate towards .Net.

Because it's the same over-abstracted over-engineered life-sucking ecosystem as the Java world.


I think that's more a property of the C# code that's out there rather than a property of .NET or C# themselves. A lot of (most?) C# and Java code in the world is over-abstracted over-engineered and life-sucking just by virtue of them being popular corporate languages and most code being boring. Using C# with Unity, for instance, is a pretty good experience though.


That really depends on the developers involved. Some people like building over complicated complex nonsense to justify their existence, while others build very lean/shallow code and go on with their lives.

The code I write for production in the "realworld" is maybe 1/4 as complex/convoluted as the academic stuff we did in university. And lines of code -wise, maybe 1/10.

You don't need to build an excavator to add some dirt into potted plants, but I can bet there will always be people who build space-grade shovels with redundant enterprise-level handles that guarantee maximum soil filling rates, even under water... but not everyone is like that. C#/.Net Core definitely doesn't throw you down that path.

I personally don't like AspNet Core, as plenty of the old mistakes are being repeated and some of the same patterns exist, which I'd argue Microsoft had the opportunity to move away from, but didn't. But .net core itself is pretty great (and lean, no over-abstraction in the core system).


Yep, was exploring ASPNet Core for a new project and was like nope ;)


But the bad ecosystem in npm is one of the main reasons he made Deno; that's what deno/std is for.

While not stable yet, they are working to address this very issue

Zig is looking great but also, solving a different problem


Excuse me, but what's this "bad ecosystem in npm" you're talking about? Every single JS lib, pipeline tool, framework is on npmjs.com (react, webpack, bootstrap, expressjs, and 100'000s others). It's the ecosystem that every contender would love to be.

And the lack of a "stdlib" is exactly how and why npm started over ten years ago, via the community-driven CommonJs initiative (JSCI, connect/express.js, the package.json format, middlewares, etc). The idea being that the core packages on npmjs.com are the stdlib on top of what Node.js/CommonJs provides.


> Every single JS lib, pipeline tool, framework is on npmjs.com (react, webpack, bootstrap, expressjs, and 100'000s others). It's the ecosystem that every contender would love to be.

This is only a strength if you accept that those libs (and their dependencies, and their dependencies' dependencies, and so on...) are adequately scanned for malicious behavior. If you don't accept that, then the incredibly deep dependency graph that is typical of frontend projects these days is a liability.


While that's true, this is really orthogonal to the argument. Especially since Deno's API is also anemic, as complained about elsewhere in this thread.


> It's the ecosystem that every contender would love to be.

Trying to clarify - do you mean other JS ecosystems? Outside of JS, NPM is usually used as what not to do, not as an aspiration.


Could you share some examples of ecosystems that are 1) vibrant and active 2) have working, open source, ergonomic tooling of a comparable caliber to VSCode, typescript and friends 3) can target almost any platform, including but not limited to server, mobile, desktop and web?

I’m trying hard to think of any, Java and Python come closest but both fall short.


There are vibrant and active communities around good projects, but npm is the greatest known repository of abandoned, obsolete, not very good and potentially malicious libraries. The bad scales up along with the good; great tools on npm don't make the Leftpad fiasco more forgivable or technical shortcomings less bad.


Fair enough, but I have no idea how that can be avoided if we take Sturgeon’s Law as a given: 90% of everything is garbage.

I’d argue an essential quality in a modern software engineer is ‘good taste in dependencies’, if you will. Adding a dependency for padding a string with whitespace would have gotten you a friendly but stern lecture from a senior dev, in every good team I’ve been a part of so far.


npm together with the ergonomics of JavaScript/TypeScript is what keeps me in the Node.js ecosystem. I never understood the hate for a massive ecosystem of community-built libraries that you can contribute to, fork or modify at will.


Deno has passed 1.0. Having the standard library still be "not stable yet" doesn't spark much confidence in me for something meant to replace node.


Yes, I would have wanted to see a decent stdlib to be a goal for 1.0.

I recently wrote a 10 line script that had to work with date arithmetic *, and while importing URLs is pretty cool, it was still the same shit: spending 80% of the time it took to write the code browsing and evaluating multiple third-party date libraries to find one good enough for my use case. So in practice the only improvement Deno had over Node in that example was that I didn't have to run `npm install`. Yay, great.

*: adding two dates together, converting to and from UTC, given a Date find the next midnight. And as expected the most popular JS library couldn't even get one of these simple tasks correct, there's a bug report open since 2017. Incredible stuff from this ecosystem.
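
For what it's worth, the three tasks in that footnote can be sketched with the built-in Date alone, no library needed. A minimal sketch, assuming UTC-based arithmetic is what's wanted (the example dates are made up):

```javascript
// "Date arithmetic": Date has no add operation; you work in milliseconds.
const start = new Date(Date.UTC(2021, 0, 31, 22, 30)); // months are 0-based!
const plusOneDay = new Date(start.getTime() + 24 * 60 * 60 * 1000);
console.log(plusOneDay.toISOString()); // "2021-02-01T22:30:00.000Z"

// Converting to/from UTC: a Date stores a UTC timestamp internally;
// the getUTC*/get* accessors read it in UTC vs. the local zone.
console.log(start.toISOString()); // "2021-01-31T22:30:00.000Z"

// Next UTC midnight after a given Date: Date.UTC normalizes overflow,
// so "day + 1" rolls over month/year boundaries correctly.
const nextUtcMidnight = new Date(Date.UTC(
  start.getUTCFullYear(), start.getUTCMonth(), start.getUTCDate() + 1
));
console.log(nextUtcMidnight.toISOString()); // "2021-02-01T00:00:00.000Z"
```

None of this handles local-zone midnights or DST, which is exactly where the third-party libraries come in.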


I have yet to find a datetime library in any language that was adequate for all purposes I've needed. That's not a particularly damning example in my opinion. I've had to write very weird datetime code in JS, Python, and Ruby. I don't even want to imagine the horrible things you could find in some other languages.


In Java, Joda-Time was a third party library for ages. It was the dominant date-time lib for enterprise development. Then, it was standardised and added to the stdlib: java.time (JSR-310). The Joda-Time project lead, Stephen Colebourne, ran the standardisation process. It was well received by most, even the breaking API changes that SC was adamant were flaws in the original Joda-Time API. I can vouch for it: Both Joda-Time and the java.time libs are excellent. (In my job, I regularly need to perform complex date-time transformations, including time zones.)

I have also used Howard Hinnant's C++ date-time lib: https://github.com/HowardHinnant/date It is also very good. (He was the guy behind move semantics in C++ 11.)

In Python, the stdlib has a function to get "UTC now" that does not include the UTC timezone... so the result weirdly and surprisingly acts like the local time zone! There is endless shit-posting about it. I feel bad for Guido van Rossum et al. To be fair, the original Java date-time lib was horrible too, which is why Joda-Time was created. And JDBC (Java database) dates are still horrible.


There is a difference, however, between "does all the things my snowflake app needs" and "does the top 20 most common things". The latter is easy, and even quantifiable if you have the time (to scour open source repos for use cases). When you write dozens of utilities, a few libs, and several apps, it's likely the common date-time use cases will be suitable for 80% of them. Rinse and repeat across numerous other dependencies, and the result is a std library you learn once that serves 80% of your use cases just fine. It's not about no dependencies; it's about dependencies focused on novel or niche problems, not common and already-solved ones.


Just because they are attempting to address it does not mean they will succeed.


So what? Should we all give up now and stop doing whatever we're doing now, just because we might not succeed? That's how people learn, advance and improve the world around us - through failure and mistakes.


> Just because they are attempting to address it does not mean they will succeed.

Strongly agree with this statement, hence I don't see how switching from "npm" to "raw URLs" will solve anything...

The problem with Node dependencies is bigger than just "npm is not a good package manager"... Honestly, in that case just fork Node and replace npm with something else...

Here the problem lies in a mixture of poor built-in APIs, which are buggy, and a lack of vision for the language, both of which have been there since its origin...

Deno doesn't seem to address those at all...

Again, it just seems to be "npm is bad, and I want to use TypeScript natively with web APIs"...

I just know very well that Ryan is repeating exactly the same mistakes as with Node, with the same obsession he had with "epoll"[0] back then, and that it will end up in a new fiasco.

[0] https://youtu.be/M3BM9TB-8yA (can't find the specific part where he mentions "epoll")


> I just know very well that Ryan is redoing exactly the same mistakes as Node with the same obsession he had on "EPOLL"[0] back then that will end up in a new fiasco.

"epoll" is a Linux API for listening on multiple file descriptors. Different platforms have equivalent APIs, and these are normally at the core of any scalable non-blocking I/O network program.

Can you elaborate on why you think he had an obsession with "epoll"? More importantly, can you elaborate on what the fiasco was? epoll is still used under the hood by Node.js (through libuv) and by many other network servers, such as Nginx.


npm may be bloated, but I'd say it's better than pip. pip is insane: you have to set up a virtualenv (hurts UX), but with Node you don't have to.


> (...) will end up in a new fiasco.

Are you saying nodejs is a fiasco?


re-[0]: 15:43


If they don't attempt to address it they'll certainly fail.


Yes but let’s not get that cynical about it. People are working hard on these problems.


deno is definitely better in the sense that it's way more compatible with the browser, and I think that's a really big thing in the long run (but nodejs could do that too, eventually)

regarding 1000+ deps, yes that's a bad thing but it's not really about language, it's rather about people. when node started, usual number of dependencies was low.

I know because I was there and I was making fun of maven and how it pulls half of the universe for a simple thing. Now nodejs is pulling the whole universe.

Yet the problem, in my opinion, is not the package manager but rather "look, I made a package, it does one small thing and it does it well and I don't want it to do more", which leads to many more packages: you really need that thing, so what are you going to do? You add a package on top of a package. Rinse and repeat, and here we are.


If the standard library would be richer you'd just ignore half a million of those packages. They'd just die a quiet death.


Look at the python ecosystem and you'll see that it's not the case. Because of its compatibility commitment, a standard library cannot evolve much and its features end up being replaced with external libs.

“The standard library is where modules go to die”


Isn't that one of the big justifications for not bundling the standard library?

It seems to me like not shipping the std library with the runtime is potentially one of the biggest language innovations we've seen in a while, because it should allow the std library to evolve over time in a much more graceful way: you'll actually be able to make breaking changes to the std library, as folks who are unwilling to update their code can just use the older version (until the ossified code becomes irrelevant, which code that is never changed eventually will).


How does it work when you want to use third-party libraries, though? (Because even if you have the biggest stdlib in existence, you're still going to end up using some external libraries no matter what.)


Python packages, in general, are much bigger though.

And you'd probably be surprised how much the stdlib is used. In many environments third-party libraries have to be vetted by security, or the developers are junior and probably can't check/understand a third-party library, so they just take the safe option: use the stdlib and hand-code a bit to make it do what they want, etc.

Plus, the third party packages that are used generally have to offer much higher convenience or quality or scope (or all three) to be adopted over the stdlib alternative.

So the bar is much higher than leftpad or is-odd.


Not really. I mean, what do you think is missing from the standard library json package? It obviously solves most use cases, since just yesterday Flask dropped simplejson. The standard library is just not great for libraries that are not yet stable.


Exactly this. If anything, Deno can help introduce the use of standard libraries as a source of truth to webdevs who may not be familiar with the concept to begin with.


That does happen (and is happening), its just not often big news https://twitter.com/sindresorhus/status/1320788906888089600


I think you're underestimating just how passionate the Node crowd is on customization and reusability. There are feature-rich, extremely popular packages which act as a stdlib in many ways for particular functions - yet there are constantly alternatives to ecosystem-dominating packages that spring up. Some gain traction, some do not. I don't see this changing, even with a robust stdlib. It's the culture around the toolset that drives this.


That's just a post hoc rationalization.

The whole "culture" popped up because people wanted to share code between browsers and backends and there's no tree shaking in Javascript, so libraries had to be super small and modular to keep the code small for the front end, where download/unzipping/code parsing/compiling code speed matters.

If browsers get a big stdlib, many of these libraries will just go away (bye, leftpad!).


> regarding 1000+ deps, yes that's a bad thing but it's not really about language, it's rather about people.

Not sure that I agree that it's about 'people', except in the sense that every problem with languages/their ecosystems is a people problem because people created them; but I 100% agree that it's not about the language.

My take on the situation is that we have 2 separate issues:

1) Auditing, which is basically an economics issue. It'd help a lot if someone with pockets full o' money were willing to fund a couple mil of auditing infrastructure for npmjs.

2) Devs pulling in lots of packages (which pull in packages, all the way down), which _may_ be partially mitigated by a better base language (no more leftpad, etc). Personally I'm skeptical of the better runtime/language solution.

I think one thing that might help is if there was some automatic way of marking packages as 'safe' in the sense of no side effects, no writing to files, no network activity guaranteed. Such packages could be installed with confidence, and have a lower priority for auditing.

Another possible solution would be a cultural shift among developers to prioritize reducing dependencies with every release. I'd love to see that in a release notes, how many packages were added/removed!


> because you really need that thing so what you are going to do?

When that thing is as simple as left_pad, I’d just copy and paste it into my own code. Or just write it myself.

When did so much of development become gluing other people's code together? Don't we all know how to write something as simple as left-pad? Why was it ever a good idea to pull it in from somewhere else?
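
To illustrate the point: left-pad has since been subsumed by the language itself (`String.prototype.padStart`, ES2017), and even hand-rolling it takes only a few lines:

```javascript
// Built into the language since ES2017: no package needed.
console.log("5".padStart(3, "0")); // "005"
console.log("abc".padStart(5));    // "  abc"

// And if you had to write it yourself anyway:
function leftPad(str, len, ch = " ") {
  str = String(str);
  while (str.length < len) str = ch + str;
  return str;
}
console.log(leftPad(42, 4, "0")); // "0042"
```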


Comparing node, deno and zig is like comparing red apple, green apple and sushi.


But deno is written in Rust, so obviously the Zig squad had to chime in. /s


I am trying to expand my horizons from only Node to other frameworks. I've seen so much hype around Deno, so I looked into it the other day, and I feel the same as you. It doesn't seem different enough from Node from a user standpoint. I also agree that the package system seems messy. For right now I'm going to steer clear and check out Django and Rails.


Try Elixir and thank me later.


Even as a long time Erlang user, the rise of node.js was just sad to watch. I understand why it happened, but watching from the outside what kind of nonsense they were doing, when such better solutions existed, was sad.


Elixir's standard library is so pleasant to use. Coming from the JavaScript world, it is such a breath of fresh air.


Elixir's stdlib is great. It's small, based on a just a handful of concepts, but thanks to how powerful the concepts are it covers a lot of use cases. Every module contains pretty much everything you'd ever need to work with the concept that module implements, be it a String, an Enumerable, a Stream, or anything else in the stdlib.

But, Elixir is cheating. It can stay clean and compact in part because it sits on top of 30 years of development of the Erlang stdlib. The Erlang stdlib is messy, spread across multiple applications and modules, with module interfaces inconsistent with each other, not to mention parts of it still include compatibility layers for Erlang/OTP versions so old that they didn't have lambdas yet. But the functionality is there, which enables Elixir to have a small, focused stdlib: when something is missing, you can just grab an Erlang equivalent.

This is similar to what Clojure does on the JVM. JS doesn't have the luxury of sitting on top of a battle tested stdlib, and trying to cover all the functionality of such stdlib is what results in incomplete and unstable APIs and reliance on so many external packages.


How’s the lack of static types treating you? Honest question as someone looking at Gleam.


I've generally had less typing issues in Elixir than I had when working in Python (which generally wasn't a ton but varied a lot depending on the library).

I've found that Elixir's functional programming model generally alleviates type issues. You're always thinking about what you are passing into a function or what function you are calling and with what data structures.

A lot of type issues in JS, Ruby, Python, etc. seem to come from the mutability and object oriented parts.


> all the APIS seems again so barebone..

What is an example of something you are hoping for?


I setup my business entirely around those remote urls.

https://deno.services


> 1000+ dependency from NPM

I've been working with Node in hobby projects since around 2012, been paid for it since 2016, and still don't understand why that is such a problem.

Compared to other language ecosystems, each of those dependencies is smaller and more atomic. If anything, it's closer to the "unix way" of small tools that do one thing and do it well, rather than developing huge mega-libraries. Since these libraries are smaller, it's easier to swap one for another.

Because of that, community is much less likely to settle on one standard way of doing things just because of "how things are done here", and ecosystem continues to evolve and find better ways of writing code. Would any other language ecosystem that is widely used in production go from callbacks to different promise libraries, to standard promise api to async? I don't think so. (Edit: strike that, Rust seems to have done it too. Well, Rust is also awesome). Of course, it means that you have to learn more; but it also leads to things actually becoming better, and not because of some central mandate by language committee, but as a result of a more decentralised gradual evolution. (Not completely decentralised, just compared to alternatives).

In any other ecosystem, pushing a pull request to any framework or library feels like something you would do only after spending a couple of days learning all the ropes of the codebase; in npm, I've made a meaningful contribution to a library less than an hour after learning about its existence.


> still don't understand why is that such a problem

Some reasons it's a problem:

- it slows and disrupts the development process

- packages get abandoned very easily; not many packages are highly popular/active

- security audits are essentially impossible

> Of course, it means that you have to learn more

JavaScript takes this to an extreme. It literally takes daily effort to keep up.

> In any other ecosystem, pushing a pull request to any framework or library feels like something that you would do only after spending a couple of days of learning all the ropes of this codebase

So you prefer an ecosystem created by amateurs? After years of working with PHP and JavaScript, I don't.


If only they had at least cleaned up the unnecessary promises from the async/await implementation: https://es.discourse.group/t/callback-based-simplified-async...


I can see why npm is annoying, but I wonder what would be a good role model for a better ecosystem. I can personally only compare npm to PHP, Python and Java, and I think npm is far superior. Do you have an example of a better ecosystem/package manager? I'm genuinely curious.


I think some people like to compare Deno’s package management to go’s. I think that is a good comparison since dependencies live (sort of) on a URL in both cases. However I think Deno has improved significantly on the go approach.

When I look for a better ecosystem though, I like to look towards Rust’s Cargo. However that is bit of an unfair comparison since Cargo was heavily inspired by npm, had learned from npm’s past mistakes and were able to improve on it significantly.


Cargo is one of the nicest build systems I've used. Coming from maven, I have nothing but love for what cargo brings and how easy it makes it.

The key thing for me with Cargo (and Rust) is the documentation. I'm able to quickly glean what I need to do from the docs, and often with useful examples that are close to my use case.

I do wish the package ecosystem was set up with namespaces. Abandoned crates, name squatting, etc. should really be a thing of the past. But I guess this fosters creativity in names.


I use NodeJS daily, but I do agree some fundamental APIs are missing. For example, calling remote URLs is critical for most apps, yet Node only offers the "http" and "https" packages (and why different packages!). It should at least have "fetch" support, rather than requiring you to install the "node-fetch" dependency.

I'm glad Deno is here trying to push the ecosystem forward, at least with Typescript, and hopefully Node will learn from them.


Still not tempted by NodeJS or Deno


> Software is buggy. Since when does a court accept the uncorroborated report of software, without any other corroborating evidence, and sentence someone?

Former PwC auditor here. The court accepts evidence from an independent IT auditing company qualified to perform an audit on the system and assess its reliability, or from the vendor himself, who provides proof that the system he sells is reliable (testing, certifications, etc.).

If you're a developer, you'll probably agree with me that this approach is of course nonsense, because no software was ever created "bug-free" or can really be defined as "reliable".

Yet that's how the court works: "The vendor says the software has no bugs, thus the court rejects the objection of buggy software. The court finds you guilty."

I'm saddened by this news because it shows how tightly "bad software" and "corporate/enterprise software" are tied together, and how much they can impact people's daily lives in irreversible ways.


> or from the vendor himself who provide proofs that the system he sells is reliable

It seems that this was the conflict of interest that made Fujitsu's testimony unreliable. They couldn't admit fault without risking the contract. And I'm not sure that relationship was addressed in court for a jury to note.


What kind of auditor if you don't mind me asking?

And, do you hate Lotus Notes?


I'm really concerned about the alarms raised by the scientific community in regards to "space pollution".[0]

Elon Musk has been dodging the question for years and has never given a clear answer, aside from an "umbrella" joke...

Some astronomers are suggesting that with multiple space telecom companies (US + EU + China), we would potentially never be able to see a clear night sky again. At least not without visual pollution.

[0]https://qz.com/1971751/a-flood-of-spacex-satellites-started-...


SpaceX has put in a lot of effort to reduce the visual pollution aspect of starlink.

See https://twitter.com/ralfvandebergh/status/136999054076322611... for how the visibility of Starlink sats has changed over time.

For professional earth based astronomy, it is possible to remove the streaks digitally. But of course there is a slight impact. But what is the alternative? Just stop development of low earth orbit forever?

The future of professional astronomy is space based. Imagine what a telescope you can launch with a single starship launch...


We could be constantly lifting new observatories to space, launch is no longer the constraint, but satellite manufacturing and cost.

NASA needs the SpaceX equivalent of an org that churns out satellites. The next bus to orbit leaves shortly.


I agree, and this applies to satellite manufacturing in general. Starlink has demonstrated that the new launch cadence requires a switch from single unit and small scale to serial production.

It's unbelievable that we don't currently have standard designs not just for observatories, but for communication, navigation, cartography and other satellite types. Cubesats took a step into the right direction, the same needs to happen for even larger payloads.

Another side of the problem is that NASA's budget is heavily influenced by politics and PR. I'm sure there are plenty of smart people there who have realized that, from a purely scientific point of view, ten or twenty less capable and more disposable interplanetary probes or observatories could have an advantage over unique, absurdly expensive projects like JWST and Perseverance. But they are not as exciting and are harder to sell to politicians and the general public.


Launch is not a constraint for... about a year now. Satellites have a bit more lag time.

I don't think NASA needs the SpaceX equivalent for satellites - SpaceX itself is causing a boom in satellite manufacturing, so commercial market is accumulating expertise and lowering prices. NASA should find it easier and cheaper to buy or subcontract pieces of satellites too, and focus on bespoke mission-specific hardware.


Maybe SpaceX should make it up to the scientific community by promising to (at no charge) put 100T of satellite telescopes in a high orbit every year once Starship is functional.


That would be awesome PR, and not that expensive with starship.

But for low production rate things like telescopes, launch cost is almost negligible even at current launch prices.

A replacement for hubble could be launched with a single falcon 9. Building it would cost more than a billion.

The James Webb Telescope is at 10 billion USD and counting. Launch with very expensive Ariane 5 will cost maybe 200 million USD, so ~2%.


I don't know why you're being downvoted... maybe it's the "no charge" aspect of your post, but SpaceX offering to send up research telescopes for at-cost-of-launch, or maybe a little over, would be a great philanthropic endeavor.


It would be a great philanthropic endeavor indeed, but I personally have a problem with suggesting they have to do this to "pay back" to the community. They've already paid back to everyone who ever considers launching anything to space, by cutting off a zero out of launch costs - and they're about to cut off another zero.

Launch costs tend to be a small part of mission costs for bespoke scientific hardware - but what makes those missions expensive is a feedback loop: rare and expensive launches -> need to make best use of the mass budget -> increased complexity -> need to make more robust -> increased complexity -> more expensive -> rarer launches -> more expensive launches. SpaceX just kicked that loop into reverse. With that much cheaper launches, people can afford less robust and less complex missions, and do more of them, which lowers the costs as scale kicks in.

SpaceX is making space cheap. That's already a great gift to everyone.


Mirror size is the issue - until we can easily manufacture huge, incredibly precise mirrors in-situ, space based will never replace ground based astronomy.


It’s not manufacture. It’s assembly. And I think astronauts are faster and cheaper than robots for in space assembly. Or they will be once Starship is operational. NASA did a ton of work in EVA orbital assembly with Shuttle (and still chooses to do exterior work on ISS via EVA and not purely robotically) but it was always like 10 or 100 times too expensive. Starship ought to change that. In addition to its 8m diameter payload bay.


There aren't many telescopes with a >8m diameter mirror, and it looks like the largest single mirror is 8.2m

Starship's payload is 8m.


I don’t see how anyone who has read broadly on this topic could say with a straight face that SpaceX has been just dodging this.

SpaceX has done more to mitigate visibility of their satellites than any satellite maker/operator in history, with the possible exception of classified payloads. They installed sunshades that reduce the visibility of the satellites when fully deployed (in operational orbit) to below the visibility limit in almost all conditions. You have to have exceptionally good timing, eyesight, and dark skies to catch recent Starlink satellites once operational now. But a satellite like ISS is so bright and obvious, you can even sometimes see it in the daytime. (ISS is as bright as all new operational Starlink satellites combined.)

Read this article to see the significant changes they’ve made: https://www.spacex.com/updates/starlink-update-04-28-2020/in...


SpaceX is taking concrete steps to make their satellites as unobtrusive as possible without significantly compromising their design. Beyond that, this is really a choice between developing space and not. Quit with the FUD about Musk “dodging” questions, etc.—say what you mean, which is that you think pristine night skies should forever take precedence over the economic development of space. Or, if it doesn’t sound good stated so straightforwardly, don’t.


I can't wake up and see the world without loads of buildings and roads and cars in sight either.


No different from city light pollution preventing optical telescopes being near cities.

Or radio pollution causing constraints for radio-astronomy.

My guess is that cheap reliable worldwide internet connectivity will help science overall by far more than the costs of modifying terrestrial optical observations to mitigate the extra satellites.


There are already 1200 of them in orbit, can you go outside and point at a single one?

When this controversy was started they addressed concerns by reducing the satellites reflectance and it seems to have worked.


A couple points:

1) 1200 is a small percentage of the tens of thousands of satellites that the full system, plus the other similar systems will eventually consist of.

2) Being able to go outside and point out a satellite has absolutely no relevance on the satellite's impact on professional astronomy.

3) It did not "seem to have worked". SpaceX experimented with reducing the reflectivity of their satellites, but only some satellites have that reduced reflectivity, and astronomers found them to be only marginally better.


Yeah, and what's going to advance humanity as a whole more significantly, I wonder... high-speed Internet access for the entire planet... or professional astronomers not being able to see into space as easily as they did 20 years ago?

or... Or... OR... Launching massive powerful telescopes into space using SpaceX rockets for professional astronomers to use?


Their livelihoods are under attack, they're going to fight it regardless of the upside of starlink.


I have no idea who you're addressing. If it's me, that's quite a nice strawman you've built. Next time perhaps try to actually address the words written in my comment rather than inventing some boogeyman that nobody brought up except you.

Here, I'll demonstrate:

>high-speed Internet access for the entire planet

Musk himself has said that only a tiny, tiny fraction of the world will ever be able to use Starlink. It is not anywhere close to "internet access for the entire planet", and certainly isn't providing any more internet access than is already provided by existing satellite internet providers.

>or professional astronomers not being able to see into space as easily as they did 20 years ago?

In case you weren't aware, astronomy has been responsible for some of the most significant scientific advancements for literally millennia. If you really want an answer to your questions, it's this: professional astronomers being prevented from doing research is significantly more of a negative impact on humanity's advancement than the positive impact from 0.001% of the world having access to lower-ping internet. It's not even close.

>or... Or... OR... Launching massive powerful telescopes into space using SpaceX rockets for professional astronomers to use?

Even with something the size of Starship, it's physically impossible to launch anything even remotely close to the size of telescopes needed by professional astronomers.


Surely they will only be visible a short time before sunrise or after sunset, when the satellite can see both you and the sun?


If you know the position of the satellites it's very easy to mitigate their effect.


Low Earth Orbit (LEO) satellite orbits decay rapidly thanks to atmospheric drag.


> scary time in tech when end-users are literally seen as an unrealized capital asset

That's my conclusion after trying to raise capital. VCs and angels seemed disconnected from reality and behaved like pure financial analysts.

No emotion or even interest in the idea, just: "How much ARR? Churn?"

I was really shocked by how people behaved, but it's better to have a cold shower now than in ten years, I guess.

I feel like the software industry is now tightly integrated with the capital markets; it's saddening to see.


Venture _capitalists_ and angel _investors_ are money focused? Who would've thought it. They want to buy a portion of your business because (they hope) it will make them richer; if they also think it's a good idea, that's just a bonus.


When was that not the case?


Just before the year 2000 tech bubble popped? :P So 20-ish years ago.


> If China wants to use protectionism to wage digital economy warfare, let's respond in kind. No more Wechat, TikTok, Alipay, or League of Legends or Valorant, until China gives equal treatment to foreign internet firms.

That's not how the US judicial system works. You can't ban dozens of apps based on "suspicious behaviour". You need to start an investigation, then collect evidence; based on that evidence a prosecutor can decide whether or not to sue, and ultimately a judge will decide whether or not to ban those apps.

It's why our Western constitutional democracies are doomed to fail.

Our democracies move and react more slowly than authoritarian regimes like China and Russia, plus we have many economic interests with them. It's unlikely any politician would ban those apps.

Trump tried with TikTok; a judge blocked the order[0], and I agree with the ruling. You can't block an app just because you don't like it. You need solid evidence to prosecute and stop a private entity.

[0]https://www.wsj.com/articles/tiktok-download-ban-is-blocked-...


> It's why our western constitutional democracy are doomed to fail.

Quite a hyperbolic leap from not being able to arbitrarily ban foreign software to "democratic countries are doomed". Weird.


I don't think we need to worry about Western democracy when talking about a software company shopping your browser history around. Apps and games developed in all corners of the globe have done this and are doing this.


Google has already been fined by the EU for this type of issue, to the tune of 150M. That's literally nothing, at least not enough to hit their wallet.

This type of battle takes months if not years in court and costs millions of dollars.

Google products can be shipped and removed in a few weeks, far beyond the operational reach of the current judicial system.

Today the problem is GHS; tomorrow it'll be "GSuite Safe Account" or "YouTube Safe Video", etc.

There is no point in taking Google to court just for a one-time penalty; it's a systemic issue tied to Google itself.


The EU is working on addressing these issues. It specifically wants to label companies like Google as "gatekeepers" in the Digital Markets Act.

From: https://www.pinsentmasons.com/out-law/news/gatekeepers-face-...

> In case of violations, the new law would provide for fines of up to 10% of a gatekeeper's total global turnover. Lasseur said: "The fines are high, but they correspond to the usual sanction regime for violations of European competition law."

> In the event of a repeated offence, following a detailed market investigation by the Commission, the company could even be broken if the Commission finds that there is no effective alternative to ensure that the gatekeeper obligations are complied with.

The fines will no longer be just a cost of doing business, but an existential threat. It may take 10 or 15 years in the courts, but the EU will break up the likes of Google for repeated offenses even if the relevant products no longer exist.

