Hacker News
[flagged] Is Crowdstrike the Final Straw? (gavinhoward.com)
29 points by gavinhoward on July 19, 2024 | 57 comments


> But if governments do nothing, then let us, as an industry, take the chance to do something to avoid another disaster that would give governments the political will to enslave our software, and our minds, to their wills.

Setting aside for a moment the flagrant hyperbole:

The industry will never do something without government pressure. It's a collective action problem: building reliable software carries an enormous cost that building unreliable software does not. No company is going to swallow that cost out of the goodness of its heart, and unless all companies in a sector move at once, they can't recoup the cost from the customer.

This is the exact kind of problem that governments exist to solve: without intervention the market is broken and safety will always be sacrificed to undercut on price. Whatever the author feels about regulation, it's inevitable in some form or another. The question isn't whether the industry will get their act together before governments start moving (they most definitely won't), it's whether governments will give the industry a window to self-regulate under direct and explicit threat of intervention or if they'll immediately start writing the regulations themselves.


I'd say that is entirely correct. I worked for a while on the Trustworthy Software Initiative, which looked at how we can make software more secure and reliable.

We found that the economic incentives basically work against it. The only practical ways to improve it seemed to be raising awareness and education at the people level (important but not sufficient), and regulation/liability at the corporate level (the only way to really move the needle).


> Whatever the author feels about regulation, it's inevitable in some form or another.

Yes, it is, and I am all for good regulations.

> The question isn't whether the industry will get their act together before governments start moving (they most definitely won't)

I wrote this post to change this.

> it's whether governments will give the industry a window to self-regulate under direct and explicit threat of intervention or if they'll immediately start writing the regulations themselves.

The EU's Cyber Resilience Act shows that they are already doing the latter.


Will Microsoft and Crowdstrike be forced to pay for this financially? It looks like Crowdstrike will mostly get away with this due to their EULA.[1] The "final straw" will be when governments refuse to let software companies disclaim responsibility for damages.

If you work for a company hit by Crowdstrike's failures, and are berated for it, a good move would be to call in the head of Legal and ask why they signed off on this.

[1] https://www.msn.com/en-us/money/insurance/crowdstrikes-terms...


"Will Microsoft and Crowdstrike be forced to pay for this financially?"

What is "this"?

It is possible to effectively disclaim liability for purely economic loss. For over three decades, we have all read the disclaimers, usually in capitalised text, in source code and "EULAs".

But, generally, it has never been possible to disclaim tort liability for personal injury or damage to "other property". (For the UCC Article 2 scholars, let's assume that the software is service-oriented, not a sale of goods.)

In theory, this exception could include tort liability for injuries caused by software. That is, software developers cannot disclaim liability for personal injuries or damage to other property. (Not to imply anyone could successfully plead such a case.)

For example, the top-ranking comments in another thread on the outage, which is approaching 3500 comments, are about the effects of the outage on people in hospital. Is this physical harm to persons as a result of "security updates"?

The blog post mentions the idea of a "standard of care" for software developers. That is an invitation to discuss tort liability for software.

Theories are just that: theories. Tort liability for software is all but nonexistent.


IMO the primary responsibility should go to the hospitals & banks that crashed. I wouldn't want to live in a world where the hospitals can say "Well, we talked to McKinsey and they said CS was great" and just wash their hands of this.


crowdstrike is caused by over-regulation, not under-regulation. no competent management team would voluntarily install a kernel module for remote surveillance and exfiltration of all their sensitive data, which is what crowdstrike does when it's working properly; it's only a thing at all because the pci guidelines mandate it

the places where the crowdstrike disaster results in additional regulation will not only be dystopically oppressive as howard predicts; they will also cause themselves additional infosec disasters

george kurtz, ceo and founder of crowdstrike, was the cto of mcafee when they did the exact same thing 14 years ago: https://old.reddit.com/r/sysadmin/comments/1e78l0g/can_crowd... https://en.wikipedia.org/wiki/George_Kurtz


> crowdstrike is caused by over-regulation, not under-regulation.

This is laughably wrong. Tech is notoriously under-regulated.

Crowdstrike is caused by bad regulation. Somewhat common, as regulatory bodies are typically not well versed in tech to properly regulate it.


not all regulation is bad regulation, but all regulation contains some percentage of bad regulation. you can't have more regulation without more bad regulation. sometimes that's still good on net

there are a lot of people who think 'tech is under-regulated', which is to say, more regulation would probably overall improve 'tech'. that point of view is implausible in light of regulatorily-induced disasters like this


> but all regulation contains some percentage of bad regulation

Extraordinary claims require extraordinary evidence. I know plenty of regulation that has no bad regulation within.

Case in point, I happen to work in the financial sector. The financial sector regulation tends to be, generally speaking, quite good.

> there are a lot of people who think 'tech is under-regulated', which is to say, more regulation would probably overall improve 'tech'. that point of view is implausible in light of regulatorily-induced disasters like this

Tech absolutely should be more regulated, especially given the abuse of all the major players in the field.


> Extraordinary claims require extraordinary evidence

Saying that a human activity (regulation) makes a non-zero amount of mistakes hardly seems an extraordinary claim. It is extraordinary to claim that there are zero mistakes in financial regulation. Please supply your proof of the negative.


> Saying that a human activity (regulation) makes a non-zero amount of mistakes hardly seems an extraordinary claim

That was not the claim, however. The claim was that "all regulation contains bad regulation", which is a broad, unqualified statement aimed at throwing a shadow upon regulation. The kind of bullshit regurgitated by libertarian types who think any regulation is icky.

As for financial regulation, all regulation ensuring that banks have enough liquidity to add robustness to the system is good. Things such as KYC and AML are also generally good (which was my statement, by the way). They may be improved over time, but anyone would be hard pressed to argue they should be removed for being bad.


if kyc and aml, very controversial regulations pushed by the 'intelligence community' which couldn't be successfully imposed until the 9/11 attacks, are your example of uncontroversially good regulations, it seems likely that it will be difficult indeed to find a regulation you can be convinced is bad


Do you work in SecOps? Do you manage the security of thousands and thousands of end user devices with actual users?

If so, you should be grateful for ways to have actual visibility into (and possibly blocking of) dangerous activity on the machine. Before EDRs, we were blind.

EDRs could be better, cheaper, more integrated (that part is not too bad), but saying that PCI-DSS is the only reason for them is disinformation (or a lack of actual experience with SecOps).


i can't believe i'm writing a reply to a guy whose name is broken spanish for 'brando, the little fuck'. (literally. there isn't an alternative interpretation.)

brando, you've got a lot of nerve touting the superiority of secops and edrs on the day when they caused the worst it outage in history, temporarily crippling the global economy in a way worse than any infosec attack secops has ever stopped

centralizing the security of thousands and thousands of end-user devices in the hands of a small team is a recipe for disaster, even if it didn't come with continuous exfiltration of sensitive data from those devices, and that disaster has finally struck. you, by which i mean secops, have done more damage yesterday than all the attackers in history, and the nature of the disaster demonstrates that it could have been enormously worse — and will be next time. i don't care what makes your job easier. you shouldn't be doing that job at all. it's a worthless, destructive job that makes everyone less secure

this is not monday-morning quarterbacking on my part, either; i happen to have posted this comment 7 days ago (6 days before the disaster) lamenting infosec spending "going to useless or counterproductive measures like crowdstrike" https://news.ycombinator.com/item?id=40951278


> i can't believe i'm writing a reply to a guy whose name is broken spanish for 'brando, the little fuck'. (literally. there isn't an alternative interpretation.)

Oh cool, I had no idea.

Now to your point: have you ever worked in cybersecurity in large, complex and heterogeneous companies? And do you have an idea about what is required to keep such a company secure within the reality of its business?

If yes - congratulations, you should be nominated as CISO of the century. I would seriously like to learn from you.

If not - well, I can give you a lot of advice about football (which I do not follow, but I am a great consultant) or DIY (which I am bad at, but I am ready to build an electrical and plumbing installation because I saw it on YouTube and, frankly, it looks easy).

We can troll our way indefinitely - but I have a lot of work related to this CS disaster. I am super glad for CS, though, for the time and stress it saved me in the past.


You’re not going to get through the pitchfork mobs; silent reservation is the only winning move for us while the masses use this as a moment to point and say “see, finally, I told you _this_ would happen.”


I know, but this is infuriating.

I am the first to point to dumb decisions, usually Gartner-based, to buy the "top right quadrant" software when much better is available. However, using the better options would require knowledge and understanding of your needs, and not being backed by an army of consultants.

And then there is the useful software to buy. A good EDR is one of them. There is nothing comparable on the open-source market (which I use a lot, supporting the developers through whatever they put in place), and CS is very good. They fucked up greatly this time, but it is fortunately a reversible fuck-up - not the one where you need to reinstall or you lose data (the latter is less certain, but at least for end users this is not the case).

People who are bigots of one of the extremes ("open source is not professional software" or "commercial software is a scam") do not work in each other's worlds. I can be proud of my home lab (I am), but it will not scale to the 500,000 devices I deal with at work. I know that what I am doing makes sense on my laptop, but this is far, far from being the case with the 60,000 users who do stupid things.

One day in the world of the "enemy" would change a lot of minds.


Three problems:

1. Regulatory capture is not something I want to see happen to IT. The big companies love making rules knowing smaller organizations can't fulfill them.

2. Just because you make some new legally binding principles, doesn't mean that almost any source code in existence will magically meet it, or can ever meet it. That code is not getting rewritten overnight, or possibly, ever. This would make regulatory capture worse than normal, because sloppy code that's impossible to rewrite would be grandfathered in, and only the newer, more innovative, smaller companies would be held to the standard.

Just take Google - they have over 2 billion lines of code. Just attempting to touch it is more dangerous than letting sleeping dogs lie. Sucks if you're a competitor - writing your own 2 billion lines is going to be almost impossibly harder.

3. In which case, you've pulled up the ladder while doing very little to prevent a recurrence. Countries that are not the US will gain the edge in innovation, research, and development.

Also, edit:

4. The only reason everyone used Crowdstrike is because everyone's trying to do a CYA (Cover Your Ass) move, which is much easier to outsource to someone else. Regulation would further increase this tendency, which is the opposite of what we need right now.


Author has it exactly backwards. Crowdstrike is installed on a huge number of corporate machines because of rules that have been imposed from above. Probably the #1 reason IT departments install the stuff is because it automates compliance for various cookie-cutter security standards.

So now, instead of having to think through your threat posture and be smart, you can deploy your magical unicorn AI software and have safe harbor ("I followed the industry best-practices!"). Just ignore the fact that said software is exporting a big juicy attack vector, wherein a third-party randomly dumps kernel-level updates on you, without your oversight, testing or permission.

What could go wrong?


You seem to have missed the message. I am saying that we should impose rules on ourselves so that we don't have rules imposed on us from above.

Rules imposed from above and rules we set ourselves should be galaxies apart in how well they work.


Who is going to set these rules we impose on ourselves?


Us, if we are smart.

The things I linked to are ideas. Others will have ideas too. Let's come to a consensus about them.


Physicians were not smart. They had licensing and liability imposed on them after quacks killed too many people.

Civil engineers were not smart. They had licensing and liability imposed on them after bridges fell, dams burst, buildings collapsed, killing many people.

Software developers will not be smart either. They will cry about not wanting liability and make accusations of gatekeeping until something bad enough happens (Crowdstrike isn't it; not enough people died as a direct consequence) and licensing and liability is forced on developers.

And, to echo sentiments voiced by other replies, the anti-government hyperbole is more kooky than persuasive.


Thank you for your comment. I agree that software developers won't be smart.

> And, to echo sentiments voiced by other replies, the anti-government hyperbole is more kooky than persuasive.

I am serious about it, not hyperbolic, but I plan to leave it out of future writing about this.


Requiring Crowdstrike or similar in order to do business isn't a requirement from government; it comes from exactly this kind of industry consensus, where business, insurance, etc., get consulting advice on what best practices exist and should be required.

So we have to keep doing precisely what we've already done, even though it isn't working?


I didn't miss the message. Who is "us", exactly? What does that look like, in practice, if not something like SOC2 or ISO 27000?

I mean...maybe you're suggesting that Crowdstrike should have followed some as-yet-unknown standard for deploying software. But if so, it's the same problem -- we'll have butt-covering compliance checklists for software development now, applied thoughtlessly by consultants promising safe harbor. Turtles all the way down.


It’s hard to address this submission, as it is so hyperbolic that it’s difficult to pick apart. It has a strong Second Amendment vibe, where the choice is between self-regulation and a future armed stand-off with the government. This is a false dichotomy.

There are already other, non-regulatory, systems in place to deal with issues like this. Market forces and lawsuits spring to mind immediately.


> Market forces and lawsuits spring to mind immediately.

Those things don't seem to be working.

And it isn't a future armed standoff that I am warning about. It is a future where any protest is impossible.


> Those things don't seem to be working.

You're assuming two things:

1. Regulation will decrease the likelihood of a failure (debatable - Boeing is certainly heavily regulated)

2. Crowdstrike will survive unscathed, when this could very well be a long-term corporate extinction event.

On that note, let's say you discover a small security bug. If your small security fix takes 7 layers of auditing and 20 lawyers to ship under your guild rules, you're not going to bother. Multiply this by hundreds of little bugs that are "out of scope" or "too small to worry about", and you're Microsoft right now: a national security hazard, with no idea what combination of little bugs got you here.


The most competent and trustworthy developers and admins I know have impressive home labs and portfolios of contributions to libre software.

The most incompetent and untrustworthy developers and admins have resumes full of certifications.

Any professional certification you create will be a target for "gaming" by people who view it as a ticket to a sweet gig.

Trust in "professionals" has almost totally collapsed in the U.S. anyway. Doctors and lawyers no longer have the trust of the public. I don't see how professional societies and ethics review boards have helped them be more trusted by the average citizen.


I know China essentially nationalizes successful businesses, but do they impose laws on devs yet in the way this post warns about?


Not as far as I can tell. You're right that China does seem to be in a fairly unique position to actually do this though - it is interesting to consider what would happen if they did. Looking at recent engineering failures at Boeing, Tesla, Volkswagen etc., such enforcement of engineering standards could even give them a competitive advantage on the world stage in a way that markets captured by lobbyists could never realistically achieve.


No. Based on everything else that goes on in the world, I'm expecting absolutely zero to come from this. Also you don't really offer a solution or anything. Just kind of a "Let me write something topical with some keywords then share it to HN to soak up some clicks"


You didn't follow any of the links in the post?


As long as people do not understand IT, and most still do not, some will profit from the ignorance of the many. The sole cure is culture; laws come after, not before.

The cure for current IT social issues is FLOSS and desktop-centered computing, but it's unachievable without a mass understanding of the role of IT. Even if all modern cars stopped suddenly and all payment systems collapsed for a week, people would not learn, because they do not want to.

I suspect any "civilization" can only change after major disruptions like a war - and not through "the old ones", but through new generations born after the war. There will always be some who foresee the need for radical change, the opportunity of a revolution, and so on, but they regularly remain too few to steer the society...


No. Cheapskate corporations don't assume responsibility for security. In fact, the CEO in a quarterly report may cite an "IT glitch" as the cause of a loss, as if that's not his responsibility too. A deplorable state of affairs. They simply do not really give a sh*t about customers - their very livelihood. And the sheep accept the abuse.


Well, we are not a guild, and software devs naturally hate the guild mindset, so that's extremely difficult to impose on ourselves.

Maybe insurance companies and lawsuits are the next good answer. If people start mass-suing company X, the lawsuit costs should persuade the owners to at least do something a bit different.


The problem is too much industry consolidation into Crowdstrike as forced and regulated by the government onto numerous sectors. The answer is more diversification and self-hosting, also fewer centrally managed global services.


The medical discipline does not get to exist separate from government. Self-organizing first does not prevent the fact that doctors in Texas are now deathly afraid of discussing abortion.


It's not a lack of professionalism that caused this. It's the abandoning of MULTICS and all that stems from that premature optimization.

We've been nuts to run any systems based on Ambient Authority outside of a 1980s college classroom environment. Connecting everything via the Internet deranged us into a hellscape just waiting to emerge, as it does periodically.

You can't enumerate badness, as virus scanners try to do. Thus you have to TRUST all the code you run, which is nuts. Operating Systems should NEVER require that you trust code, especially mobile code (stuff that came from outside your 1980s college classroom).

We have fuses and circuit breakers, and all sorts of ways to limit the damage a short circuit can do, but no such analogs for limiting the scope of damage a chunk of code can do to the world. This is insane.

It's not the users, system admins, Microsoft, or anyone else. It's the security model underlying the OS. This problem was solved in the 1970s, for f*ck's sake!


> We have fuses and circuit breakers, and all sorts of ways to limit the damage a short circuit can do, but no such analogs for limiting the scope of damage a chunk of code can do to the world. This is insane.

We have the exact thing for that. They're called virtual machines and containers where you have absolute control over the networking and disk access. You practically have to be a software developer to use them, but they exist. Most of us exchange convenience for security, however, so here we are.
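There is an even smaller-scale version of the same idea: POSIX systems already ship crude per-process "fuses" in the form of kernel-enforced resource limits. A minimal sketch (assuming a Unix-like OS; `run_with_fuse` and the sample functions are made-up names for illustration) using Python's stdlib `resource` module:

```python
import os
import resource

def run_with_fuse(fn, cpu_seconds=1):
    """Run fn in a forked child whose CPU time is capped, like a fuse.

    Returns True if fn finished within its CPU budget, False if the
    kernel "blew the fuse" and killed it (or fn otherwise failed).
    """
    pid = os.fork()
    if pid == 0:
        # Child: arm the fuse, then run the untrusted code.
        # Soft limit triggers SIGXCPU; the hard limit, a second later, SIGKILL.
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds + 1))
        try:
            fn()
            os._exit(0)
        except BaseException:
            os._exit(1)
    # Parent: wait and inspect how the child ended.
    _, status = os.waitpid(pid, 0)
    return os.WIFEXITED(status) and os.WEXITSTATUS(status) == 0

def well_behaved():
    sum(range(1000))   # finishes instantly, well within budget

def runaway():
    while True:        # burns CPU forever; the kernel will kill it
        pass
```

This caps only CPU time - a real sandbox also has to confine network, filesystem, and memory access (namespaces, seccomp, or a full VM, as the comment above says) - but it shows the circuit-breaker shape: the limit is enforced by the kernel, not by the untrusted code's good behavior.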


Virtual machines are the equivalent of plugging each appliance into its own generator. It's a crude approximation of what should be trivial. It doesn't have to be this way. We don't have to trust anything outside of the microkernel of the OS.


I doubt that the CrowdStrike debacle will (or should) move any regulatory needles, but I do know it should be an instant extinction event for CrowdStrike as a company--and any company that remains stupid enough to keep the CrowdStrike malware installed deserves the same fate.

[Edit:] I have no financial (or other) interest in CrowdStrike or any related product or vendor. My point here is that by failing to practice the most fundamental QA in their release process, CrowdStrike directly caused a global IT infrastructure meltdown--which is, of course, exactly what their product is supposed to prevent.

The "punishment" needs to be extinction. Only when that happens will other firms like this, along with the brain-dead lemmings who make most enterprise IT purchasing decisions, start acting responsibly.


Ok, but the commercial aviation industry already has massive government regulation and oversight, and yet: Boeing. Something else is wrong, and I don't even think it is something that government can fix.

Add: I think this is related to the failure of the USSR. And by definition that was government everywhere.


Under OP's proposal, the IT version of Boeing would be putting out safer planes/safer software, because the quality engineers would lose their certification for knowingly doing bad work, or illegally bad work that might cause harm. There are more than a few whistleblowers willing to speak up. There's lots of red tape to get there.


But isn't this already the case? At some point, Comrade, we have to realize that maybe the Soviet system doesn't need an improvement but actually needs to be destroyed.


Irrespective of this incident, all companies, whether US based or others, should never use anything like Crowdstrike or Cloudflare or Google or any other that exfiltrates all their data to such a centralized player, which in turn can then forward it all to the NSA.


A person who says "Us as an industry" and refers to Crowdstrike is not talking to _us_, he's talking to high-flying corporate operators and stock owners.


Author here.

I am talking to both. Programmers should set the rules and impose them on their employers, or the "tech industrialists" won't accept them.


When we have a social revolution and workers control the production of goods and services, then we will set the rules. Unfortunately, for now, we are forced to bear the rules imposed on us. Like having malware like CrowdStrike software installed on our computers, making them slower and sometimes dysfunctional.

But now that you mention it... if software companies were more densely unionized in more of the developed world, especially in the US; and if those would be independent and membership-controlled unions, more like the IWW and less like UAW, SEIU and such - then maybe that would have been more realistically possible and not in some far away future.


Industry-led regulation has never worked in the entire history of the world.


Industry-led content regulation has been very effective at staving off legal regulation in the United States. The MPA rating system and ESRB are very imperfect but nonetheless are success stories.


Okay, I'll give you that. Let me rephrase: Industry-led regulation can work, in industries that won't kill anyone or even realistically lose anyone a bunch of money for non-compliance, if-and-only-if government regulation exists as a real and powerful enough alternative to make it stick.


Yes, but if we already have regulations, governments may be lazy enough to simply put force of law behind our regulations.


I'm sure the news media will frighten everyone into having the governments seize more power.


I feel like these types of failures are by design more than ignorant negligence.

How else do you wield control and influence?

The next one will be a we-can't-withdraw-money-from-ATM-or-teller level of failure.

If you are paid 7 digits a year to ignore problems and not do your job, isn't that essentially being bought?



