
We actually spent some time making sure that we weren't going to run into problems with browsers. However, as the OP points out, because LE had a cross-signature from an existing CA, browsers didn't have to take any positive action to make LE certificates work. This was absolutely essential to getting things off the ground.

Oh, I know you all did, and I remember the cross-signing. I worried that you'd get slapped down somehow, that the crappy cert companies would find a way to stop/reverse it, that the project would fizzle out, etc. I thought it was cool as hell, but it seemed like something so clearly good couldn't stay good. Instead, you all have only gotten better over time.

In what way do MITM certificates "improve Internet usability for their citizens"?

I just explained that. Basically, the government wants to block some specific webpage, say https://en.wikipedia.org/wiki/Nursultan_Nazarbayev. Without MITM, they'll end up blocking the entire en.wikipedia.org domain, so citizens will lose access to a lot of information. With MITM, they'll be able to target precisely one page, and I can read any other Wikipedia article without issues.

And with MITM they can read literally all of your private internet traffic… That seems like a significantly worse tradeoff than just using a VPN to browse Wikipedia.

A lot of this is covered in the Let's Encrypt retrospective paper from 2019: https://www.abetterinternet.org/documents/letsencryptCCS2019....

From Section 3.1.

"Let’s Encrypt was created through the merging of two simultaneous efforts to build a fully automated certificate authority. In 2012, a group led by Alex Halderman at the University of Michigan and Peter Eckersley at EFF was developing a protocol for automatically issuing and renewing certificates. Simultaneously, a team at Mozilla led by Josh Aas and Eric Rescorla was working on creating a free and automated certificate authority. The groups learned of each other’s efforts and joined forces in May 2013.

...

Initially, ISRG had no full-time staff. Richard Barnes of Mozilla, Jacob Hoffman-Andrews of EFF, and Jeff Hodges (under contract with ISRG) began developing Let’s Encrypt’s CA software stack. Josh Aas and J.C. Jones, both with Mozilla at the time, led infrastructure development with assistance from Cisco and IdenTrust engineers. ISRG’s first full-time employee, Dan Jeffery, joined in April 2015 to help prepare the CA’s infrastructure for launch. Simultaneously, James Kasten, Peter Eckersley, and Seth Schoen worked on the initial ACME client (which would eventually become Certbot) while at the University of Michigan and EFF. Kevin Dick of Right Side Capital Management, John Hou of Hou & Villery, and Josh Aas constituted the team responsible for completing a trusted root partnership deal and signing initial sponsors."


What's the incentive for individual sites or browsers to do this?

From the site's perspective, they're going to need to have a WebPKI certificate for the foreseeable future, basically until there is no appreciable population of WebPKI-only clients, which is years in the future. So DANE is strictly more work.

From the browser's perspective, very few sites actually support DANE, and the current situation is satisfactory, so why go to any additional effort?

In order for technologies to get wide deployment, they usually need to be valuable to individual ecosystem actors at the margin, i.e., they have to get value by deploying them today. Even stipulating that an eventual DANE-only system is better, it doesn't provide any benefit in the near term, so it's very hard to get deployment.
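If you want a concrete sense of what "supporting DANE" even means, here's a rough sketch (using the third-party dnspython library; the hostname is just a placeholder) of the lookup a DANE-aware client would do: ask for a TLSA record at _port._proto.hostname and see whether anything comes back:

  # Rough sketch of a DANE TLSA lookup, not a full validator.
  # Requires the dnspython package; "example.com" is a placeholder.
  import dns.resolver

  def has_tlsa(host, port=443):
      qname = f"_{port}._tcp.{host}"
      try:
          return len(dns.resolver.resolve(qname, "TLSA")) > 0
      except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer, dns.resolver.NoNameservers):
          return False

  print(has_tlsa("example.com"))

A real client would also have to validate the DNSSEC chain and match the TLSA data against the server's certificate, which is exactly the extra work that sites and browsers currently have little incentive to take on.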


A fun note: I vibecoded a dumb thingy that monitors the top 1000 zones on the Tranco research list of popular zones for DNSSEC status:

https://dnssecmenot.fly.dev/

Obviously, the headline is that just 2% of the top 100 zones are signed (thanks to Cloudflare). But the funnier thing is: in 5+ months of letting this thing run, it's picked up just three changes to DNSSEC status among all the zones it monitors. The third happened just an hour or so ago, when Canva disabled DNSSEC.
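The underlying check is simple enough to sketch in a few lines. This isn't the actual code behind that site, just a rough illustration with the dnspython library: a zone counts as signed (from the parent's perspective) if a DS record exists for it:

  # Rough sketch: does the parent zone publish a DS record for this zone?
  # Requires the dnspython package; the zone names are placeholders.
  import dns.resolver

  def is_signed(zone):
      try:
          return len(dns.resolver.resolve(zone, "DS")) > 0
      except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN, dns.resolver.NoNameservers):
          return False

  for zone in ["example.com", "example.org"]:
      print(zone, is_signed(zone))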


DANE could've worked as an alternative to LetsEncrypt, if all CAs had refused to cross-sign it and essentially killed it for years until everyone's cert stores had caught up.

This isn't correct.

There are two authentication properties that one might be interested in:

1. The binding of some real-world identity (e.g., "Google") to the domain name ("google.com").

2. The binding of the domain name to a concrete Web site/connection.

The WebPKI is responsible for the second of these but not the first, and ensures that once you have the correct domain name, you are talking to the right site. This still leaves you with the problem of determining the right domain name, but there are other mechanisms for that. For example, you might search for the company name (though of course the search engines aren't perfect), or you might be given a link to click on (in which case you don't need to know the binding).

Yes, it is useful to know the real-world identity of some site, but the problem is that real-world identity is not a very well-defined technical concept: names are often not unique, but instead are scoped geographically, by industry sector, etc. This was one of the reasons why EV certificates didn't really work well.

Obviously, this isn't a perfect situation, but the real world is complicated and it significantly reduces the attack surface.


Nothing mentioned will help for a website with a Let's Encrypt SSL cert. How can I know with confidence that I can conduct commerce with this website that purports to be the company and it's not a typosquatter from North Korea? A Google search doesn't cut it. Nothing in this thread has answered that basic question.

It's a non-issue for DigiCert and Sectigo certs. I can click on the certs and see for myself that they're genuine.


No you can't. Even during the EV years, clowning an EV cert was more like a casual stunt for researchers than an actual disclosable event. In reality, there's nothing DigiCert is meaningfully doing to assure you about "conducting commerce" on sites.

Worse than typosquatting is EV’s problem that anyone can register a corporation with an identical name.

https://web.archive.org/web/20171211181630/https://stripe.ia...


I think it is working as intended.

Registering a corporation often means it is linked to a real-life, government-issued ID.

If you commit scams or fraud on that web site, they know where to find you.

... unless, of course, the CA ain't doing the verification...


> It's a non-issue for DigiCert and Sectigo certs. I can click on the certs and see for myself that they're genuine.

You can see for yourself that a Let's Encrypt certificate is genuine too.
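For example, here's a rough sketch (standard-library Python; the hostname is a placeholder) of pulling the certificate a site presents and inspecting it yourself, regardless of which CA issued it:

  # Rough sketch: fetch and print the certificate a site presents.
  # Standard library only; "example.com" is a placeholder hostname.
  import socket, ssl

  host = "example.com"
  ctx = ssl.create_default_context()  # chain is validated against the system trust store
  with socket.create_connection((host, 443)) as sock:
      with ctx.wrap_socket(sock, server_hostname=host) as tls:
          cert = tls.getpeercert()

  print(cert["issuer"])    # who signed it (e.g., Let's Encrypt, DigiCert, Sectigo)
  print(cert["subject"])   # which name it was issued for
  print(cert["notAfter"])  # when it expires

If the handshake succeeds, the chain validated against your trust store; the issuer field just tells you which CA signed it, not whether the business behind the site is trustworthy.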


The order of events is a bit more complicated than this.

Google initially proposed restricting powerful features to secure origins back in February of 2015 (https://web.archive.org/web/20150125103531/https://www.chrom...) and Mozilla proposed requiring secure origins for all new features in April of 2015 (https://blog.mozilla.org/security/2015/04/30/deprecating-non...). Let's Encrypt issued its first certificate in September of 2015.

This isn't to say that these two things are unrelated: Mozilla obviously knew about Let's Encrypt and we considered it an important complement for this kind of policy, and at least some people at Chrome knew about LE, though I'm not sure how it played into their thinking. However, it's not as simple as "LE happened and then people started pushing for secure origins for new features".


I'd also argue, very necessary.

A lot of the new APIs have to do with accessing hardware: camera, microphone, serial ports (currently experimental), etc.

Given how easy it is for a MITM attack to inject JavaScript or HTML into insecure pages, a world where insecure pages had access to hardware would make that hardware very vulnerable.

Even though all you'd be doing is reading some random blog etc.

To those who still think serving HTTP is some sort of principled stand, just be aware that injecting malware onto your page at delivery time is pretty trivial. Quite honestly, and I mean this in a constructive way, it doesn't signal "principles" it signals "incompetence".


It's never been clear to me what the rationale for OV was, as the UI wasn't even different like EV was.

Can you elaborate a bit about what you mean by "the blessing of a CA"?

I agree that it's true that you need a certificate to do TLS, but importantly Let's Encrypt isn't interested in what you do with your certificate, just that you actually control the domain name. See: https://letsencrypt.org/2015/10/29/phishing-and-malware.html
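To make "you actually control the domain name" concrete: in the common HTTP-01 case, the CA gives your ACME client a token, and then checks that a matching key authorization is being served from your web server at a well-known path. Here is a very rough sketch of the responder side (the token and key-authorization values are placeholders, and in practice a client like Certbot handles this exchange for you):

  # Rough sketch of an HTTP-01 challenge responder.
  # TOKEN and KEY_AUTHORIZATION are placeholders supplied by the ACME flow;
  # a real client (e.g., Certbot) does all of this for you.
  from http.server import BaseHTTPRequestHandler, HTTPServer

  TOKEN = "placeholder-token"
  KEY_AUTHORIZATION = "placeholder-token.placeholder-account-thumbprint"

  class ChallengeHandler(BaseHTTPRequestHandler):
      def do_GET(self):
          if self.path == f"/.well-known/acme-challenge/{TOKEN}":
              self.send_response(200)
              self.send_header("Content-Type", "text/plain")
              self.end_headers()
              self.wfile.write(KEY_AUTHORIZATION.encode())
          else:
              self.send_response(404)
              self.end_headers()

  # Port 80 needs elevated privileges; run as root or forward the port.
  HTTPServer(("", 80), ChallengeHandler).serve_forever()

Nothing in that exchange looks at what the site actually hosts, which is the point of the policy in that post.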


Their policy today is to grant certificates liberally. There is no technical guarantee that this remains the case indefinitely, only a political one. I don't doubt the sincerity of this guarantee, but I wish I didn't have to rely on it.

A big factor is that they are serving so many certs, with only a tiny amount of funding. Anything beyond the most basic pre-written list of blocked domain names is infeasible. Analyzing the content of every single domain would increase their resource needs by several orders of magnitude. That's reasonably close to a technical guarantee, if you ask me.

> That's reasonably close to a technical guarantee, if you ask me.

Until the feds show up like:

  Okay, either you block these domains, or you're going to jail:
  politician-x-did-something-bad.com
  politician-y-is-corrupt.com
  country-z-did-crimes-against-humanity.com
  political-opposition-party-w-homepage.com
  blog-that-mentions-any-of-the-above.com
  ... (rest of the list that works for 10 or 100'000 domains)
I complained elsewhere about the centralization that reminds me of Cloudflare, but in general the more distributed this sort of infra is, the better, both for technical reasons and political ones. In general, one can plan around potential risks like "Okay, what if I assume that this infra of mine is actually running in Russia and the govt hates me and I need to migrate."

VPSes and domains are pretty easy to move across country borders (e.g. moving from NameCheap to INWX and from something like AWS to Hetzner, at least for simple setups), less so when you don't control the CA.


Yes, but that's still a pre-defined list. They can't say "block every website mentioning politician x doing bad things from getting a cert", because that'd be impossible to validate.

The feds are left playing whack-a-mole, and getting the right paperwork to block each new domain popping up is probably going to take a few weeks. Besides, at that point they could also force the .com operator to do the same, could they not?

I do agree that it would be better if LE was more distributed, though. Having a legally-independent second nonprofit running the same software in Switzerland or something would prevent LE from turning into a massive target for the US government.


Why would the feds bother with Let's Encrypt in this situation when it would make way more sense to just go to ICANN and get the domain names unregistered? They already do that all the time.

I agree that technical guarantees are better than policy guarantees.

> It is however a document scoped so it cannot be expanded to include either of those things. Work to define interoperable use of other algorithms, including hybrid algorithms, would be in other documents.

FYI, the specification for hybrid MLKEM + ECC is ahead of this document in the publication process. https://datatracker.ietf.org/doc/draft-ietf-tls-ecdhe-mlkem/


The situation is actually somewhat the opposite here: the code points for these algorithms have already been assigned (go to https://www.iana.org/assignments/tls-parameters/tls-paramete... and search for draft-connolly-tls-mlkem-key-agreement-05) and Chrome, at least, has it implemented behind a flag (https://mailarchive.ietf.org/arch/msg/tls/_fCHTJifii3ycIJIDw...).

The question at hand is whether the IETF will publish an Informational (i.e., non-standard) document defining pure-MLKEM in TLS or whether people will have to read the Internet-Draft currently associated with the code point.

