xeeeeeeeeeeenu 2 days ago

It always fascinates me when this happens. Don't the CAs understand that the browser vendors can and will kill their business if they don't comply with the rules? It's not like a fine that can be ignored.

How dysfunctional does a company have to be to let this happen?

  • crote 2 days ago

    They genuinely believe they are "too big to fail". They've got thousands of employees, they've been around for 30 years, they are a critical part of public infrastructure: surely something as trivial as a few weirdos in a mailing list couldn't instantly kill their entire business?

    Stuff like this happens when upper management has zero clue about the business they are in. They believe they are in the business of selling certificates, while in reality they are in the business of selling trust. They treat things like the CA/B Forum and the various Root Programs more like an optional networking event than the combination of judge, jury, and executioner that they actually are - with a completely predictable outcome.

  • skrebbel 2 days ago

    I’m really impressed. CAs are 100% rent-seeking businesses and their position is solely derived from having convinced browser and OS vendors to put them in a list. You’d assume their top prio would be to stay in the list.

  • vbezhenar 2 days ago

    I saw a company killed by a failed backup system. One unfortunate hardware failure, bad backups, and the company's service goes offline with no way to recover in a timely manner. Big clients require big compensation, and the company goes bankrupt. One shell script put in a crontab could have prevented that, but nobody cared enough. It was not a big company, though. But the consequences of one simple oversight were dire.

    • ethbr1 a day ago

      Change Healthcare had a 4-month outage (Feb - June) and, furthermore, didn't have functional fallback plans in place.

      Which means their business continuity planning was bullshit.

      The good news is this caused companies in the healthcare space (at least provider, facility, and insurer sides) to start asking more pointed BCP questions to their SaaS vendors.

  • ethbr1 a day ago

    > How dysfunctional does a company have to be to let this happen?

    Surprised no one pointed to the nature of the business as a source of this behavior.

    In a non-innovative, compliance-based industry, you make money by cutting costs.

    This affects the entire business, as you end up with managers who are effective at cutting costs and architects/engineers who will work for a lower salary.

    Multiply that over enough years, and we know where it leads...

    • ethbr1 a day ago

      Thought this reply from 3 months ago was prescient, re: options in response to a previous Entrust failure to revoke issue. https://bugzilla.mozilla.org/show_bug.cgi?id=1890898#c21

      >> I see three possible outcomes:

      >> 1. The root programs continue to be lenient with Entrust indefinitely. Nothing changes.

      >> 2. The root programs continue to be lenient with Entrust for a while, but eventually the mistakes pile up enough that one of the root programs pushes for distrust.

      >> 3. The root programs immediately stop being lenient with Entrust. Entrust is forced to make internal changes to remain a CA.

      It raises an interesting point about what constitutes a historical pattern of behavior, sufficient for inferring future deficiencies reliably enough to take present action.

      Here, the motivation seemed to be that (a) enough history had accumulated to estimate Entrust's rate of process improvement & (b) that rate was deemed insufficient. Which seems a decent metric: if perfection is not presently achieved, then remediation progress needs to be seen.

  • viraptor a day ago

    We've seen the CEO of a CA arguing in a public forum that their 3 month trial is better than Let's Encrypt. Yes, those can be dysfunctional and can be led by people who have little idea about the business.

  • quitit a day ago

    I'm guessing very (there are plenty of people on reddit who used to work there and have said as much).

    Here's the email their CEO/President sent to everyone that uses them:

    Google Chrome announced yesterday that specific public roots used to issue public certificates by Entrust will no longer be trusted by default after October 31, 2024. This decision comes as a disappointment to us as a long-term member of the CA/Browser Forum community.

    To address your concerns, there have been no security implications to the events that led to this distrust event, and you can be assured that your certificates are secure. I also want to assure you that Entrust can and will be able to serve your digital certificate needs now and in the future. And, our ability to do this extends beyond the public roots covered in Google’s decision.

    Additionally, there is no impact on our private certificate offerings – including our PKI, PKI as a Service, and managed PKI – nor our code signing, digital signing, and email (S/MIME and VMC) offerings.

    While the announcement is disappointing, Entrust has been in the public and private digital certificate business for over 25 years and we continue to bring that expertise and capability to your use cases every day. It is our hope that you will allow us to continue to serve your needs and we stand ready to answer any questions you have regarding your ongoing needs.

    Sincerely,

    Todd Wilkinson President & CEO

    ---

    My personal take: I don't see why any of their customers (such as ey.com) would want to split their CA needs across multiple suppliers.

  • 1oooqooq 2 days ago

    All you need is one single [cc]?.gov as your client and you are in business forever.

    • lokar a day ago

      How? If you want to sell public certs you need Google (and apple and Microsoft) to grant permission.

      Private certs are not that big a business.

      • 1oooqooq a day ago

        Once you've sold to a .gov, any discussion about no longer supporting your root means denying users access to that .gov service.

        • e63f67dd-065b 20 hours ago

          Many government websites use Entrust, and that didn’t stop this from happening. So I don’t think that this is a good theory.

        • lokar a day ago

          CA roots are not tied to TLDs.

          • devrand a day ago

            They’re saying that once you’ve sold certs to governments, distrusting that root will deny people access to government resources. They’re merely using “.gov” as a proxy for “some government”.

            Also roots can be TLD constrained, typically to ccTLD(s).

            • lokar 15 hours ago

              But they are very carefully not breaking anyone. If you have an Entrust cert it will keep working; you can even renew it with them.

      • tialaramex a day ago

        None of this is a "big business". I think thirty years ago there was probably a perception that it could be a big business (it's potentially a license to print money), but sufficient incompetence got them here instead. Look at ISRG's volumes: that's the potential volume available in the Web PKI, but that's at $0, and we know the resistance to even low prices is fierce.

        If you asked people from the for-profit CAs about Let's Encrypt before it launched, the impression you'd get was that they're issuing a lot more certificates and this doesn't matter. Millions per day? Ha, we'd barely notice. That was all bluster, they were never doing that.

        I think Apple probably had the best shot at turning this into free money. Apple's customers are very willing to pay more than something appears to cost, on the basis that it's Apple so it's worth it. I think you'd struggle a lot more to undercut a $10 Apple PKI product with a free offering that's identical, because Apple's customers are used to justifying why they spent more money on the same thing with the logo on it, and they are able to be completely irrational about it and it's OK - a brand rep would look unhinged violently attacking people who point out that it's bullshit, but loyal fans will get understanding or even praise.

        I actually thought, about 10-15 years ago, that Apple was about to do this, but they didn't, and once Let's Encrypt happened there was really no room. Apple does still make money off some places where they're the sole issuer and get to charge arbitrarily for doing nothing, but not like they would have if they'd seized the entire Web PKI.

  • darkhorn 2 days ago

    Software developers say there are issues in the software that need to be fixed or updated. Management, who have never seen one line of code in their lives, say "no, make new features". And then the software starts to fail.

mikeiz404 2 days ago

> This approach attempts to minimize disruption to existing subscribers using a recently announced Chrome feature to remove default trust based on the SCTs (signed certificate timestamps) in certificates.

I was wondering how Chrome was able to distrust certificates based on time without trusting the CA not to backdate them, and it looks like this works because Chrome can trust the certificate transparency logs instead. That is where the signed certificate timestamps (SCTs) come from.

See also https://certificate.transparency.dev/howctworks/
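
For the curious, here's a rough (untested) sketch of reading those embedded SCT timestamps with the Python "cryptography" package; the file name and cutoff date are placeholders, and this only approximates the idea, not Chrome's exact rule:

    from datetime import datetime, timezone

    from cryptography import x509

    CUTOFF = datetime(2024, 11, 1, tzinfo=timezone.utc)  # illustrative cutoff only

    # Load a certificate from a placeholder PEM file.
    with open("cert.pem", "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())

    try:
        scts = cert.extensions.get_extension_for_class(
            x509.PrecertificateSignedCertificateTimestamps
        ).value
    except x509.ExtensionNotFound:
        scts = []

    for sct in scts:
        # sct.timestamp is asserted and signed by the CT log, not the CA,
        # which is why it's harder for a CA to backdate its way around a cutoff.
        ts = sct.timestamp.replace(tzinfo=timezone.utc)
        print(sct.log_id.hex(), ts, "before cutoff?", ts < CUTOFF)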

  • tgsovlerkhgsel 2 days ago

    Even before CT, policies generally trusted the CA to not backdate. Of course a CA could try that, but if caught, all certificates, including previous ones, would be treated as invalid, so it worked.

  • olliej 2 days ago

    This kind of thing was one of the reasons CT was introduced :D

    • agwa 2 days ago

      CT was introduced to detect misissued certificates, but it was never intended to be used as a timestamping system like this. The point of the SCT timestamp was to start the clock on the deadline for the log to publish the certificate, not for use in trust decisions. So when Chrome announced they were using SCT timestamps for trust decisions, my first question[1] was whether anyone was auditing CT logs to detect backdated timestamps. Since then, I have added timestamp auditing to my monitor and should be able to detect a CT log backdating an SCT timestamp.

      [1] https://groups.google.com/a/ccadb.org/g/public/c/wRs-zec8w7k...
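
      For anyone wondering what the raw material for that auditing looks like: RFC 6962 logs expose get-sth and get-entries endpoints, and the timestamp the log asserted is embedded in each leaf, so a monitor can record when it first observed an entry and flag leaves whose asserted timestamp looks implausibly old. A rough, untested sketch (the log URL is a placeholder):

          import base64
          import json
          import struct
          import urllib.request

          LOG = "https://ct.example-log.invalid"  # placeholder RFC 6962 log base URL

          def get_json(path):
              with urllib.request.urlopen(LOG + path) as resp:
                  return json.load(resp)

          # Current signed tree head, then the last 32 entries in the log.
          sth = get_json("/ct/v1/get-sth")
          end = sth["tree_size"] - 1
          start = max(0, end - 31)
          entries = get_json(f"/ct/v1/get-entries?start={start}&end={end}")["entries"]

          for index, entry in enumerate(entries, start):
              leaf = base64.b64decode(entry["leaf_input"])
              # MerkleTreeLeaf: version (1 byte), leaf_type (1 byte), then a
              # TimestampedEntry that starts with a uint64 timestamp in ms.
              (ts_ms,) = struct.unpack(">Q", leaf[2:10])
              # A monitor would compare ts_ms against when it first saw this index.
              print(index, ts_ms)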

      • olliej 2 days ago

        Sorry I think my phrasing was not great.

        The goal of CT was not to make staged distrust of a CA possible (that's just a happy coincidence). It was to make it possible to detect CAs misissuing certificates, which includes CAs back dating certificates.

1oooqooq 2 days ago

All of the Google root security team's due-diligence emails are just a list of links to Firefox's Bugzilla, where the issues were documented and followed up on.

https://groups.google.com/a/ccadb.org/g/public/c/29CRLOPM6OM...

  • tialaramex 2 days ago

    In practice the Web PKI is overseen by the general public, via Mozilla's m.d.s.policy. It makes no sense for the proprietary vendors, including Google, to insist on doing something themselves badly when Mozilla is the obvious host for this work.

    The older vendors are even less able to be properly open with their customers (let alone the general public) than Google. At Apple it's probably a firing offence to even confirm obvious decisions - it seemingly took months to get Apple's chosen representative to confirm that Apple's new 398 day rule was an issuance requirement, rather than just something where Apple wouldn't trust longer lived certs in Safari.

    • 1oooqooq 2 days ago

      None of what you list is a good excuse for anything. I fail to see the point. Is it that marketing trumps technical know-how, and that should be OK?

  • agwa 2 days ago

    Representatives from the Chrome and Apple root programs participate in the Bugzilla discussions in an official capacity. But yes, there is significant help from the community in uncovering evidence and grilling CAs.

  • syncsynchalt 2 days ago

    Mozilla's bugzilla is the de-facto site for coordinating issues in CA/B.

    Any root program will refer to it for context on issues.

amluto 2 days ago

> Additionally, should a Chrome user or enterprise explicitly trust any of the above certificates on a platform and version of Chrome relying on the Chrome Root Store (e.g., explicit trust is conveyed through a Group Policy Object on Windows), the SCT-based constraints described above will be overridden and certificates will function as they do today.

This continues to annoy me. Chrome (and other browsers) have detailed trust constraints, e.g. SCTNotAfter, in their own root stores. Why can’t administrators do the same thing?

  • akoboldfrying 2 days ago

    I don't understand what is annoying about this. Wouldn't it be more annoying for Chrome not to offer end users a way to override policy decisions they make in a Google office halfway across the world about what websites you can view on your own laptop?

    • tsimionescu a day ago

      I think they're complaining that as an administrator, you only get a binary decision: trust all certificates signed by this CA, or trust none of the certificates signed by this CA. The Chrome devs can implement more fine-grained decisions, such as trust all certificates signed by this CA with SCT<October 2024, but they don't expose this type of control to admins.

      • amluto a day ago

        Exactly. I’d also like to be able to trust a certificate for a limited set of domains. This would be extremely valuable for all kinds of use cases.

        • tardy_one a day ago

          IMO the problem is worse if considered from the perspective of the user. There is no visual indication that the chain of trust goes back to a local, admin-managed store and that the admin can arbitrarily trust certificates outside their proprietary domain.

          It should be perfectly reasonable, and probably required, for an employee to be able to order reimbursed things like travel arrangements with a credit card on their org-provided device, but that org may MITM any trust chain for some administrative convenience.

          The org itself could cross-sign with name constraints if it opted to be good, but it would probably end up filing a lot of bugs against software that can't handle them, and being good is the kind of selfless act that rarely happens without a regulatory requirement to pay for the consequences of MITMing your employees on the Internet.

          • ethbr1 a day ago

            To me, this seems like a solid tradeoff of authority.

            In practice, complexity and customizability breed ossification, because "safe" becomes the tiny subset of common configuration.

            I could definitely see network appliance vendors, IT network security admins, endpoint security vendors, etc. rapidly fucking up everything.

            At least with delegation to browser vendors + certificate transparency logs, we have a semi-standard path for a distrust like this to be forced without exploding the ecosystem.

            Additionally, if there were more wiggle room, you'd alter the balance of power between browsers and CAs, which seems decently calibrated now.

          • amluto 16 hours ago

            I can imagine an organization wanting to run a CA for all kinds of reasons, and wanting to ignore some CA/B forum rules for all kinds of reasons. And, if that organization owns name.com and wants its employees to use ordinary web browsers (on corporate devices) to access resources protected by those certificates, then it seems entirely reasonable to have a *.name.com name constraint. The only problem is that browsers don’t support this.
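
            For reference, minting such a name-constrained root is the easy part; client support for honoring the constraint is what has historically been patchy. A rough, untested sketch with the Python "cryptography" package ("name.com" being the example domain above, everything else made up):

                from datetime import datetime, timedelta, timezone

                from cryptography import x509
                from cryptography.hazmat.primitives import hashes, serialization
                from cryptography.hazmat.primitives.asymmetric import ec
                from cryptography.x509.oid import NameOID

                key = ec.generate_private_key(ec.SECP256R1())
                subject = x509.Name(
                    [x509.NameAttribute(NameOID.COMMON_NAME, "Example Internal Root")]
                )
                now = datetime.now(timezone.utc)

                root = (
                    x509.CertificateBuilder()
                    .subject_name(subject)
                    .issuer_name(subject)  # self-signed root
                    .public_key(key.public_key())
                    .serial_number(x509.random_serial_number())
                    .not_valid_before(now)
                    .not_valid_after(now + timedelta(days=5 * 365))
                    .add_extension(
                        x509.BasicConstraints(ca=True, path_length=None), critical=True
                    )
                    # Per RFC 5280, a dNSName constraint of "name.com" permits
                    # name.com and every name under it, i.e. *.name.com.
                    .add_extension(
                        x509.NameConstraints(
                            permitted_subtrees=[x509.DNSName("name.com")],
                            excluded_subtrees=None,
                        ),
                        critical=True,
                    )
                    .sign(key, hashes.SHA256())
                )

                print(root.public_bytes(serialization.Encoding.PEM).decode())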

            • tardy_one 8 hours ago

              If they understand that they are bound to respect this, why don't they add the name constraints to their CA certificate?

              The problem as I see it is that whatever method is used is optional and insufficient to protect users until the browser highlights that the source is not real public trust. Google knows this, and started with the claim that they prioritize user security while ending with the workaround to prioritizing user security. (And without the slightest warning that sending your users to a bunch of financial institutions using improper trust chains is ethically dubious and requires more consideration than the time it takes to click the settings.)

          • richardwhiuk a day ago

            The local admin means "the user's employer's IT department", which, for the sake of a work laptop, they implicitly trust way more than Mozilla/Microsoft/Google/Apple etc., who manage the public root stores.

            • tardy_one a day ago

              I don't think a lot of people have the ability to prioritize an employer with an IT department in the top percentile over other factors like location, pay, and willingness to hire.

              Whether CA/B is good or bad at what it does, it puts about a thousand times more effort into the question of whether to install a CA certificate in the browser than a company that just bought the cheapest solution to one of its problems and wants to install the corresponding vendor certs.

              For example: https://docs.umbrella.com/deployment-umbrella/docs/install-c...

              How many things could be wrong with that system and cause users' traffic to be compromised web-wide? What community is checking transparency logs and threatening Cisco with revoking their authority to sell that product? What would that even mean?

dextercd 3 days ago

I wonder if Entrust can survive this. Even if Web-PKI doesn't account for the majority of their income (which it might, I genuinely don't know) this is a huge blow to their credibility.

And for a CA, credibility is everything

  • kseifried 3 days ago

    Entrust has BIMI certs which use a different root (CN = Entrust Verified Mark Root Certification Authority - VMCR1) and for which your choices of a BIMI certificate are: Entrust or Digicert. I doubt it makes as much money as their web certs (BIMI certs are not super common, and they are expensive to issue since there's an actual validation process that typically involves a public notary validating the ID of a corporate officer).

    If you believe https://bimiradar.com/glob

    it looks like Entrust is selling on the order of a few dozen certs a week to maybe upwards of 100-200.

    EDIT: I've asked Google if Gmail will be discontinuing support for Entrust's VMC certificates (and thus BIMI logos). I would guess not, since BIMI has some actual requirements, but assumptions are not the best way to make decisions about risk (like our BIMI logo not working later this fall).

    • Kwpolska 2 days ago

      BIMI is a CA racket.

      • ethbr1 a day ago

        Email logo validation and prominent display seems like a perfectly valid use case.

        See arguments about red-warning unencrypted HTTP and how that pushed the web to update.

        Add in that genAI is going to make plausible-looking phishing emails a lot easier for the world to generate en masse, and giving the everyperson something better than "decide if it looks suspicious" is important.

  • farresito 2 days ago

    Entrust makes a ton of revenue from hardware-related products (for example, printing ID cards), so it is far from the end.

    • crote 2 days ago

      Right up until the next contract renewal. "Not trustworthy enough to secure a basic website" isn't exactly a great look.

      • bri3k 2 days ago

        Untrusted by Google is what most laypeople will get out of it.

        • nerdponx a day ago

          Google is still equivalent to the Web for a lot of laypeople.

        • lokar a day ago

          The others will probably follow

  • Animats 2 days ago

    > I wonder if Entrust can survive this

    They've pivoted to payment cards.

nahikoa 3 days ago

The issues identified really show a dumpster fire: https://bugzilla.mozilla.org/buglist.cgi?o2=greaterthaneq&sh...

Directly from Entrust: "Yes, there has been ongoing internal discussion and reflection on the issues found in this and other incidents, which has led to the action items described previously and ongoing changes, including the decision to revoke the certificates affected by this bug. Exceptional circumstances would need to be provided and justified by the Subscribers. Given the nature of the feedback we have received to date, we doubt that the community has any real interest in anything that Entrust could suggest, except to use against Entrust in a destructive, not constructive, way. We therefore would like more explicit and clear guidelines or a definition of “exceptional circumstances” to be adopted and applied equally to all CAs, perhaps through updates in the CA/B Forum requirements."

https://bugzilla.mozilla.org/show_bug.cgi?id=1888714

  • tg180 2 days ago

    An honest translation from the corporate speak would be:

    We’ve been endlessly talking about our repeated screw-ups, which led us to revoke the affected certificates. If subscribers want an exception, they need to come up with an extraordinary excuse. We don't care, so we demand clear and strict rules about what counts as “exceptional circumstances” that apply to all CAs, and these should be updated in the CA/B Forum requirements. We are big, who are you?

    ... it's not promising

kseifried 2 days ago

Entrust has BIMI certs which use a different root (CN = Entrust Verified Mark Root Certification Authority - VMCR1) and for which your choices of a BIMI certificate are: Entrust or Digicert. I doubt it makes as much money as their web certs (BIMI certs are not super common, and they are expensive to issue since there's an actual validation process that typically involves a public notary validating the ID of a corporate officer). If you believe https://bimiradar.com/glob

it looks like Entrust is selling on the order of a few dozen certs a week to maybe upwards of 100-200.

EDIT: I've asked Google if Gmail will be discontinuing support for Entrust's VMC certificates (and thus BIMI logos). I would guess not, since BIMI has some actual requirements, but assumptions are not the best way to make decisions about risk (like our BIMI logo not working later this fall).

  • Scaevolus 2 days ago

    Aren't BIMI certs an even sillier cash grab than EV certs?

lambdaone a day ago

I've always thought that company names like "Entrust" are hostages to fortune, daring the Fates to intervene. In this case the Fates are the browser vendors.

There's now also the problem of competing with a free alternative that increasingly almost everyone knows about.

  • nucleardog 15 hours ago

    > There's now also the problem of competing with a free alternative that increasingly almost everyone knows about.

    If you read through some of the incidents in Bugzilla, you get the strong impression that Entrust’s market is specifically the people for whom something like LetsEncrypt isn’t currently a viable alternative (or is at least a difficult one).

    In trying to justify not revoking misused certificates, one example they gave for a customer they were granting extended deadlines to was some organization that was contractually obligated to their customers to provide at least 90 days notice of any certificate updates.

    While the deliverable is essentially the same, I don’t think Entrust and LetsEncrypt have really been in competition.

rxu 2 days ago

Can someone ELI5 what the violations linked in the first line are? They seem pretty minor to me but I don't understand certs

  • crote 2 days ago

    Correct, the violations are minor and should be trivial to deal with.

    The problem in this case is that Entrust displayed a complete disinterest in actually solving the underlying issues. Doing an oopsie is one thing. Doing an oopsie, lying about it, refusing to take precautions, and failing to take measures to prevent a repeat despite promising to do so? Completely different story.

    If they can't be trusted to respond properly to minor administrative issues, why should they be trusted to respond adequately during a real security incident?

    • dwaites 18 hours ago

      Indeed one of the things that got raised is that they do not appear to have adequate resources to conduct a mass revocation if signing key material were lost, because they rely on slow manual processes. If they are too constrained to do even minor things, then they REALLY cannot respond to emergencies.

  • whizzter 2 days ago

    Since the contents of certificates sadly often differ, there was a ballot to streamline them and lessen the burden on implementations of interpreting the differences.

    Entrust missed/ignored the updates to how certificates were supposed to be formed and, when caught, declined to revoke the incorrectly issued ones (because replacing a cert is probably a more or less manual process for many admins working in a pre-Let's Encrypt fashion). They didn't want to inconvenience their customers, and assumed that they themselves were the important party in the equation (which CAs historically were, compared to site admins).

    The certificate industry has always been quite ad hoc, with CAs as entitled middlemen. We have Let's Encrypt and almost ubiquitous encryption now because browser makers and other internet actors saw security as more important than protecting the CAs' business, and now that LE is established, Google etc. aren't the slightest bit interested in pampering CAs that aren't interested in cleaning up the system.

  • plorkyeran 2 days ago

    They are very minor, but because the consequences of a misissued certificate can be so high, there’s an explicit policy that misissued certificates must be revoked and reissued promptly. The distrust was due to them refusing to comply with the policy and outright stating that they did not intend to comply in future incidents either.

  • olliej 2 days ago

    There are some issues that are bad, but most of these issues with the _certificates_ are minor.

    The problem is that to be a CA in the root stores you agree to (and Entrust voted in support of) a pile of rules, and Entrust demonstrated a complete disinterest in complying with those rules.

    The reason for removing trust is not the severity of the original errors, it's the severity of the errors in the response.

    1. The BRs require revocation of invalid certificates within 5 days unless there is an exceptional reason not to. Entrust did not.

    2. In the event a CA discovers that they are mis-issuing certificates, they are required to stop issuing until they have resolved the error. In this case not only did Entrust not do this, but they explicitly stated - after the issues were raised, and after they had already been told they were failing to revoke certs in the required time frame - that they were intentionally continuing to mis-issue.

    3. They repeatedly made errors in the past, promised to correct them, and then kept making the same errors

    4. They claimed they were trying to get customers to prepare for revocation within the required 5-day window, but then it turned out they were telling customers that they had 30+ days (which was only discovered when one of the relevant customers forwarded info to someone else)

    5. When a CA discovers misissuance they are required to file an incident report and provide detailed information about how it occurred, why it was not caught, what remediating steps are being taken, and what mechanisms are being introduced to ensure a similar failure cannot occur in future. None of Entrust's responses came close to this until the Chrome root store rep came in to say "this is unacceptable", and even then their "improved" reports were incomplete and lacked sufficient detail.

    6. Once they were finally doing the basic steps they were meant to have done the moment they learned of the misissuance, they repeatedly failed to produce an accurate set of the impacted certificates (as in, they provided a list and people outside of Entrust were able to immediately turn around and say "but these certs are also broken, why aren't you listing those details")

    and so on and so forth.

    Google's post to CCADB provides more details than the blog post: https://groups.google.com/a/ccadb.org/g/public/c/29CRLOPM6OM...

  • vikarti a day ago

    A LOT of minor fuck-ups, which demonstrates that Entrust is (likely) not malicious; they're just stupid and don't care about doing things correctly. How long until they fuck up something serious?

    This reminds me of the discussion about the Russian government's NUC Root CA (not trusted by default in Chrome/Firefox, trusted by Yandex Browser only with some additional verifications to prevent abuse by the government). The discussion was not about why this cert was necessary in the first place; it was about its creation violating Russian laws and procedures AND violating a lot of technical rules. A lot of people just said: this cert is necessary and it's clear who made it, so why should we look at "minor details"? (Links, in Russian: https://habr.com/ru/articles/666520/ / https://habr.com/ru/articles/708970/ )

aaomidi 2 days ago

I’m one of the people who really went in depth with Entrust (Amir on Bugzilla).

I’m also an author on https://webpki.substack.com. I will be writing my thoughts on the distrust soon.

I can try to answer any questions folks may have. I can also help folks find ways they can also be involved!

Root programs can only do so much and need surveillance of the CAs from the community.

  • doctorpangloss 2 days ago

    I am a layperson so I appreciate the attention on the matter.

    Regardless of how Entrust is operated, there appears to be significant complexity in the CA programs that the browsers operate. On the flip side, Let’s Encrypt is basically effortless for me to use, as an end user of an LE-secured site and as a developer. Why misallocate all this toil on root CA compliance on the one hand, when LE could redirect that labor towards something valuable instead? What is so challenging about giving LE full leadership on this issue? Where does the proverbial political strength of Entrust and similar entities come from, in an ecosystem where there are functionally 5 cooperating, more or less transparent entities that decide the trust of certificates for 99% of end users? Why does anyone care about any of the CAs?

    • glzone1 2 days ago

      LE has to comply with the root CA standards too. For a variety of reasons there haven't been as many problems with LE - partly because they don't get paid by folks getting certs and issuing certs is a cost to them so their incentives are different.

      I don't get entrust here. It's not like they weren't told what to do.

    • aaomidi 2 days ago

      Let's Encrypt is just a player in the same ecosystem. Effectively they're no different from Entrust, GoDaddy, Google Trust Services, Digicert, Sectigo, etc etc.

      Let's Encrypt started their operations with _automated_ certificate issuance only. They also do not do OV/EV certificates, which are much, much harder to automate while providing no real benefits.

      So, LE's mission is to issue certificates under the rules set by CAs and Browsers. (Yes, CAs do participate in setting up the rules for CAs.)

      > Where does the proverbial political strength of Entrust and similar entities come from

      Generally supporting non-automated certificate issuance. Effectively, technical debt. A lot of older enterprises have done manual certificate issuance, and they don't feel the pressure/reason to switch.

crazysim 3 days ago

Some popular users:

chase.com, aa.com

  • dextercd 3 days ago

    Some more:

    washingtonpost.com, cdc.gov, dell.com, jpl.nasa.gov, mastercard.com

  • theandrewbailey 2 days ago

    api.cybersource.com

    This is gonna cause me some headaches, along with everyone else who processes payments through Cybersource, and possibly others :(

    • qmarchi 2 days ago

      CYBS Engineer here.

      We're already working on it. Keep an eye for merchant notifications if you use certificate pinning.

      Now, back to rotating certificates....

      • RulerOf 2 days ago

        Maybe you (or anyone) could shed light on something for me?

        I'm sure leaf certificate pinning is very common among your customers. Assuming that pinning is a manual process where customers decide to implicitly trust a specific cert, what's the point of using a third-party CA for those customers at all?

        Does anybody self-sign or use a private CA on specific endpoints with longer certificate validity, and let the pinning customers use those?

        • Plasmoid a day ago

          We have explicitly told customers not to pin our certificates and if they suffer downtime due to pinning it will not be considered a breach of our SLA.

          We have one customer who has demonstrated enough competence with certificates that we create a private CA endpoint and let them use that. The private root lasts around 5 years, and they pin to that.
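
          For anyone curious what that looks like mechanically: pinning against the root's SPKI hash (rather than the whole leaf cert) is what lets everything underneath rotate freely. A rough, untested sketch with the Python "cryptography" package; file names are placeholders, and a real deployment would also pin a backup key:

              import base64
              import hashlib

              from cryptography import x509
              from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

              def spki_pin(cert):
                  # Base64(SHA-256(SubjectPublicKeyInfo)), the classic HPKP-style pin.
                  spki = cert.public_key().public_bytes(
                      Encoding.DER, PublicFormat.SubjectPublicKeyInfo
                  )
                  return base64.b64encode(hashlib.sha256(spki).digest()).decode()

              with open("private-root.pem", "rb") as f:
                  pinned = spki_pin(x509.load_pem_x509_certificate(f.read()))

              with open("presented-chain.pem", "rb") as f:
                  chain = x509.load_pem_x509_certificates(f.read())

              # Accept the chain (after normal verification) only if some cert in it
              # carries the pinned root key; leaves and intermediates can then rotate
              # freely underneath the long-lived private root.
              print(any(spki_pin(c) == pinned for c in chain))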

      • crazysim 2 days ago

        Out of curiosity, is your organization planning to switch to Let's Encrypt or just another year long certificate provider?

        It'll be interesting to see what, if anything, organizations affected by this switch to: stick with 1-year certs, or go to the future with free 90-day ones?

      • mholt 2 days ago

        [flagged]

    • agwa 2 days ago

      I'm curious why that is. Is your API client using a root store that doesn't contain CAs other than Entrust, or pinning to an Entrust CA?

      • theandrewbailey a day ago

        I work on a managed platform (Salesforce B2C Commerce Cloud). Accessing and verifying CAs isn't something that's regularly done, but at least it's editable from the web management UI.

Animats 2 days ago

"Entrust encrypts and secures more than 24 million Swift messages daily."

Wonder how secure that is? That has real potential for extracting value.

  • opless a day ago

    Entrust bought nCipher's product line from Thales in 2019.

    nCipher make HSMs - a lot are used in banks to encrypt transactions on behalf of devices through APIs like PKCS11.

    To answer your question of "how secure is that?": the answer is yes, secure.

noname120 2 days ago

The real question is: why didn't they get booted out earlier?

  • sybercecurity 2 days ago

    Probably worried that it would break something for a significant number of customers, so they took a cautious approach. That's the problem with any Internet-wide change: you don't always know who will be impacted negatively or how severely. No one wants to make a quick change only to find out some critical system is now broken.

  • sieabahlpark 2 days ago

    My guess is Google wanted to really give them every opportunity so that most people come to your conclusion instead of "wow! Google is just trying to censor the internet"

ranger_danger 20 hours ago

I wonder what the chances are that some government has compromised one of the many "trusted" CA certs used by all browsers on earth?

mantra2 2 days ago

[flagged]

  • MavisBacon 2 days ago

    This is a very difficult post not to reply to with a pun

cedws 2 days ago

I wonder which root CA the intelligence agencies use to selectively MITM TLS traffic a la Crypto AG.

  • aaomidi 2 days ago

    Certificate transparency prevents this style of attack.

    • schoen 2 days ago

      As long as the victims are checking it and know what to look for!

      • Sateallia 2 days ago

        Chrom(e/ium) and Safari don't trust certificates that are not in public logs [0].

        [0] https://en.wikipedia.org/wiki/Certificate_Transparency#Manda...

        • schoen 2 days ago

          Right, and that's a fundamental sea change in PKI security posture since the Iranian "ComodoHacker" and the Soghoian and Stamm compelled issuance paper! My point is just that some attackers might be willing to have their attacks show up in public logs if their victims are unlikely to ever notice that and if nobody else is likely to notice it either.

          With Let's Encrypt we made a lot of people's certificate management a "fire and forget" thing, which is exactly what we hoped to do, but if they completely forget about it, it may be that there will be lots of targets against whom nobody would notice certificate misissuance.

          • Sateallia 2 days ago

            I got every self-hosting sysadmin I know to run certificate monitors for sites they maintain but it certainly isn't a common thing to do. I know Cloudflare has a beta certificate monitoring feature which would certainly help a lot with this problem considering their market share if they enable it by default. (Although one problem with this is that they issue backup certificates from other CAs so it'd easily trigger warning fatigue!)
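
            For a starting point, a bare-bones monitor can just poll crt.sh's JSON output (unofficial, and it may change) for a domain and flag anything newly logged; a real setup would watch the CT logs directly and check issuers against an allowlist. Rough sketch:

                import json
                import urllib.parse
                import urllib.request
                from datetime import datetime, timedelta

                DOMAIN = "example.com"  # placeholder

                url = "https://crt.sh/?q=" + urllib.parse.quote("%." + DOMAIN) + "&output=json"
                with urllib.request.urlopen(url) as resp:
                    entries = json.load(resp)

                cutoff = datetime.utcnow() - timedelta(days=7)
                for e in entries:
                    seen = e.get("entry_timestamp")
                    if not seen:
                        continue
                    seen = datetime.fromisoformat(seen)
                    if seen > cutoff:
                        # Anything unexpected here (unknown issuer, unknown name)
                        # is worth a closer look.
                        print(seen, e.get("issuer_name"),
                              e.get("name_value", "").replace("\n", " "))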

            (I wasn't aware of your credentials when I made my previous comment so I assumed you didn't know about mandatory certificate transparency which is a mistake on my part, sorry! I'll make sure to check profile about sections before I assume again.)

            • amluto a day ago

              > Although one problem with this is that they issue backup certificates from other CAs so it'd easily trigger warning fatigue!

              Indeed, the fact that Cloudflare emails out CT warnings due to their own backup certs is rather embarrassing.

            • schoen 2 days ago

              Yeah, I think it's tricky to know how most sysadmins could make good decisions about this information, especially when misissuance is likely to be less than 1% of 1% of all CA issuance and automated renewal is working properly. Warning fatigue is a pretty big deal here!

              Also, we made Certbot randomize the subject key by default every time it renews, so you have a huge amount of churn in subject keys, so you can't just say "oh, well, this public key has been used for a long time, so it's probably correct!". Every subject key is typically new and is unrelated to every previous subject key.

              I hope that won't turn out to have been a poor trade-off. (We thought it was good to have more turnover of keys in order to reduce the impact of successfully stealing or cryptographically attacking one.)

          • aaomidi 2 days ago

            The other argument is, why bother MITMing when you can go to Cloudflare and get them to share the data with you :)

        • yugcesofni 2 days ago

          Not just Safari, but all TLS connections instantiated on Apple OSes

      • vikarti a day ago

        Not only. Example: Chrome on Android changed some time ago so that if a CA is in the System store (which means it got there from the manufacturer or from a user who has root access), that CA MUST use Certificate Transparency. This rule doesn't apply if the CA is in the User store (installable by a regular user) - https://httptoolkit.com/blog/chrome-android-certificate-tran...

        Another example: Yandex Browser ONLY trusts Russian NUC certs if they are in public CT logs, not otherwise (https://habr.com/ru/companies/yandex/articles/667300/ - text is in Russian). (As far as I understand, NOT trusting this CA at all is not an option for them or their users, and if a user is using Chrome/Firefox and needs access to sites which use this CA, the CA will just be installed manually, so Yandex's solution is more secure, thanks to CT.)

      • bawolff a day ago

        Even still, it turns it much more into a gamble.

        Intel people really don't want to get caught (and whatever CA they use really does not want to get caught); CT turns the attack into a gamble. Even if nobody is checking most sites, CT still creates a deterrence factor. Not perfect, but a lot better than the previous status quo.

      • aaomidi 2 days ago

        There's definitely a lot of people watching CT for anomalies (I'm one of them), but more surveillance of it is also good and something I've been trying to advocate.

        It's also why I'm personally against SMIME and think it's a bad idea.