Open Bug 1712106 Opened 7 months ago Updated 26 days ago

Entrust: Invalid localityName

Categories

(NSS :: CA Certificate Compliance, task)

Tracking

(Not tracked)

ASSIGNED

People

(Reporter: dathan.demone, Assigned: dathan.demone)

Details

(Whiteboard: [ca-compliance] Next update 2021-12-15)

User Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:88.0) Gecko/20100101 Firefox/88.0

Steps to reproduce:

On 19 May 2021, Entrust received a certificate problem report indicating that two certificates were issued with invalid localityName data:

https://crt.sh/?id=3380704864
https://crt.sh/?id=3380599200

We have done a preliminary investigation to confirm that the certificates should be revoked and are planning to revoke the certificates within the 5-day period.

A full report will be posted here in the coming days.

Assignee: bwilson → dathan.demone
Status: UNCONFIRMED → ASSIGNED
Type: defect → task
Ever confirmed: true
Whiteboard: [ca-compliance]
  1. How your CA first became aware of the problem (e.g. via a problem report submitted to your Problem Reporting Mechanism, a discussion in mozilla.dev.security.policy, a Bugzilla bug, or internal self-audit), and the time and date.

On 19 May 2021 at 15:32 UTC, Entrust received a third-party report that we had issued 2 certificates with invalid locality values.

  2. A timeline of the actions your CA took in response. A timeline is a date-and-time-stamped sequence of all relevant events. This may include events before the incident was reported, such as when a particular requirement became applicable, or a document changed, or a bug was introduced, or an audit was done.

11 Sept 2020 – The subscriber enrolls for two Entrust OV SSL certificates and provides business address locality information with a spelling mistake, “Santiando”

11 Sept 2020 – An Entrust Verification Specialist updates the locality during the validation process to “Santiango”, introducing a different spelling mistake

11 Sept 2020 – The verification for this subscriber is completed by the Verification Specialist and is reviewed by a Verification Audit Specialist as part of our standard two-person verification check for OV certificates

14 Sept 2020 – The subscriber issues two certificates based on the certificate profile with the invalid locality value

19 May 2021 15:32 UTC: Entrust receives a certificate problem report from a third party through our incident reporting process. The problem report includes the crt.sh links for the two certificates with L=Santiango

19 May 2021 17:11 UTC: The certificate problem report is escalated internally to our compliance team for review

19 May 2021 17:31 UTC: The invalid locality data for the 2 certificates are confirmed by our compliance team

19 May 2021 17:50 UTC: The certificate profile locality value is updated to prevent further mis-issuance

19 May 2021 16:00 UTC: An internal compliance meeting is held to discuss next steps to contact the subscriber and to plan for revocation

19 May 2021 16:30 UTC: The subscriber was contacted to let them know that the certificates would need to be revoked and replaced before Monday, May 24 15:32 UTC

20 May 2021 13:15 UTC: Preliminary discussions are held to discuss technical controls that could be implemented to enhance Entrust verification systems to reduce the possibility of future typos/mistakes with the locality field

24 May 2021 13:04 UTC: The two certificates are revoked

  3. Whether your CA has stopped, or has not yet stopped, issuing certificates with the problem. A statement that you have will be considered a pledge to the community; a statement that you have not requires an explanation.

The certificate profile with the incorrect locality value was updated within an hour of confirming the problem. The subscriber cannot issue any additional certificates with the incorrect locality field.

  4. A summary of the problematic certificates. For each problem: number of certs, and the date the first and last certs with that problem were issued.

Two certificates, both issued on 14 September 2020:

https://crt.sh/?id=3380599200
https://crt.sh/?id=3380704864

  5. The complete certificate data for the problematic certificates.

https://crt.sh/?id=3380599200
https://crt.sh/?id=3380704864

  6. Explanation about how and why the mistakes were made or bugs introduced, and how they avoided detection until now.

Our verification system uses a text input field for the locality value. When a new verification request is populated in the system, the verification request includes the locality value that is supplied by the subscriber during the enrollment process. Our enrollment process also uses a text input field for the locality. If a bad locality value is supplied by the subscriber, the value would be updated during the verification process based on the evidence we collect to support the address information for the organization. Note that the two certificates were enrolled using a single order, which means the bad value was only entered a single time by the subscriber.

As shown in the timeline, the subscriber's original mistake was identified by our Verification team, but the correction introduced a new error: the agent changed the “d” to a “g” but did not notice that there was still an extra “n” in “Santiago”. We normally require the verification team to enter these values exactly as they appear in the business check document to avoid any mistakes or typos.
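The failure mode described above could be caught by a fuzzy comparison of the entered value against a reference list of localities. The following is only an illustrative sketch: the reference list, the helper name, and the similarity threshold are all hypothetical, and Python's standard difflib stands in for whatever matching logic a real verification system would use.

```python
import difflib

# Hypothetical reference list of valid locality names for the country.
VALID_LOCALITIES = ["Santiago", "Valparaíso", "Concepción"]

def check_locality(value: str, threshold: float = 0.8):
    """Return (ok, suggestions). ok is True only on an exact match;
    otherwise close spellings are suggested as likely typos."""
    if value in VALID_LOCALITIES:
        return True, []
    suggestions = difflib.get_close_matches(
        value, VALID_LOCALITIES, n=3, cutoff=threshold)
    return False, suggestions

# Both the subscriber's typo and the agent's attempted fix are flagged:
print(check_locality("Santiando"))  # → (False, ['Santiago'])
print(check_locality("Santiango"))  # → (False, ['Santiago'])
print(check_locality("Santiago"))   # → (True, [])
```

A check like this would have flagged both “Santiando” and “Santiango” for review rather than letting either reach a certificate profile.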

  7. List of steps your CA is taking to resolve the situation and ensure such issuance will not be repeated in the future, accompanied with a timeline of when your CA expects to accomplish these things.

This issue highlights the need for Entrust to implement some kind of field validation for the locality field to ensure that we are able to detect typos or other mistakes that can occur due to human error. We are exploring different technical solutions and will provide an update on our holistic approach to ensure that these human errors do not result in invalid locality information. We will provide a detailed plan and timeline to implement these changes in our system.

In this specific instance, the typo was corrected in the certificate profile soon after we confirmed the issue, ensuring that no new certificates would be issued against the bad certificate profile.

This issue highlights the need for Entrust to implement some kind of field validation for the locality field to ensure that we are able to detect typos or other mistakes that can occur due to human error. We are exploring different technical solutions and will provide an update on our holistic approach to ensure that these human errors do not result in invalid locality information. We will provide a detailed plan and timeline to implement these changes in our system.

This is a promise to a timeline, but without a timeline to the timeline. So what's the timeline for the timeline?

Flags: needinfo?(dathan.demone)

Ryan - The next release window we have to implement changes to the way we handle locality field information will be in December 2021. Over the next few weeks, we will be exploring various address verification databases that we could use, along with how we would implement this in our verification systems from both an enforcement and UI perspective. We will provide updates on our discussions and decisions as we work out the specific solution so that other CAs may hopefully benefit from the work we are doing. At this point, we are looking at using a third-party address validator to help us catch any potential typos or locality/country mismatches before a certificate is issued.

It should also be noted that part of our solution to this problem may involve a new linter that can help us check for bad locality values. We have already been working on a new linter for the last 2 months as part of our work with the PKI consortium that checks for bad stateOrProvince values. We are looking to extend this project to also include linting checks for the locality field. We are working with organizations such as the Universal Postal Union to see how we might be able to use their data to help us with this project.
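A locality linter of the kind described above might, at its simplest, check each certificate's subject L value against a per-country reference table. This is only a sketch under assumptions: the table below is illustrative, the function names are hypothetical, and a real linter would build its reference data from sources such as UN/LOCODE or UPU data.

```python
# Illustrative per-country reference table of accepted locality names.
REFERENCE = {
    "CL": {"Santiago", "Valparaíso"},
    "DE": {"Berlin", "Stuttgart"},
}

def lint_subject(country: str, locality: str) -> list[str]:
    """Return a list of lint findings (empty means no issues found)."""
    known = REFERENCE.get(country)
    if known is None:
        return [f"no reference data for country {country}"]
    if locality not in known:
        return [f"locality {locality!r} not found for country {country}"]
    return []

print(lint_subject("CL", "Santiango"))
```

Run pre-issuance, a check like this turns a silent typo into an explicit finding that blocks or escalates the request.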

Flags: needinfo?(dathan.demone)

Thanks. It sounds like Ben may want to set a Next-Update then to the end of June, to see progress on those discussions. That said, unless/until he or a Mozilla representative does, the presumption of weekly updates would remain.

Flags: needinfo?(bwilson)
Whiteboard: [ca-compliance] → [ca-compliance] Next update 2021-07-01

We have made some good progress on our plans to enhance our verification systems to detect potential spelling errors in the locality field.
We have analyzed different data sources that we can use as reference points in our system when checking the values that are entered by vetting agents. Challenges we identified include that not all authoritative sources (governments) provide access to this information, commercial data providers lack transparency about data sources, and initiatives such as GeoNames include user contributions that might not have been strictly validated.

We will make a decision in the near future in terms of which data source we will use based on a number of factors such as accuracy of data, frequency of updates, data access methods, and costs.

In addition, Paul van Brouwershaven from the Entrust Compliance team has been working in collaboration with the PKI Consortium (PKIC) and the Universal Postal Union (UPU) to analyze a list of all unique address combinations (state, locality, country, postal code) based on active certificates in CT as part of an overall effort to help improve the quality of data in certificates and to potentially create new linting capabilities that will help check these fields for accuracy.

The other design decision that is still being worked on is how we implement these extended address field checks in our vetting systems. We need to prevent invalid field data, such as misspelled values or bad locality/stateOrProvince combinations, in the address information obtained from the relevant verification source such as a QGIS/QTIS/QIIS. A reverse lookup using the postal code obtained from the verification source is considered the most reliable method to validate and normalize the address information, but while postal codes are common in most countries, they do not cover every region.

In case the relevant address information cannot be validated using the aforementioned data sources (because the region is not covered or the data is incomplete), we are considering different options, such as allowing vetting agents to continue entering locality values the way they do today while notifying the agent through the UI that the locality value is potentially problematic. From this point, we could introduce a new workflow that would trigger an additional review process to ensure that the value is valid before the certificate profile is approved for issuance.
As mentioned previously, these enhancements are scheduled for our December release. The development process will start on this feature in early September. We will continue to provide updates on design decisions as we lead up to the development kickoff.

(In reply to Dathan Demone from comment #5)

In addition, Paul van Brouwershaven from the Entrust Compliance team has been working in collaboration with the PKI Consortium (PKIC) and the Universal Postal Union (UPU) to analyze a list of all unique address combinations (state, locality, country, postal code) based on active certificates in CT as part of an overall effort to help improve the quality of data in certificates and to potentially create new linting capabilities that will help check these fields for accuracy.

Could you explain a bit more how this relates? It seems that you're highlighting here that CAs have had a poor time executing in this space, and so the certificates logged in Certificate Transparency seem to be the least reliable starting point. After seeing the spate of Some-City issues, as well as those of bad businessCategory, it seems a bit unwise.

I'm hoping this is something just lost in communication, and it's not that you're relying on CT as the basis for something new, but rather, using CT to identify the problematic practices. But perhaps I've misunderstood?

As mentioned previously, these enhancements are scheduled for our December release. The development process will start on this feature in early September. We will continue to provide updates on design decisions as we lead up to the development kickoff.

Just trying to make sure I parse the timelines here:

  • TBD: Actual decisions on what you'll develop
  • Early September: Implementation begins
  • December: Release to production

And the plan is to provide weekly updates on the TBD from now through September, right?

Flags: needinfo?(dathan.demone)

Ryan - Sorry for the confusion. We wanted to emphasize that Entrust takes this issue seriously, and that besides our own efforts, we are collaborating with the PKI Consortium, as we think that some form of standardization across the industry (like the reference tables published by the United Nations Code for Trade and Transport Locations [UN/LOCODE]) is the best way forward.

The CT data is used to learn from a variety of cases. The linter is identifying mis-issued certificates (which have been reported to the relevant CAs) and is also helping us to identify false positives so that the linter results can be trusted. The results identify commonly understood values that are not included in the authoritative sources, such as 'Baden-Wuerttemberg', which is used in 34k non-revoked certificates and is a commonly accepted spelling of 'Baden-Württemberg'. Another example is 'Friesland', which is not included in ISO 3166-2 but is defined as an authoritative value by the European Commission.
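Accepting common transliterations like 'Baden-Wuerttemberg' implies normalizing both the candidate and the authoritative value before comparing. A minimal sketch of such normalization, assuming German-style transliteration of umlauts plus stripping of remaining combining marks (the table and function name are hypothetical):

```python
import unicodedata

# German transliteration of umlauts and ß (assumption: the comparison
# normalizes both sides with the same rules before matching).
TRANSLIT = str.maketrans({"ä": "ae", "ö": "oe", "ü": "ue", "ß": "ss"})

def normalize(name: str) -> str:
    name = name.lower().translate(TRANSLIT)
    # Strip any remaining combining marks (e.g. é → e).
    decomposed = unicodedata.normalize("NFKD", name)
    return "".join(c for c in decomposed if not unicodedata.combining(c))

# The transliterated and the official spelling normalize identically:
assert normalize("Baden-Württemberg") == normalize("Baden-Wuerttemberg")
```

Under this approach the linter would treat both spellings as the same value instead of reporting the transliterated form as a false positive.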

By using the CT data with different sources, we have been able to identify coverage and issues in those data sources such as in the data of the Universal Postal Union (UPU).

We would like to only permit official or at least common/normalized spellings.

Your understanding of our timelines is accurate, and we will provide weekly updates on important decisions as we work through the design process.

Flags: needinfo?(dathan.demone)

We are getting closer to making a final selection on the address validation system/database that we will implement to help us check the accuracy of our locality data.

The main criteria that we are using to evaluate the various solutions are:

Reliable Data: We want to make sure that the data we are checking against is coming from a reliable source and that the data is updated as frequently as possible

Global Coverage: We issue certificates to subscribers globally and need a solution that can provide good data for as many countries as possible. We are also evaluating whether the system can provide the different acceptable variations of a locality that arise from abbreviations and translations.

APIs: The system should have robust APIs that we can use to check the data in various ways. For example, we are considering looking up data by postal code when possible as this would be a good way to check other address information. When this is not possible, we would need other ways to check the data depending on what address fields have been supplied by our customer or by a Verification Specialist.

We are hoping to make our final selection in the next couple of weeks.

We will also be working on finalizing some key decisions: what to do with the data returned by the address validation system, how to handle a verified locality value that is not present in the address database, and which new workflows/UI changes we need to implement in our verification system.

Our focus this week was on finalizing our selection of the address validation system we will use to verify locality data. We have decided to use a system that is based on the Universal Postal Union data. We also held multiple internal discussions around implementing the address validation system and are looking at implementing a new check based on a reverse postal code lookup. The postal code will return a set of valid locality values, along with the corresponding state or province value. We can use this data to ensure the correct spelling of the locality value and to verify that the locality/stateOrProvince/country combination is valid.
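The reverse-lookup check described above can be sketched as follows. Everything here is hypothetical: the in-memory table stands in for the external address-validation API (which the real system would query over its APIs), and the function and field names are illustrative only.

```python
# Stand-in for the external address-validation API: maps
# (country, postal code) to the valid (locality, stateOrProvince) pairs.
POSTAL_DB = {
    ("CL", "8320000"): {("Santiago", "Región Metropolitana")},
}

def validate_address(country, postal_code, locality, state):
    """Validate a locality/stateOrProvince/country combination via a
    reverse postal-code lookup; fall back to manual review on no data."""
    combos = POSTAL_DB.get((country, postal_code))
    if not combos:
        return False, "postal code not found; route to manual review"
    if (locality, state) in combos:
        return True, "exact match"
    # Offer the valid spellings so the agent can correct a typo.
    valid = sorted(l for l, _ in combos)
    return False, f"no match; valid localities: {valid}"

print(validate_address("CL", "8320000", "Santiango", "Región Metropolitana"))
```

The fallback branch corresponds to the earlier point that postal codes do not cover every region, so unresolvable lookups would feed the additional-review workflow rather than hard-failing the order.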

Whiteboard: [ca-compliance] Next update 2021-07-01 → [ca-compliance]

We have started manually testing the new address validation system with small batches of postal codes and verifying locality values on customer orders. We are seeing good results so far. We have design meetings planned next week with our Engineering team to help them start preparing for the development project in September.

A design kickoff was held with our Engineering team this week on Wednesday, August 4. The team will evaluate various design options during the upcoming sprint in preparation for implementation in early September.

Our engineering team is working on high-level designs and is testing the address validation APIs. They will be providing proposals on workflows and UI changes in the coming weeks.

This week, we completed wireframe reviews for the new UI that is being introduced into our verification system as part of the requirements related to this incident. Based on a postal code lookup, our system will present a list of potential locality values that can be selected based on a real-time lookup performed with the external address validation system via their APIs.

Most of the design steps have been completed at this point and we should be ready to start development in our next sprint that starts 25 August.

Whiteboard: [ca-compliance] → [ca-compliance] Next update 2021-09-15

The new feature has been developed and is now in our QA process. We expect that this feature will be released as planned in early December.

Whiteboard: [ca-compliance] Next update 2021-09-15 → [ca-compliance] Next update 2021-11-15

The QA process is complete, and we expect these changes to be released in early December as planned.

I suggest we provide a next update when the release has been deployed.

Whiteboard: [ca-compliance] Next update 2021-11-15 → [ca-compliance] Next update 2021-12-15