Closed Bug 1066784 Opened 10 years ago Closed 8 years ago

Need an appropriate Content Security Policy header for wiki.mozilla.org

Categories

(Websites :: wiki.mozilla.org, defect)

x86
macOS
defect
Not set
normal

Tracking

(Not tracked)

RESOLVED INCOMPLETE

People

(Reporter: cliang, Unassigned)

Details

As a first pass, I was going to suggest the following:

Content-Security-Policy-Report-Only: default-src 'self' *.mozilla.org *.mozilla.com *.mozilla.net www.google-analytics.com ssl.google-analytics.com *.newrelic.com; img-src 'self' data:;   

Report-Only means that the policy is not enforced (i.e. "offending" content will still load, but violation reports should be generated).  Otherwise, if it's deemed safe enough to do so, we can go straight to "Content-Security-Policy". =) 

The default policy ("default-src") is a catchall for anything that has no more specific policy defined.  In this case, the default policy is that any object (images, embedded videos, etc.) must be same-origin with the page itself -or- come from one of the whitelisted domains.  I did not require that the sources be HTTPS; that can be done by prepending 'https://' to each domain specification.
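To make the header's structure concrete, here is a small sketch that assembles the proposed Report-Only value from a whitelist. The `build_csp` helper is purely illustrative (not part of any proposal or existing tooling); the directive names and sources mirror the policy suggested above.

```python
# Sketch: assemble the proposed Report-Only header from a whitelist.
# A CSP header value is just semicolon-separated directives, each a
# directive name followed by space-separated source expressions.

def build_csp(directives):
    """Join CSP directives into a single header value."""
    return "; ".join(
        f"{name} {' '.join(sources)}" for name, sources in directives.items()
    )

policy = build_csp({
    "default-src": ["'self'", "*.mozilla.org", "*.mozilla.com", "*.mozilla.net",
                    "www.google-analytics.com", "ssl.google-analytics.com",
                    "*.newrelic.com"],
    "img-src": ["'self'", "data:"],
})

print("Content-Security-Policy-Report-Only:", policy)
```

Flipping the header name to `Content-Security-Policy` is the only change needed to switch from reporting to enforcement.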

The image directive ("img-src") allows images that are same-origin or embedded as data: URIs.  I don't know how many images are sourced from outside Mozilla, so I can't tell whether this is too broad.  We could just leave this out and fall back to the default policy.

I don't know how many images, videos, etc. are sourced from "elsewhere" (e.g. youtube, AWS, MoFo-named sites (i.e. mozillapopcorn.org), etc.)  
     
@ckoehler: I dimly recall discussions RE: font-loading.  If we're pulling from other sites, we'll probably need to add a font-src section.  


Sites that I found useful for understanding CSP:
  - http://content-security-policy.com/
  - http://www.w3.org/TR/CSP2/

Spelunking bugs, I did find a policy already in use at Mozilla:
  - http://marketplace.mozilla.org/services/csp/policy


[1] https://bugzilla.mozilla.org/show_bug.cgi?id=921423
Whiteboard: [kanban:https://kanbanize.com/ctrl_board/4/1271]
Content-Security-Policy-Report-Only: default-src 'self' *.mozilla.org *.mozilla.com *.mozilla.net www.google-analytics.com ssl.google-analytics.com *.newrelic.com; img-src 'self' data:; report-uri http://reportcollector.example.com/collector.cgi
Do we have such a collector somewhere? I'm not aware of one... don't even know what it would look like.
I think we are collecting them, or at least we have a test for collecting them. We also just heard from Yelp that they are using Elasticsearch (for theirs) and they shared the code. Stefan knows the most on our team, so I am going to loop him into this.
Flags: needinfo?(sarentz)
We don't have a CSP log collector running at the moment. We are waiting for a domain name or subdomain to host it on. The project is on Github though, so if you are interested in running your own then you can do so:

https://github.com/st3fan/moz-csp-collector
https://github.com/st3fan/moz-csp-dashboard

All very heavy work in progress. And not even sure if this is the right approach.
Flags: needinfo?(sarentz)
Whiteboard: [kanban:https://kanbanize.com/ctrl_board/4/1271] → [kanban:https://webops.kanbanize.com/ctrl_board/2/63]
What's the status of this?
Flags: needinfo?(sarentz)
I'm not working on the project anymore.
Flags: needinfo?(sarentz)
Flags: needinfo?(nmaul)
Flags: needinfo?(curtis.koenig+bz)
Flags: needinfo?(curtis.koenig+bz) → needinfo?(yboily)
We still don't have a collector set up anywhere.

I don't know the ramifications of this header. What could go wrong if we deploy this in regular CSP fashion (not Report-Only)? I don't see a max-age like an HSTS or HPKP header, so presumably we can roll it right back if something bad happens. I'm wondering if it's feasible to roll this straight out and skip the Report-Only stage altogether.

I'd like to have the collector too, but realistically it's just not going to happen any time soon unless other things are set aside. My vote might even be to ask the OpSec team to handle it... it's kinda-sorta something that'd fall in their wheelhouse. I have no idea what their timeline would be on this either, though.
Flags: needinfo?(nmaul) → needinfo?(jstevensen)
Flags: needinfo?(jstevensen)
Flags: needinfo?(jvehent)
Assignee: nobody → jvehent
Flags: needinfo?(jvehent)
(In reply to Jake Maul [:jakem] from comment #8)
> We still don't have a collector set up anywhere.
> 
> I don't know the ramifications of this header. What could go wrong if we
> deploy this in regular CSP fashion (not Report-Only)? 

Clients could be blocked from browsing the site, we could lose Google Webmaster Tools tracking (3rd-party JavaScript), ...

> I don't see a max-age like an HSTS or HPKP header, so presumably we can roll
> it right back if something bad happens. I'm wondering if it's feasible to roll
> this straight out and skip the Report-Only stage altogether.

Yes, that's all correct.

> I'd like to have the collector too, but realistically it's just not going to
> happen any time soon unless other things are set aside. My vote might even
> be to ask the OpSec team to handle it... it's kinda-sorta something that'd
> fall in their wheelhouse. I have no idea what their timeline would be on
> this either, though.

Let's test it out on allizom first with this header, it's very close to the one used by webdevs on other sites:

Content-Security-Policy: "script-src 'self' https://*.allizom.org http://*.allizom.org https://*.allizom.net http://*.allizom.net http://login.persona.org  https://login.persona.org http://*.google-analytics.com https://*.google-analytics.com https://pontoon.allizom.org; default-src 'self'; img-src 'self' data: https://*.allizom.org http://*.allizom.org https://*.allizom.net http://*.allizom.net https://secure.gravatar.com https://*.akamaihd.net http://*.google-analytics.com https://*.google-analytics.com https://pontoon.allizom.org; style-src 'self' https://*.allizom.org http://*.allizom.org https://*.allizom.net http://*.allizom.net https://pontoon.allizom.org; frame-src 'self' https://login.persona.org; font-src 'self' data: https://*.allizom.org http://*.allizom.org https://*.allizom.net http://*.allizom.net"
Group: websites-security
@christie: Are you okay with us proceeding on this, knowing that we don't have a collector set up to learn about any errors that happen? That means that if there is any breakage, we'll only know by a) user reports, or b) running into it ourselves. Of course we'll do stage first...
Assignee: jvehent → nmaul
Flags: needinfo?(yboily) → needinfo?(ckoehler)
Is there a way we could put this in a report only mode to start with?
We can put it in report only, but it will report to nowhere, as we have no collector (per comment 10). :curtisk, do you want report-only mode so we can test in individual browsers (not to collect data from users)?
I know we don't have a collector, but testing was along my path of thought. Any errors should show up in the web console, I believe.
We could, in theory, do that. Once we've concluded testing, would we ship it without collectors, or simply remove them having completed our testing?
:atoll just found this recently... https://report-uri.io/

It appears to be a SaaS app that is a collector for exactly this purpose. Cost is right (free), at least for now.

My only question as to whether we can use it is, what sort of PII might be sent to a collector app? Anything we need to be concerned about?
Flags: needinfo?(jvehent)
Redirecting to Jeff as this looks like a vendor review.
Flags: needinfo?(jvehent) → needinfo?(jbryner)
302 to Jonathan
Flags: needinfo?(jbryner) → needinfo?(jclaudius)
:jakem :atoll - I'm happy to engage with one of you to assist in a vendor review.  If we were to engage with report-uri.io, who would be the service owner at Mozilla?
Flags: needinfo?(jclaudius)
I would personally feel a bit uneasy about shipping a giant pile of information to a third party, especially when said information could contain details about possible unfixed vulnerabilities inside Mozilla's sites.  This is less of a concern when it comes to public sites -- where a penetration tester could just stumble across this information on their own -- but it could reveal a bunch of unintended information about internal sites that end up with CSP enabled.

It shouldn't be *too* difficult to build a CSP reporting tool of our own: it's just a POST to a URL that contains a small bit of JSON. It would just need a small amount of parsing, data store, and maybe a simple front end behind the VPN to query it.
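As a rough sketch of how small such a tool could be, here is a minimal in-house collector using only Python's standard library. Everything here (handler name, in-memory store, port) is illustrative; a real deployment would persist reports to a data store and sit behind the VPN as described above.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Minimal CSP report collector sketch: accepts the JSON POST a browser
# sends to the report-uri endpoint and keeps the parsed reports in memory.
REPORTS = []

class CSPReportHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        try:
            # Browsers wrap the violation fields in a "csp-report" object.
            report = json.loads(body)["csp-report"]
        except (ValueError, KeyError):
            self.send_response(400)  # malformed report
            self.end_headers()
            return
        REPORTS.append(report)
        self.send_response(204)  # accepted, no content to return
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the sketch quiet

# To run standalone (port is arbitrary):
#   HTTPServer(("", 4567), CSPReportHandler).serve_forever()
```

The matching policy would point at it with something like `report-uri http://collector.example/csp-reports` (hypothetical host); querying the stored reports is where the real work lives.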
Mozilla uses Sentry, no? Sentry has built-in CSP reporting functionality (as of Sentry 8.x). You can just use that so that it's all stored within Mozilla.
(In reply to April King from comment #19)
> I would personally feel a bit uneasy about shipping a giant pile of
> information to a third party, especially when said information could contain
> details about possible unfixed vulnerabilities inside Mozilla's sites.  This
> is less of a concern when it comes to public sites -- where a penetration
> tester could just stumble across this information on their own -- but it
> could reveal a bunch of unintended information about internal sites that end
> up with CSP enabled.

I am quite confused about how a CSP report could contain data that would fall into the unpatched-vulnerability class. However, given that the host and referer can contain IP addresses, and the report can contain resource directories, that can be an issue, especially for internal sites.

> 
> It shouldn't be *too* difficult to build a CSP reporting tool of our own:
> it's just a POST to a URL that contains a small bit of JSON. It would just
> need a small amount of parsing, data store, and maybe a simple front end
> behind the VPN to query it.

:st3fan at one time had a few projects to do this, but they may be bitrotted by now:
https://github.com/st3fan/moz-csp-collector
https://github.com/st3fan/csp-validator
https://github.com/st3fan/moz-csp-dashboard

Yelp had at one point also shared some of their code with Mozilla for doing this in Elasticsearch (http://engineeringblog.yelp.com/2014/09/csp-reports-at-scale.html).
(In reply to curtis.koenig from comment #21)
> I am quite confused at how a CSP report could contain data that would fall
> into the unpatched-vulnerability class. However, given that the host and
> referer can contain IP addresses, and the report can contain resource
> directories, that can be an issue, especially for internal sites.

Somebody testing for a vulnerability with a browser that supports CSP reporting could trigger an XSS that would be blocked and reported.  Not a big deal for users of browsers that do support CSP, but it is a problem for those using browsers that don't: IE, Android <4.3(?), ESR releases, etc.  Having those reports stored any place but our own CSP reporting tool / Bugzilla is certainly a possible problem.

What the Yelp folks are doing looks really nice; Elasticsearch / Logstash / Kibana is a great way to go about it.
Flags: needinfo?(ck)
I can second April's comments about CSP reports containing sensitive info.  Here's an example snippet from some CSP testing I was doing yesterday...

{ "csp-report" :
  { "blocked-uri" : "self",
    "document-uri" : "http://127.0.0.1:4567/?id=%3Cscript%3Ealert(1)%3C/script%3E",
    "line-number" : 1,
    "original-policy" : "default-src 'none'; report-uri http://127.0.0.1:4567/_/csp-reports",
    "referrer" : "", 
    "script-sample" : "alert(1)",
    "source-file" : "http://127.0.0.1:4567/?id=%3Cscript%3Ealert(1)%3C/script%3E", "violated-directive" : "default-src 'none'"
  }
}

This XSS attempt would be useless on FF/Chrome, which implement CSP, but would still be an indicator of unfixed vulnerabilities that would still be viable on non-CSP web clients that we'd still want to get fixed.
OK, I can see that, but this seems like more and more of an odd point as I think about it. 
Since Mozilla does not have an intranet, there are no real internal sites (unless something has changed in the last year). We're talking about wiki.m.o, which is still public as far as I can see. Everything is browsable already; I can find this information by just poking at things and watching the console. Given the cost to maintain, the risk involved, and the ability to do adequate 3rd-party vetting, it seems reasonable to use a 3rd party.
> Since Mozilla does not have an intranet, there are no real internal sites (unless something has changed in the last year). We're talking about wiki.m.o, which is still public as far as I can see. Everything is browsable already; I can find this information by just poking at things and watching the console. Given the cost to maintain, the risk involved, and the ability to do adequate 3rd-party vetting, it seems reasonable to use a 3rd party.

I'm not at all sure where you got that impression?  Not only do we have a lot of sites that are not accessible via the internet, but we also have a lot of employee-only sites hidden behind login screens.

Anyways, I get that this is for wiki.m.o, but I don't want to balkanize our CSP reporting if we can avoid it.  It would be better to get it right in the first place in our own environment.
Assignee: nmaul → nobody
We're not going to progress on this work for the time being. A future release of MediaWiki will ship native CSP support, built into and provided by MediaWiki, at which point we'll just use their implementation. Until then, closing RESO INCO for lack of time availability by all teams involved.
Status: NEW → RESOLVED
Closed: 8 years ago
Resolution: --- → INCOMPLETE
Whiteboard: [kanban:https://webops.kanbanize.com/ctrl_board/2/63]