Closed Bug 758188 Opened 12 years ago Closed 12 years ago

Exclude Contents/Resources from Mac OS X signing

Categories

(Firefox Build System :: General, defect)

All
macOS
defect
Not set
normal

Tracking

(Not tracked)

RESOLVED WONTFIX

People

(Reporter: mkaply, Unassigned)

Details

I'm not sure this is feasible, but I thought I would throw it out there.

It's possible to rebrand Firefox by changing a couple of branding files on every platform except Mac.

On Mac, you also have to change some files in Contents/Resources (*.icns and en.lproj/InfoPlist.strings)

It would be nice if this directory were excluded from signing, allowing a third party to change these files while still maintaining the integrity of the signature.
Blocks: 730924
Do we really want builds that have been rebranded to carry a Mozilla signature? I thought we didn't allow rebranding of our official builds at all.
As far as I know, you're allowed to rebrand as long as you remove all vestiges of Mozilla.

Also, it's OK for someone to rebrand Firefox and distribute it within their organization.
This is not technically possible: the key hashes the entire contents of the package quite intentionally (and those branding files you'd change would also invalidate the signature). In order to change the branding, you need to re-sign the package with your own Developer ID key (or remove the signature).
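
For reference, re-signing is just a couple of codesign invocations. This is only a sketch; the identity name below is a placeholder, not a real certificate:

# Sketch: re-seal the modified bundle with your own Developer ID key.
codesign --force --sign "Developer ID Application: Example Org" Firefox.app

# Check that the new seal is intact after the branding changes.
codesign --verify --verbose Firefox.app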
> This is not technically possible: the key hashes the entire contents of the package quite intentionally (and those branding files you'd change would also invalidate the signature).

That doesn't jibe with what Ben said in his post (http://blog.mozilla.org/bhearsum/archives/287):

> Michael, code signing shouldn’t affect anything that only touches distribution/, because we’ve excluded that from being signed: https://github.com/mozilla/mozilla-central/blob/5c6ff20e7d60ff56ee5747542d9875dcef45c65c/browser/app/macbuild/Contents/_CodeSignature/CodeResources#L19


So will it be possible to add files to distribution and extensions and still have the package signed?
Yeah, we can exclude files from the signature. For example, we already exclude a bunch of files that aren't included with all MARs (updates.xml, updates/*, removed-files). I don't see a reason why this wouldn't be technically possible.
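
For illustration, here's roughly what an "omit" rule looks like. This is only a sketch: the rule key, plist path, and signing identity below are assumptions, not what the real build step uses (the actual rules live in the CodeResources file linked in comment 4).

# Write a resource-rules plist that leaves distribution/ out of the seal.
cat > ruleset.plist <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>rules</key>
  <dict>
    <!-- Seal everything by default... -->
    <key>^.*</key>
    <true/>
    <!-- ...but omit distribution/ so it can change without breaking the seal.
         The exact path prefix here is illustrative. -->
    <key>^distribution/</key>
    <dict>
      <key>omit</key>
      <true/>
      <key>weight</key>
      <real>100</real>
    </dict>
  </dict>
</dict>
</plist>
EOF

# Sign the bundle using those rules ("Example Org" is a placeholder identity).
codesign --force --sign "Developer ID Application: Example Org" --resource-rules=ruleset.plist Firefox.app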
(In reply to Ben Hearsum [:bhearsum] from comment #5)
> Yeah, we can exclude files from the signature. For example, we already
> exclude a bunch of files that aren't included with all MARs (updates.xml,
> updates/*, removed-files). I don't see a reason why this wouldn't be
> technically possible.

Given this, should we consider removing everything but the actual binary from the code signature? Would that meet the requirements Apple has placed on us?
That doesn't seem like a good idea. The point of a code signature is to verify that the app hasn't been modified. We should sign all the functional parts of the application, including the binaries and the chrome. I think leaving the branding unsigned is probably harmless, since while it would allow people to redistribute official builds with renamed branding but valid Mozilla signatures, they wouldn't be able to change the core application code without invalidating the signature.
No longer blocks: 730924
(In reply to Ted Mielczarek [:ted] from comment #7)
> That doesn't seem like a good idea. The point of a code signature is to
> verify that the app hasn't been modified. We should sign all the functional
> parts of the application, including the binaries and the chrome. I think
> leaving the branding unsigned is probably harmless, since while it would
> allow people to redistribute official builds with renamed branding but valid
> Mozilla signatures, they wouldn't be able to change the core application
> code without invalidating the signature.

I'm most interested in just passing Apple's requirement - the security team hasn't weighed in about the other benefits of code signing with an Apple dev cert. Including Dan for his take.

From the release POV, I'm more worried about a broken code signature on update (causing the app to not launch) as opposed to keeping our users more secure through code signing. More security feels like a feature request, while continuing to function on 10.8 is a requirement.

I also sent a quick email to QA to make sure they're including code signature verification on Mac as part of their sign-offs.
(In reply to Michael Kaply (mkaply) from comment #0)
> It's possible to rebrand Firefox by changing a couple branding files except
> on Mac.

That's arguably a bug on the other platforms. A signed executable only means so much if you can replace everything in omni.ja and turn it into a completely different kind of application. From that standpoint the Mac app signatures are already better than what we've got on Windows, although on Windows it's clear that some files are signed and some aren't, whereas on Mac we've got a signed "app" where the partial nature is not clear.

(In reply to Alex Keybl [:akeybl] from comment #8)
> I'm most interested in just passing Apple's requirement - the security team
> hasn't weighed in about the other benefits of code signing with an Apple dev
> cert. [...] I'm more worried about a broken code signature on
> update (causing the app to not launch) as opposed to keeping our users more
> secure through code signing. More security feels like a feature request,
> while continuing to function on 10.8 is a requirement.

What are Apple's requirements and why did they impose them? Surely it's to get "more security" for their platform. Do we really want to subvert that intent?

The items in Contents/Resources don't pose a security risk if they're unsigned; it just seems like a strange scenario for a user to receive a branded "Not Firefox" and find it signed by Mozilla. Signatures are supposed to be a recipient's assurance that the item (the executable bundle in this case) came from the signer and has not been tampered with, and that's not entirely true here. But it's basically a branding issue, and I'll let the brand people decide if they care.

More worrying to me is comment 4 because those excluded directories can definitely change the behavior of the application, possibly in ways contrary to Mozilla's mission or in ways that make the application less secure (accidentally or not). It could also provide a friendly host to anonymous malware that would otherwise have to get (steal?) a dev cert in order to infect the machine, a cert that could then be revoked when discovered. It would seem to violate the spirit of what I think Apple is trying to accomplish, anyway.

Anyone know if there's any Apple TOS associated with signing apps? If there is it may say something about this kind of extensibility; if it does and we violate it we may end up with our own cert revoked.
(In reply to Daniel Veditz [:dveditz] from comment #9)

> More worrying to me is comment 4 because those excluded directories can
> definitely change the behavior of the application, possibly in ways contrary
> to Mozilla's mission or in ways that make the application less secure
> (accidentally or not). It could also provide a friendly host to anonymous
> malware that would otherwise have to get (steal?) a dev cert in order to
> infect the machine, a cert that could then be revoked when discovered. It
> would seem to violate the spirit of what I think Apple is trying to
> accomplish, anyway.

If we didn't exclude those directories, it would not be possible for an organization (school/enterprise/govt) to customize Firefox and redistribute it without getting their own developer certificate for signing.

How easy is it to get a cert for signing for Mac? (I know on Windows, it's quite expensive and time consuming)

As far as the malware issue goes, that's already possible on Windows and has been for years, since only the executable is signed there. Has it been a problem? Have we seen rogue Firefoxes in the wild?

And what Apple seems to be trying to accomplish is to lock people into the App Store and make it more difficult for developers to distribute their own apps.
(In reply to Michael Kaply (mkaply) from comment #10)
> How easy is it to get a cert for signing for Mac? (I know on Windows, it's
> quite expensive and time consuming)

It's $100 and relatively simple. It's free if you have the ability to install your own root certificate on all of the machines you care about (you can use a self-signed cert in that case IIRC).
I've sent an email about this to the EWG mailing list to get some feedback.
Maybe this is a non-issue? From someone on the list:

> I'm not in the developer beta so I haven't personally tested it,
> but my understanding from those who have is that the signing restriction
> only applies to applications that are given the com.apple.quarantine.xattr
> attribute (typically as a result of downloading it from a web browser).
> Other methods of deployment like Munki do not trigger this quarantine,
> and therefore the Gatekeeper program in 10.8 doesn't care about whether
> or not it is signed.
(In reply to comment #11)

> It's free if you have the ability to install your own root
> certificate on all of the machines you care about (you can use a
> self-signed cert in that case IIRC).

But if you sign your app with anything else than a cert signed by
Apple's "Developer ID Certification Authority", your app won't run on
OS X 10.8 in the normal use case.

More about the "normal use case" in a later comment.

An ordinary developer, after paying $100 for a year-long membership in
the "Mac Developer Program", can get a "Developer ID Application
Certificate".  Apps signed with this will run on OS X 10.8.

Getting a "Developer ID Application Certificate" for a whole
organization may be more difficult.  I know Mozilla's done it, but I
don't know the details.
(In reply to comment #13)

> but my understanding from those who have is that the signing
> restriction only applies to applications that are given the
> com.apple.quarantine.xattr attribute (typically as a result of
> downloading it from a web browser).

This is correct (as of the current Mountain Lion developer preview).  And yes, it's a gaping
hole in Apple's Gatekeeper program.  And not the only one.

So (currently) an app can run unsigned, or with the "wrong" kind of
signature, if it's been installed using an installer (like the pkg
installer).  Adobe Reader is an example.

But Gatekeeper was only turned on in Mountain Lion as of DP3 (about a
month ago).  And who knows how Apple may change it before the final
release.
> More about the "normal use case" in a later comment.

Currently you need the "right" kind of signature if you've dragged your app out of a dmg package downloaded using most web browsers (e.g. FF releases or Safari).  These web browsers add a quarantine extended attribute to the dmg file, which gets copied by the OS to the app itself when you drag it out of the dmg file.

The signature only gets checked the first time you double-click on your app.  If it's unsigned (or has the "wrong" kind of signature) you'll see the "unrecognized developer" dialog.  Otherwise you'll see the quarantine dialog ("this app was downloaded from the web ...").
(Following up comment #16)

But your app will run fine even in the above-described "normal use case" if you've changed "Allow applications downloaded from" to "anywhere" (from "Mac App Store and identified developers").

And even if you haven't changed this setting, you can right-click on an unsigned (or wrongly signed) app and choose "open".  Do this once and your app will now run normally every time you double-click on it.  There's no documented way to get back to the original state (though I know an undocumented way).

Finally, any app will run unsigned from the command line, or dynamically loaded (like a plugin).

This is how things are currently.
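
(If you want to see what Gatekeeper would decide for a particular bundle, 10.8's spctl tool can assess it directly; the path below is just an example.)

# Ask the assessment subsystem whether this app would be allowed to launch;
# it reports "accepted" or "rejected" along with the assessment source.
spctl --assess --verbose /Applications/Firefox.app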
Note that only FF releases add quarantine info to a downloaded dmg file.  Nightlies don't, because the bundle id of nightlies isn't "org.mozilla.firefox".  This is an FF bug.  But fixing it had no real urgency before Apple tied signature checking to its quarantine checking infrastructure.

See bug 400227.
You can use "xattr -l" to view extended attributes.
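For example (paths are just illustrative):

# A quarantined download shows a com.apple.quarantine entry.
xattr -l ~/Downloads/Firefox.dmg

# Deleting the attribute (handy for testing) keeps the quarantine/Gatekeeper
# check from firing on first launch.
xattr -d com.apple.quarantine ~/Downloads/Firefox.dmg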
(In reply to Daniel Veditz [:dveditz] from comment #9)
> What are Apple's requirements and why did they impose them? Surely it's to
> get "more security" for their platform. Do we really want to subvert that
> intent?

Subvert's a strong word - in the short term (FF13, 14), my only concern is to make sure Firefox runs on 10.8 without any extra prompting. That's why I suggested making a very loose code signature with as much excluded as possible, to prevent the risk of breakage.

Taking advantage of Apple's new security is a secondary goal that could wait till FF15.
I agree with Alex.

Mountain Lion's current signature checking infrastructure is badly broken, and may very well be in flux.  We shouldn't rely on it for security until it's been finalized, and then only once we know more about it and can decide whether it really makes our apps more secure.

In the meantime it should be treated simply as an obstacle to running on Mountain Lion.  We just need to make sure we're compatible with it.
(In reply to Steven Michaud from comment #21)
> I agree with Alex.
> 
> Mountain Lion's current signature checking infrastructure is badly broken,
> and may very well be in flux.  We shouldn't rely on it for security until
> it's been finalized, and then only once we know more about it and can decide
> whether it really makes our apps more secure.
> 
> In the meantime it should be treated simply as an obstacle to running on
> Mountain Lion.  We just need to make sure we're compatible with it.

We also shouldn't open ourselves to any (zero, none, nada) ease-of-install risk with our consumer builds just to save enterprises a $100 certificate. That would be crazy.
Based on recent comments, and related work in bug 759318, I think this is WONTFIX.
Status: NEW → RESOLVED
Closed: 12 years ago
Resolution: --- → WONTFIX
won't fix - removing sec-review-needed flag
Product: Core → Firefox Build System