Closed Bug 377709 Opened 17 years ago Closed 4 years ago

[SoC] Protecting Firefox from Malicious Code in Extensions

Categories

(Firefox :: Security, defect)

RESOLVED WORKSFORME

People

(Reporter: chofmann, Assigned: mterlo1)


Attachments

(4 files, 3 obsolete files)

More to follow.
This first patch exposes the crypto needed to sign extension installation certificates.  This bug should only use sign() and verify(), although encrypt() and decrypt() are provided for completeness.

I think I found the best place to put this stuff.  I'm a new Mozilla developer so suggestions are always welcome.
This patch adds an option to the security preference pane to enable/disable the add-on integrity assurance feature.  It is a simple on/off switch.  Enabling this feature has the effect of creating a certificate and key in the user's databases for use in signing extension code.  Disabling the option will remove the certificate and key.  The option is only available when a master password is set.

I chose to implement the signing keys in the user's databases contrary to the original plan of having the public key kept in a place owned by root.  This is a practical compromise as it is not straightforward for me to handle the authentication and file ownership permissions in a cross-platform way.  I will return to the problem later in the project as time permits.

This patch needs some polish as it does not offer to sign the existing installed extensions when the authorization option is enabled.  Since these extensions remain unsigned, they will be disabled if the browser starts up with authorization enabled (so long as the upcoming extension manager patch is also applied).

The security preference pane was crowded by the addition of the new option (UI elements flowed out of visibility at the bottom of the pane).  I resolved this issue by moving the security warning message options into another tab.  I also grouped the options related to add-ons into their own groupbox.  Finally, the "Show passwords" button was repositioned in the shuffle.  I think it is now in its contextually correct location (next to "Remember passwords" option).  If there was a good reason for it to be below the master password setting, then I believe it can be moved back without causing the pane to overflow.
This is a preliminary patch to the extension manager which adds support for user signing of extensions and integrity assurance.  This does not yet tie into the preference set by the GUI patch, has a few loose ends to be tied, and contains some profiling code which will ultimately be removed.  The patch is provided so that the basic functionality of signing/verifying extensions can be demonstrated.
I have been unable to complete the milestone of closing the gap between verify-time and use-time of extension files.  At the time the proposal was submitted, I hoped that this would be accomplished by hooking into the XPCOM file API.  It turns out that files are opened in a number of ways, most notably via NSPR.  I don't think patching NSPR to interact with a XPCOM feature is the right thing to do in this situation.

I will explore the problem further in the second half of the project.  I plan to find out if the file access points that are key to extensions (and protection against malicious extensions) confine themselves to using nsIFile et al.  For instance, if XUL/XBL overlays and JavaScript files are accessed using XPCOM abstractions, then it would be worthwhile to guard them against tampering between the time they are verified and the time they are actually loaded/used.
The full paper describing the work proposed for this bug is available online and will be presented at DIMVA'07 in a few days.

http://research.mike.tl/view/Research/ExtensibleWebBrowserSecurity

I'm not sure how to update this bug's URL and assignee.  It may be because I just don't have the privileges.  Can someone help me with this?

Mike
(In reply to comment #5)
> I'm not sure how to update this bug's URL and assignee.  It may be because I
> just don't have the privileges.  Can someone help me with this?

I've granted you editbugs+canconfirm, you should now be able to take this bug and change the URL.
Assignee: nobody → mterlo1
Status: ASSIGNED → NEW
This is an update to the RSA crypto patch (minor cleanup).
Attachment #270794 - Attachment is obsolete: true
Attachment #279156 - Flags: review+
I don't think you meant to set review+ on the patch. If you want to request review from somebody, you should select "?" and enter a person's bugmail address.
Thank you, Reed.  You're right, I didn't intend to ask for review yet.  Sorry for the false alarm!


These are the performance benchmarks of the extension authorization and integrity assurance solution.  I'm attaching the full numbers in OpenOffice Spreadsheet format.  A brief synopsis follows.

The numbers in the paper (see bug URL) suggest that the performance of the prototype solution could be improved by implementing the RSA cryptographic routines using NSS.  They showed:

Certificate generation (performed once per extension install/update):
Average time spent generating each certificate: 18.7 seconds
Percent of that time used for RSA crypto: 99.5%

Certificate validation (performed each time the browser starts up, per extension):
Average time spent validating each certificate: 751 ms
Percent of that time used for RSA crypto: 94.1%

The certificate generation performance was way too slow for production use, and required the user to click through stalled-script dialogs for the process to complete.  The speed of the system after adapting it to use NSS is:

Certificate generation:
Average time spent generating each certificate: 27.1 ms (down from 18.7 seconds)
Percent improvement over the JavaScript RSA implementation: 99.9% (689x faster)

Certificate validation
Average time spent validating each certificate: 39.1 ms (down from 751 ms)
Percent improvement over the JavaScript RSA implementation: 94.8% (19.2x faster)

The performance exhibited after incorporating crypto from NSS indicates that the solution is now fast enough that it will not negatively impact end users.  In fact, it would take several dozen installed extensions before the user noticed any degradation of the browser's startup speed.
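For reference, the reported speedups follow directly from the raw timings; a small check (illustrative only, the `speedup` helper is not part of the patch):

```javascript
// Derive the reported speedups from the raw timings:
// generation 18.7 s -> 27.1 ms, validation 751 ms -> 39.1 ms.
function speedup(beforeMs, afterMs) {
  return {
    factor: beforeMs / afterMs,
    percentImprovement: (1 - afterMs / beforeMs) * 100,
  };
}

const gen = speedup(18700, 27.1);
const val = speedup(751, 39.1);

// ~689-690x and 99.9% for generation (exact factor depends on rounding
// of the underlying measurements); ~19.2x and 94.8% for validation.
console.log(gen.factor.toFixed(0), gen.percentImprovement.toFixed(1));
console.log(val.factor.toFixed(0), val.percentImprovement.toFixed(1));
```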
Attachment #279156 - Flags: review+
I don't see how this improves security.  If an attacker has access to your user account, she can modify Firefox (or Firefox preferences) directly.
Hi Jesse,

Please stay tuned for my final report, which will address the issue of tamper-resistant preferences.  The paper has a good description of how we prevent an adversary with only user-account access from attacking the proposed mechanism, using a combination of crypto and file-system access controls.  If an attacker can modify Firefox itself with only user-account access (as is prevalent now, but hopefully not for good), then you have much bigger things to worry about than injection of a malicious extension.

Mike
When operating systems start to isolate programs from each other, they'll also isolate programs' "Library" files (preferences, extensions, caches) from each other.  It would be silly to require every program to constantly verify that its data hasn't been tampered with.
Here is an update to the patch implementing a user control for enabling extension authorization and integrity assurance.  It improves on the earlier patch in the following ways:
   * When the user removes their master password, the setting is no longer only disabled, it is also unchecked.
   * The user is now prompted to sign the existing installed extensions when the option is activated.  There is a security concern in having the new feature controlled in this way which I will address in a later comment.

Screenshots:
https://www.mike.tl/view/Research/EWBSIntegrityInterface2#Enabling_the_feature
Attachment #271453 - Attachment is obsolete: true
This is the production version of the extension authorization and integrity assurance solution patch to Firefox's extension manager.  I've called the feature "Add-on Vault" as it is my hope that it will ultimately protect the whole of a user's profile-based add-ons.  Differences from the previous patch are:
   * Code cleanup and review
   * Support for signing all existing extensions (needed by the GUI patch)
   * UI to assist the user in effectively resolving issues with extensions that fail an integrity check
   * Fixes a problem where, during extension installation, the "Phone Home Service" modifies the extension's install manifest but doesn't immediately flush the changes.  This can result in a certificate being created from the XPI's install manifest rather than the updated one, in turn causing the extension to fail its first integrity check.
   * Provides support for rejecting extension installation during the authorization procedure.  This was in the UI before but the behavior was not implemented.
   * Localisation friendly - moved UI strings into extensions.properties.
   * Benchmark code removed
   
Screenshots:
https://www.mike.tl/view/Research/EWBSIntegrityInterface2#Initial_confirmation_dialog
Attachment #271455 - Attachment is obsolete: true
This report goes into detail about the implementation of the project, issues encountered and solutions explored.  It is broken up into the following sections:

   A. Project contributions
   B. Call for standardization in Firefox install security model
   C. Security considerations on the solution as proposed and how it was implemented
   D. Advice to developers of extensions
   E. Future considerations

-----------------------------------------------------------------------------

A. Project contributions

The following things were achieved during the project.  They may hold value for aspects of the Mozilla platform outside the scope of this initiative.

   1. Developed a framework to support user authorization and integrity of add-ons.
   2. RSA public-key cryptographic functions (encrypt, decrypt, sign, verify) were made available via XPCOM and xpconnect.
   3. Code was written that demonstrates how to generate keys and use them for RSA public-key crypto, all from within Mozilla/Firefox.
   4. Options cleaned up in the security preferences pane.
   5. Underscored the usefulness of greater security in the way Firefox is installed.   
   
-----------------------------------------------------------------------------

B. Call for standardization in Firefox install security model

On many Linux systems, you will find the files that constitute Firefox owned by an account (e.g., root) different from the user accounts that invoke the browser.  This makes Firefox read-only to the user, except for the files in the user's profile.  This is the necessary configuration to provide a high level of security to the parts of the browser that are owned by the end user.

This type of install could be performed on Mac OS X if an installer package is used, rather than having the user simply copy Firefox.app to the /Applications directory.  The installer package would require the user to authenticate as admin (sudo) and the file permissions would be set accordingly.  I'm not sure how much closer Vista brings Windows to being able to support such a configuration for all users.  AFAIK, Windows XP users often run Firefox with administrative privileges.  Perhaps the same is true in Vista, though UAC may impose an additional obstacle to writing to \Program Files.

I'm thinking that another bug should be broken out to discuss the security/usability trade-offs in moving Firefox toward this more secure installation configuration (the Linux install model) on all platforms.  Note that this model has problems of its own, not necessarily exclusive to Linux (see bug 318855).  If Windows users are logged in as Administrator, maybe there is a way to have Firefox configured to run with reduced privileges.

-----------------------------------------------------------------------------

C. Security considerations on the solution as proposed and how it was implemented

The roots of security in the authorization and integrity solution, as proposed, can be traced down to three things:
   1. The user is the only one who can decrypt the signing (private) key.
   2. The validating (public) key is not writable by the user.
   3. The security features are always on.

I ensured #1 by storing the signing key in the user's key3.db, encrypted with a master password.  This is a slight inconvenience to the user, as they are required to use a master password if they want the added protection on their extensions.  I rationalized this decision by asserting that security-conscious users will also want to protect their keys and passwords.  Without a master password on key3.db, malware doesn't have to hook into the browser to steal your sensitive data -- it can break into that file on its own.  Alternatively, the signing key could be stored in another key database, avoiding the dependency on the master-password feature and the further imposition on the user (but still leaving their keys vulnerable).

Problem #2 was a tough one.  Currently the validating key is stored writable by the user, in cert8.db.  Ideally it would be stored in a database owned by a different user (e.g., root) and not writable by others.  This would prevent an adversary from replacing the key with one that validates their malicious extensions.  For people who run a read-only installation of Firefox, we could store the validating key in a database that is part of that read-only install, having the user sudo when necessary to write the key.  An alternative is to verify with the signing key instead, comparing two ciphertexts of a known plaintext (the certificate hash); this imposes the burden of having the user authenticate every time extensions are loaded.

I think that on secure installations we should leverage the secure base to further protect the user against unauthorized changes to their add-ons.  As the benefit of securely installing Firefox grows, we will find greater motivation to work towards more secure installations on all platforms.

Requiring a master password for add-on authorization to work means that the system becomes optional.  If the user decides not to use a master password, we can't provide the enhanced protection.  Having optional security means that an adversary can attempt to disable it: a preference-panel setting can be altered without much trouble, and a master password can be circumvented by deleting key3.db.  The latter is more destructive, but if we store the setting in an encrypted file, then again we are looking at requiring authentication each time we need to load extensions.  The best solution to this problem is a tamper-proof setting that controls the feature.  Having the validating key stored such that the user can't write it would be the simplest way to indicate that the security feature should be enabled.

-----------------------------------------------------------------------------

D. Advice to developers of extensions

It is worth noting that the extension integrity checks can create an environment that is too strict for some current extensions to operate in.  If an extension makes changes to its own file space, it will trigger an integrity violation, and sometimes these changes are made inadvertently:
   * ChatZilla inadvertently triggered an exception as its install manifest (install.rdf) was updated by the Firefox extension manager after phoning home to see if the version being installed was compatible.  This problem has been addressed in the extension manager.
   * Foxytunes triggered an exception by renaming its platform-specific (native) components to remove the platform suffix (.linux, .mac, .macintel, etc.).  Google Toolbar solves this problem by having a separate XPI for each platform.  There may be better ways of implementing a cross-platform extension that utilizes native components.
   * Forecastfox triggered an exception as it included an .autoreg file.  It is suggested that extensions register components by other means, as the .autoreg file is automatically removed by the component registrar after registration.  Perhaps the solution could be addressed in the extension manager by ignoring all .autoreg files, provided they are zero bytes.

-----------------------------------------------------------------------------

E. Future considerations

There were a couple of items that I would have liked to address but unfortunately ran out of time for.  First, it would be nice if we could close the time gap between when an extension file is checked for integrity and when that same file is actually used.  If an observer detects our scan, it can wait for the file to close, then quickly tamper with the file before the extension is loaded.  This may need to be broken out into a separate bug, calling for a secure add-on loading procedure, after the authorization and integrity-assurance code becomes an option for end users.  Solving this problem is made difficult by the fact that NSPR (which doesn't know about extensions) can be used to access files after the extension has loaded; that is, if we force integrity checks on nsIFiles, our security can be circumvented by going through NSPR.  Looking into running extensions with a reduced set of chrome privileges is probably a good solution to more than just this problem.

Secondly, it would be ideal to protect all add-ons that live in a user's profile.  It should not require too much effort to apply the solution for extensions to other types of add-ons as the system matures.  Extensions are a critical entry vector that should be addressed first, however.  Also, it is important to apply the same type of protection to components registered in the user's profile.  This will likely require changes to the component registrar, having it require user authorization before allowing components to be loaded and ensuring their integrity.

Status: NEW → ASSIGNED
Status: ASSIGNED → RESOLVED
Closed: 4 years ago
Resolution: --- → WORKSFORME