Closed Bug 588704 Opened 10 years ago Closed 10 years ago

Lack of cookie encryption security hole

Categories

(Core :: Networking: Cookies, defect)

x86
All
defect
Not set
major

Tracking


RESOLVED DUPLICATE of bug 19184

People

(Reporter: antithesis, Unassigned)


Details

User-Agent:       Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3

It is my understanding that browsers store cookies unencrypted, and I will explain why this is a critical security hole.

I will explain why it may not be widely exploited yet: probably because of lower-hanging fruit, i.e. there are easier login CSRF attacks against poorly programmed servers. But when the server has been programmed correctly, I will explain why the lack of cookie encryption is THE main security hole for a login CSRF attack.

Also, it is critical to understand that the client script and/or server cannot do this encryption. It must be done by the browser, and in a very careful way. The password used to encrypt the cookies must never be kept in storage that survives a power down, and never stored anywhere (not even in application-accessible memory) that is not restricted by hardware to the operating system alone. I assume the operating system would store the user's password on login to the computer, in a volatile memory location with hardware-enforced access rules such that only the operating system can access the password. The operating system would also encrypt and store a cryptographically secure hash of the application (a trust key), so that requests to decrypt data would be restricted to the application that had requested the data be encrypted. Upgrades to an application's executable binary would need to be registered with the operating system by the former application binary, in order to upgrade this trust key.

Isn't the browser's "remember password for this site" implemented with similar concern for the fact that hackers __WILL__ (!!) get inside the computer (no matter how good your firewall is, unless your firewall is to disconnect from the network)?  The only threat then is a rootkit attack on the operating system, or exploitation of an OS or hardware bug.

Apparently there is some overhead to encrypting and decrypting cookies, so this encryption might be done only on request, or some browsers may decide to always encrypt (to protect all scripts).  An optional argument could be added to the setcookie script API call (and maybe one day to the HTTP specification for _setting_ cookies from the server side too, but this isn't required nor a priority).

The following explains that login CSRF attacks are very difficult to defend against with high reliability.

Robust Defenses for Cross-Site Request Forgery
Adam Barth, Collin Jackson, and John C. Mitchell
In Proc. of the 15th ACM Conf. on Computer and Communications Security
(CCS 2008)
http://www.adambarth.com/papers/2008/barth-jackson-mitchell-b.pdf

However, I have implemented a superior solution for my site, except for one main weakness-- the cookie file is vulnerable.  I think others will want to emulate my solution.

Relying on __optional__ features of the client (e.g. Referer for HTTPS, or Origin for HTTP) is never a security solution. That is the antithesis of security, because it creates a false level of trust based on an optional feature of the client which the server has no control over. The hacker can create any client he wants to pound your server.

The most reliable solution to the login CSRF problem is as I have implemented at AsiaDear.com . The server never uses any persistent client data for login (never accesses the cookies), except when /index.php is requested (assuming "remember me" was checked), and it injects the session id as GET input into the login form /rest/user.php submission request. But since /index.php never accepts any GET or POST inputs, no server side effects can occur when index.php is requested. Thus no CSRF is possible by sending requests from other origins to /index.php.

The /rest/* APIs (e.g. user.php, photos.php, msg.php) always require the session id GET/POST input to prove the user is signed in (note: non-geeks prefer the term "sign in" to "log in"), and there is no way to get that session id unless you can access the unencrypted cookie file, hack the browser binary, exploit a browser bug, mount a network intermediary attack, or exploit a bug in index.php due to my fault. If the user is signed in with "remember me" checked, and then you request one of the /rest/* APIs, "you were signed out" is returned. But then if you do something at index.php, you are still signed in. Exactly how it should work.
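The pattern described above (cookie read only by a side-effect-free /index.php, and the session id always passed explicitly to the /rest/* APIs) can be sketched as follows. This is an illustrative Python sketch under assumed names (`index`, `rest_api`, `SESSIONS`) -- not the site's actual PHP code.

```python
# Sketch of the described anti-CSRF pattern (assumed names, not the real code).

SESSIONS = {"abc123": "alice"}  # hypothetical server-side session store


def index(cookies):
    """Side-effect-free page: accepts no GET/POST input, so a cross-origin
    request to it cannot change server state (no CSRF surface). It only
    injects the remembered session id into the login form it returns."""
    sid = cookies.get("session_id", "")
    return f'<form action="/rest/user"><input type="hidden" name="sid" value="{sid}"></form>'


def rest_api(params):
    """Every /rest/* endpoint ignores cookies entirely and requires the
    session id as an explicit request parameter."""
    user = SESSIONS.get(params.get("sid"))
    if user is None:
        return "you were signed out"
    return f"hello, {user}"


assert "abc123" in index({"session_id": "abc123"})   # cookie used only here
assert rest_api({"sid": "abc123"}) == "hello, alice"  # explicit sid required
assert rest_api({}) == "you were signed out"          # no cookie fallback
```

Note the consequence the report relies on: since only the cookie file holds the long-lived session id, stealing that file is the remaining attack.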

Tangential note: only a unique session id is stored in the cookie (and this id does not expire if "remember me" is checked), not the user's password. So even if the unique id is compromised, the hacker will not be able to change the user's password or do any other type of action where the server requires the user's password (the user has to type the password in again) instead of the session id.

Reproducible: Always
I forgot to mention that the operating system's trust (the trust key I proposed) is granted to the application, identified by that key, because the application is the granularity of trust and responsibility involved from the perspective of the user and the server.
Also I forgot to point out that the optional feature of cookie encryption is not the same as relying on the optional Referer header. The former can only be subverted if an unfixed (unpopular) browser is used by the user (assuming popular browsers implement this proposed fix), whereas the latter can be subverted by the hacker using curl (to fake the Referer header) regardless of what browser the user used.
I had written before:

The only threat then is a rootkit attack on the operating system, or exploitation of an OS or hardware bug.[1]

Add a footnote:

[1] Or of course a brute-force or other attack on the encryption password and algorithm, but the hacker doesn't need the cookie to do that; he could just attack the session id or the user password directly. So obviously I am assuming that the cryptography is secure.
Of course it's possible to encrypt the cookies on disk, but the cookies should not contain any really dangerous information in the first place (which makes it a problem for the website, not the browser).

Note that cookies can indeed be read from the disk currently, but they can also be read by 3rd parties by default (and even if you disable that, websites can work around it anyway). Or they can be read from the network or in the proxy server, since they're *sent* unencrypted (unless you use SSL). I repeat: THEY'RE SENT IN PLAIN TEXT. Encrypting cookies.txt (or cookies.sqlite, like it currently is) would not fix that.

If you're really worried about the cookies, then please don't look at the passwords - you'll get a heart attack. Or the history. Or the bookmarks. Or the Flash LSO objects. Or the certificates. Or ... Seriously, this is one of the smallest problems.

A general encryption bug (encrypt the entire profile) is bug 19184
Note that a possible solution would be to only use cookies over SSL (or make the secure flag mandatory). Though you would break 95% of the web ...
It is impossible to remember that a user is logged in without using a cookie (or another form of client-side persistent data store). Can anyone refute that?

And "login CSRF" results from that fact. My solution does as much as the server can do for security against "login CSRF". However, there is still a vulnerability because the cookies are not properly encrypted by the browser and operating system. If we close that vulnerability, then my solution is the optimum one.

Does the browser encrypt the cookie when the connection is SSL? And even after the browser is closed?

Yes, we need the correct form of encryption, something like what I described, so that our data is secure even if the hacker is behind our firewall. I assume the browsers are not doing the encryption correctly now.

I am not expecting this to get fixed quickly. But I am raising the issue now, so that I have given ample warning, and also to see if anyone can refute my assertions.

I hope my explanation was coherent and straightforward enough. I don't consider it necessary to throw this back to academic research. It is clearly characterized by my example.
Clarification:

It is impossible to remember that a user is logged in,

Means:

It is impossible to remember that a user is logged in between browser sessions,
Because of this vulnerability in the browser, and because it is impossible to do it any other way and be secure (see the research article I referenced), I have been forced to display the following disclaimer to the user when they click the "remember me" box on my login form:

===========
WARNING #1: uncheck this, unless you accept that a virus can sign in to
your account if that virus can get inside your computer and read the
browser's cookie file.


If the browser will fix this, I can remove that ugly warning for those browsers that do the right thing.

It is unfortunate that other sites don't display that disclaimer, but perhaps they will once hackers exploit the cookie file for grabbing session ids.  Remember, I am talking about session ids that span the time between when the browser is closed and restarted.  Most websites offer that feature; even this Bugzilla site does!
Has any bug been filed on the current browser "remember the password for this site" encryption being hackable?

If you are correct that it is, then isn't it very wrong to ask the user whether to remember his password, without warning him that your encryption is fundamentally hackable by any virus inside the computer's firewall?
OS: Windows XP → All
(In reply to comment #7)
> It is impossible to remember that a user is login, without using a cookie (or
> other form of client side persistent data store). Can anyone refute that?
> 

Depends on the website - some use variables that are passed as GET data (in the URL) or as POST data. But most websites depend on cookies, which are easier to program. That's the reason why so many websites fail when cookies are disabled.

As for remembering between sessions, you could use a regular login interface on your website, forcing the user to log in every time (the browser might help with username/password, but that's another matter). But that's what every website is trying to avoid of course.

The alternatives are the Storage API (non-standard), DOM Storage (standard) and Flash LSO objects. The first two aren't used very often at this moment (they don't work in older browsers), while the third method doesn't work without Flash (and privacy nuts freak out over LSOs, even though they're the same as a cookie, except bigger and better hidden).

Note that all these methods have the same problems as cookies - there's always the possibility that someone can read the data somewhere (on disk, in memory, on the network), so you should NEVER send any dangerous data (passwords, credit cards, ...). Even a simple user-id might reveal a bit too much. So the website (NOT the browser) should take care of some form of encryption. And make sure that the data is resistant to a replay attack (it should remain valid for a limited time). Maybe store the IP address from which it was sent too, so that you have to log in again if you try it from a different location (pretty weak, and won't block a virus).
(In reply to comment #9)
> Because of this vulnerability in the browser, and because it is impossible to
> do it any other way and be secure (see the research article I referenced), I
> have been forced to display the following disclaimer to the user when they
> click the "remember me" box on my login form:

That's way too much FUD - too many users will be scared when they see the word 'virus'. And it's inaccurate and incomplete anyway. "Stay off the Internet" is the best advice you can give.
(In reply to comment #11)
> (In reply to comment #7)
> > It is impossible to remember that a user is login, without using a cookie (or
> > other form of client side persistent data store). Can anyone refute that?
> > 
> 
> Depends on the website - some use variables that are passed as GET data (in the
> URL) or as POST data.

That does not apply to the case of this bug.  For this bug, I mean only the case where the user has instructed the website to "remember me" even after the browser is closed and restarted.

> But most websites depends on cookies, which are easier to
> program.

This point is off topic for this bug, but I will tell you that for my site, when "remember me" is not chosen, I am able to pass the session id without storing it in a cookie, nor in the HTML of the page. I store it in a script variable. This is, I think, the most secure.

> As for remembering between sessions,

I mean remember after the browser is closed and restarted.

> you could use a regular login interface on
> your website, forcing the user to log in every time (the browser might help
> with username/password, but that's another matter). But that's what every
> website is trying to avoid of course.

As stated above, I do offer that option (they can uncheck the "remember me" box), but this bug report is only concerned with the case where the user checks the "remember me" box, which is a very popular choice offered by most logins on most websites.

> The alternatives are the Storage API (non-standard), DOM:Storage (standard) and
> Flash LSO objects. The first 2 aren't used very often at this moment (doesn't
> work in older browsers), while the second method doesn't work without Flash
> (and privacy nuts freak out over them, even them it's the same as a cookie,
> except bigger and better hidden).

I am aware of all of those.  These are of no use as a solution to my bug report, because afaik all of those are unencrypted, or at least not encrypted in a secure manner as I stated is required in my initial description of this bug report.

> Note that all these methiods have the same problems as cookies - there's always
> the possibility that someone can read the data somewhere (on disk, in memory,
> on the network), so you should NEVER send any dangerous data (passwords, credit
> cards, ...).

Agreed, so those are non-solutions to this bug report.

> So the website
> (NOT the browser) should take care of some form of encryption.

You are missing the KEY point.  The KEY point is that it is impossible for any website in the world to offer a "remember me" feature without using a cookie to store the session id.  Again, I mean remember from the time the browser is closed until it is restarted.

There is no way to offer such a feature without using a cookie.

Thus the browser MUST encrypt the cookie for the session id.  It is impossible for the server to encrypt that cookie.  Let me explain why, because it may not be obvious.

If the server were to encrypt the cookie, then it means the encrypted version of the cookie is now logically equivalent to the unencrypted version of the cookie-- the hacker can simply feed the server the encrypted cookie and the server will decrypt it and allow the hacker access.  Think about that deeply.  Most people won't be able to wrap their mind around that.
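The point can be demonstrated with a toy sketch: however the server "encrypts" the cookie, the resulting blob is itself a bearer credential that an attacker can replay verbatim. (The XOR "cipher" below is purely illustrative and not secure; a real cipher changes nothing about the argument.)

```python
# Why server-side cookie encryption doesn't help: the encrypted blob is
# logically equivalent to the plaintext session id, because the server
# will decrypt and honor whatever blob it is handed.

SECRET = b"server-side-key"  # hypothetical server key


def xor(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for a real cipher (illustration only).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


session_id = b"abc123"
cookie_value = xor(session_id, SECRET)  # what the server stores in the cookie

# The attacker never needs to decrypt the blob; replaying it is enough:
stolen = cookie_value
assert xor(stolen, SECRET) == session_id  # server decrypts and grants access
```

The server cannot distinguish the legitimate browser replaying the blob from a thief replaying it, which is exactly the bug report's argument.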


> And make sure
> that the data is resistant to a replay attack (it should remain valid for a
> limited time).

Limited time is not security.  Sorry. Any time window (no matter how small) for a hacker is a security hole.

> Maybe store the ipaddress from which it was send too, so that
> you have to log in again if you try it form a different location (pretty weak,
> and won't block a virus).

IP address should never be used like this, because other users and even the hacker can be behind the same IP address (due to routers).

Just to give you a little background on me, because I am not sure you realize I know all that you wrote about above. I have created websites with millions of users (e.g. I am the author of CoolPage, which had 700,000 confirmed downloads on download.com by 2001 and thus millions of visitors to the coolpage.com website; I am a co-author of Corel Painter, which sold millions of copies worldwide; I wrote one of the first sophisticated WYSIWYG word processors back in the 1980s, which sold 30,000 copies worldwide on the Atari ST; I wrote downloadfast.com, miningpedia.com, etc. All of these sites have since declined or died because I haven't been working so much the past few years), so it is unlikely you will think of something that I haven't. But I do appreciate your feedback, so I could clarify those misunderstandings above.
(In reply to comment #12)
> (In reply to comment #9)
> > Because of this vulnerability in the browser, and because it is impossible to
> > WARNING: uncheck this, unless you accept that a virus can sign in to
> > your account if that virus can get inside your computer and read the
> > browser's cookie file.
> 
> That's way to much FUD - too many users will be scared when they see the word
> 'virus'.

Agreed, but if I do not warn the user, I am going to be liable, because I know the security hole exists.  I do not believe burying it in a legal contract that the users don't read will protect me from a lawsuit, if I knowingly offer users a feature that has a confirmed security hole.

Luckily, I do not check that "remember me" box by default, so unless the user checks it, they will never see that warning.  And many naive users don't bother to check it.  I think other sites that check that box by default (e.g. person.com) and provide no warning are really taking a risk.  I bet their lawyers are not aware of this bug report.

P.S. I like the terminology that Facebook uses: instead of "remember me", they use "keep me logged in".  That is more coherent in my opinion.

> And it's inaccurate and incomplete anyway. "Stay off the Internet" is
> the best advice you can give.

That on/off choice for the internet is something that I am trying to change.  I don't believe it has to be that way.  If we harden behind the firewall correctly, as I am doing by filing this bug report, then we can instead get a diversity of choices: "secure, well-programmed sites" and "insecure, poorly programmed sites".  Btw, since you brought it up, I have written about this at the IETF standards organization:

http://www.ietf.org/mail-archive/web/hybi/current/msg03359.html
http://www.ietf.org/mail-archive/web/hybi/current/msg03343.html
http://www.ietf.org/mail-archive/web/hybi/current/msg03334.html
http://www.ietf.org/mail-archive/web/hybi/current/msg03312.html
Now my question is: can anyone think of a way to offer login sessions that persist between browser sessions and do not need an encrypted cookie?

If someone can tell me a way, I am willing to pay them a nice prize.
And if you cannot, it means every website on the internet that has a "remember me" checkbox has this security hole!!  Imagine that! I hope some of you vote for this bug, if you agree.
(In reply to comment #13)

> > And make sure
> > that the data is resistant to a replay attack (it should remain valid for a
> > limited time).
> 
> Limited time is not security.  Sorry. Any time window (no matter how small) for
> a hacker is a security hole.

It's a timestamp, not a timeout. And it also has a counter. The whole point is to prevent someone from taking the session data and reusing it later, be it 1 second later or 1 week later. That's quite easy to prevent.

full disclosure: I have worked on security systems for voice switches and routers, including anti-spoofing, deep packet inspection and lawful intercept. I've programmed replay attacks several times, and written defenses against them.
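One hypothetical way to realize the timestamp-and-counter defense described above (a sketch under assumed names, not the commenter's actual implementation): the server MACs each (session id, counter) pair with a secret key and rejects any counter it has already accepted.

```python
# Replay-resistant token sketch: HMAC over (session id, counter), with the
# server refusing any counter not strictly greater than the last one seen.

import hashlib
import hmac

KEY = b"server-secret"   # hypothetical server-side secret
last_counter = {}        # session id -> highest counter accepted so far


def issue(sid: str, counter: int) -> str:
    mac = hmac.new(KEY, f"{sid}:{counter}".encode(), hashlib.sha256).hexdigest()
    return f"{sid}:{counter}:{mac}"


def accept(token: str) -> bool:
    sid, counter, mac = token.rsplit(":", 2)
    expected = hmac.new(KEY, f"{sid}:{counter}".encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(mac, expected):
        return False  # forged or tampered token
    if int(counter) <= last_counter.get(sid, -1):
        return False  # replay: this counter was already used
    last_counter[sid] = int(counter)
    return True


t = issue("abc123", 1)
assert accept(t) is True    # first use succeeds
assert accept(t) is False   # verbatim replay is rejected
```

As the reporter argues in the next comment, this blocks verbatim replay of a one-shot token, but it does not by itself provide a long-lived "remember me" credential: whatever the client stores between sessions is still a secret that must be protected at rest.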
(In reply to comment #17)
> (In reply to comment #13)
> 
> > > And make sure
> > > that the data is resistant to a replay attack (it should remain valid for a
> > > limited time).
> > 
> > Limited time is not security.  Sorry. Any time window (no matter how small) for
> > a hacker is a security hole.
> 
> It's a timestamp, not a timeout. And it also has a counter. The whole point is
> to prevent that someone takes the session-data, and later reuses it again, be
> it 1 second later or 1 week later. That's quite easy to prevent.

That applies to a session key that you want to use only once, i.e. where each request to the server gets a different session key.  That does not apply to the problem of this bug report, wherein we want a session key that lives for a long duration, even past when the browser is closed and run again later.

Even if we make the session key stored in the cookie good for only one future request to the server, it is one time too many if the hacker gets that key from the unencrypted cookie file.

You are apparently knowledgeable about various web security concepts.  I am just trying to get you to focus on the KEY point of this bug report.  I do appreciate you raising many issues, so that I have been able to explain why they don't apply to this bug report.  In that way, the mozilla developers will have a better understanding. Thank you so much for your help. I really appreciate it.

Do you now understand my KEY point and now concur this must be fixed? If not, why?
Private discussions with Jo Hermans illuminated the following:

(In reply to comment #18)
> That applies to a session key that you want to use only once, i.e. where
> request to the server gets a different session key.

What gets stored on the client side changes every time, and that is what matters. The server may (or may not) be keeping the session key constant and encrypting it with a timestamp or counter, but from the perspective of the client this is not logically different from changing the session key each time.

Thus we agree the above does not fix the problem of this bug report, because it can not offer a "remember me" capability that persists between browser sessions.

(In reply to comment #13)
> (In reply to comment #11)
> > But most websites depends on cookies, which are easier to
> > program.
> 
> This point is off topic to this bug, but I will tell you that for my site when
> "remember me" is not chosen, I am able to pass the session id without storing
> it in cookie, nor in the HTML of the page. I store it in a script var. This is
> I think the most secure.
> 
> > As for remembering between sessions,
> 
> I mean remember after the browser is closed and restarted.
> 
> > you could use a regular login interface on
> > your website, forcing the user to log in every time (the browser might help
> > with username/password, but that's another matter). But that's what every
> > website is trying to avoid of course.
> 
> As stated above, I do offer that option (they can uncheck the "remember me"
> box), but this bug report is only concerned with the case where the user checks
> the "remember me" box, which is very popular choice offered by most logins on
> most websites.

Actually, even the case of not using "remember me" (regardless of whether the one-time session key described at the top of this post is used) is going to have the security hole of the session key being unencrypted in memory (or disk cache) while the page is loaded in the browser.  But at least that is probably more difficult for a virus to locate than the cookies file.

So we need the option to apply encryption to not only cookies, but also to script variables.

The encryption must have the qualities that I stated in the opening description of this bug, i.e. user _and_ application signatures, and protection of the encryption password (probably a hash of the user _and_ application signatures) in volatile memory that is hardware access protected so that only the operating system has access.

By 'signature', I mean a user password and an application binary cryptographic hash as I had described in the description of this bug.

> > Maybe store the ipaddress from which it was send too, so that
> > you have to log in again if you try it form a different location (pretty weak,
> > and won't block a virus).
> 
> IP address should never be used like this, because other users and even the
> hacker can be behind the same IP address (due to routers).

Apparently this Bugzilla site is using the IP address method?  Which is a security hole as I had described-- the hacker can masquerade as the user if he is behind the same IP (due to NAT, etc).
(In reply to comment #20)
> Apparently this Bugzilla site is using the IP address method?  Which is a
> security hole as I had described-- the hacker can masquerade as the user if he
> behind the same IP (due to NAT, etc).

If it sees a cookie from a different IP address, it forces you to provide the password. You can switch it off if you keep changing IP addresses. But it's not used as an identifying mark for the user (that wouldn't work behind a proxy).

I repeat: force the user to log in every time - you cannot keep a session open across browser sessions, as you cannot guarantee that the other side was able to prevent someone from stealing the data. Encrypting the cookie file isn't enough; the data can also be read from the network, from memory, etc ...

It's best to always use an SSL session, and then use session cookies (which are never saved). In regular browsers, that would prevent anyone from capturing the data (*). In a hacked browser, or when someone is able to read the memory of the browser itself, that's another matter. Banks would probably install certificates and use one-time passwords (f.i. hardware keys) to provide mutual authentication.

(*) Note that Firefox 4.0 will cache SSL pages to disk, just like Internet Explorer has always done. Firefox 3.6.8 and older don't cache SSL to disk by default (but pages are cached in memory). So make sure that the really important parts (f.i. bank account info) can't be cached either, for example by using the "Cache-Control: private" header.
(In reply to comment #21)
> (In reply to comment #20)
> > Apparently this Bugzilla site is using the IP address method?  Which is a
> > security hole as I had described-- the hacker can masquerade as the user if he
> > behind the same IP (due to NAT, etc).
> 
> If it sees a cookie from a different ip-address, it forces you to provide the
> password.

But to repeat what I wrote in prior comments above, that does not stop a hacker who has the cookie and is behind the same IP address.

Thus it is the same security hole as this bug report, and could be fixed properly by fixing this bug report.

> You can switch it off if you keep changing ipaddresses. But it's not
> used an an identifying mark for the user (wouldn't work behind a proxy).

Of course you can turn it off, because it isn't security, and it causes breakage for some users.

It is just a heuristic that the hacker can get around. If the virus has your cookie, then it can also use your IP address.

> I repeat : force the user to log in every time again

I disagree.  We shouldn't force the user to do something that isn't necessary, assuming we fix this bug report.  We should give the user choice.

> - you can not keep a
> session open over sessions,

There are thousands, if not millions, of websites that keep a user's login session persistent across browser sessions, including this Bugzilla site.  And they all have the security hole of this bug report.

> as you can not guarantee that the other side was
> able to prevent someone from stealing the data.

We cannot even guarantee that the server won't be compromised someday, but that is not a valid excuse to not do correct security.

Browsers attempt to do the best security they can, so that they don't contribute to insecurity.  This bug report affects the entire web in a major way.  I can not fathom why you would argue not to fix this properly.

> Encrypting the cookie-file
> isn't enough, you can also read it from the network, from memory, etc ...

There are infinite possible things that can be attacked on computers in general, but some are more likely than others.

If we properly encrypt the cookie-file, and perhaps even later add a feature to keep certain things encrypted in memory, then we will have closed the big gaping holes.

As for the network issues, let's leave that to the network stack experts.  Their job is to make sure SSL is secure.  It is certainly possible for them to design things such that the HTTP cookie over SSL is never handled unencrypted in unprotected memory.

For now, let's start by closing this gaping hole in the cookie file, which is the responsibility of the browser in unison with the operating system.  The browser can take the first step of doing encryption without the assistance of the operating system.  Then later we can push for the operating system support I described. I have some clear ideas about how to do this.

> It's best to always use a SSL session, and then use session-cookies (which are
> never be saved).

But if the cookies are never saved, then the remember feature of all the websites in the world is not going to work at all.

I have no disagreement about using SSL, but that is irrelevant to this bug report.

> In regular browsers, that would prevent anyone from capturing
> the data (*).

Not if the memory used by the browser to hold the data is unprotected.

> In a hacked browser, or when someone is able to read the memory
> of the browser itself, that's another matter.

Those can be solved by the operating system by having encrypted and/or protected memory and cryptographic hashes of executable binaries, so that corruption of binaries is detected.

But that is not the priority of this bug report.  I want to close the gaping hole of unencrypted cookie-file first.  First priority first.

> Banks would probably install
> certificates and use onetime-passwords (f.i. hardware keys) to provide mutual
> authentication.

Agreed, they do. But that has nothing to do with this bug report.  Banks (even name.com has this option), when they do apply that level of security, are not going to try to persist a login session between browser sessions.

> (*) Note that Firefox 4.0 will cache the SSL pages to disk, just like Internet
> Explorer has always done. Firefox 3.6.8 and older don't cache SSL to disk by
> default (but they're cached in memory). So make sure that the content can't be
> cached too for the parts that are really important (f.i. bank account info),
> for example by using the "Cache-Control: private" header.

Yes, it is a really big hole in their SSL support.  That is why I return all sensitive data with HTTP expiry headers, so it should not be cached on disk (only in memory for the life of that page in the browser session).
I use the following to tell user agents not to cache HTTP data (returned from a GET or POST request):

// Turn off caching for HTTP/1.1 compliant UAs
header( "Cache-Control: no-cache, private, must-revalidate" );
// Date in the past to turn off caching for non-compliant UAs
header( "Expires: Mon, 26 Jul 1997 05:00:00 GMT" );
I want to make it clear that the user's anonymity is not additionally compromised by the best method I proposed for implementing the encryption (see the initial description in this bug report).

Let me re-explain that proposed encryption:

1. User logs on to the operating system with his/her password.
2. The login password is verified against a one-way encrypted copy of the password.
3. The operating system (OS) creates a hash of the user password and stores it in protected memory.
4. OS receives a request from an application to encrypt data.
5. OS encrypts the data with a hash of the application's binary.
6. OS encrypts the encrypted data from #5 with the hash from #3.
7. OS returns the result of #6, along with the application-binary hash used in #5.
8. OS receives a request from an application to decrypt data, together with the hash returned in #7.
9. OS verifies that the hash from #8 matches the hash of the current or a prior version of the application's binary, as saved in step #11.
10. OS reverses step #6, then #5, using the hash from #8.
11. When the application is upgraded, the OS encrypts (using the hash from #3) and saves to disk a tuple of the old and new hashes of the application's binary. The disk location may be publicly readable, but only writeable by the OS.
12. OS receives a request from the user to change the password. This has to be stored in an encrypted tuple, the same as for #11, and handled similarly (details omitted because they are obvious).

* all the hashes must be cryptographically secure
* application = browser (or any other application)
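The double encryption in steps 3 through 10 can be sketched in Python. This is an illustration only, under assumed names (os_encrypt and os_decrypt are hypothetical, not a real OS API), and the SHA-256 counter keystream stands in for a vetted cipher such as AES-GCM:

```python
import hashlib

def _keystream(key: bytes, length: int) -> bytes:
    # Illustrative keystream: hash of key plus a counter. A real OS
    # service would use a vetted cipher (e.g. AES-GCM), not this.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def _xor(data: bytes, key: bytes) -> bytes:
    # XOR the data against the keystream; applying it twice round-trips.
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

def os_encrypt(data: bytes, app_binary: bytes, password_hash: bytes):
    # Steps 5-7: encrypt with the app-binary hash, then the password hash.
    app_hash = hashlib.sha256(app_binary).digest()
    return _xor(_xor(data, app_hash), password_hash), app_hash

def os_decrypt(blob: bytes, app_hash: bytes, password_hash: bytes) -> bytes:
    # Step 10: reverse step #6, then step #5.
    return _xor(_xor(blob, password_hash), app_hash)

# Step 3: only a hash of the login password is kept, in protected memory.
password_hash = hashlib.sha256(b"user-login-password").digest()
blob, app_hash = os_encrypt(b"session=abc123", b"<browser binary>", password_hash)
assert os_decrypt(blob, app_hash, password_hash) == b"session=abc123"
```

The nesting is the point: a different application (different binary hash) or a different user password yields garbage on decryption, which is the property steps 9 and 10 rely on.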

==============
The above is the ideal method of doing the proposed encryption.  There are other ways to do it just as securely. For example, we could instead require the user to log on to the application (the browser). I don't know why the current browser "remember password for this site" feature does not require this, as it is the only way to build a secure feature. We could delay that logon until the user accessed a website that required the encryption functionality.
Jo, I think I understand now why you made the point about using SSL and bank security methods. It seems you are trying to make the point that users who ask for a "remember me" feature should not expect strong security. My response to that is we should give them the best security we can.

It seems to me you are pointing out that cookies cannot be stored between browser sessions if SSL is employed, and thus that there is no justification for encrypting cookies, because if SSL is not employed the network is not secure anyway.

My response is that we need to be able to store encrypted cookies for SSL.

Actually the cookie standard does not say that cookies must be deleted for SSL.  The standard has orthogonal Secure and Discard attributes.

http://www.ietf.org/rfc/rfc2965.txt

=====================
[Page 5]
Discard
      OPTIONAL.  The Discard attribute instructs the user agent to
      discard the cookie unconditionally when the user agent terminates.

[Page 6]
Secure
      OPTIONAL.  The Secure attribute (with no value) directs the user
      agent to use only (unspecified) secure means to contact the origin
      server whenever it sends back this cookie, to protect the
      confidentially and authenticity of the information in the cookie.

      The user agent (possibly with user interaction) MAY determine what
      level of security it considers appropriate for "secure" cookies.
      The Secure attribute should be considered security advice from the
      server to the user agent, indicating that it is in the session's
      interest to protect the cookie contents.  When it sends a "secure"
      cookie back to a server, the user agent SHOULD use no less than
      the same level of security as was used when it received the cookie
      from the server.
The standard for best practices with cookies is self-conflicting:

http://tools.ietf.org/rfc/rfc2964.txt

===================================
[Page 3]

2.2.1.  Leakage of Information to Third Parties

   HTTP State Management MUST NOT be used to leak information about the
   user or the user's browsing habits to other parties besides the user
   or service, without the user's explicit consent.  Such usage is
   prohibited even if the user's name or other externally-assigned
   identifier are not exposed to other parties, because the state
   management mechanism itself provides an identifier which can be used
   to compile information about the user.

[snip]

2.2.2.  Use as an Authentication Mechanism

   It is generally inappropriate to use the HTTP State Management
   protocol as an authentication mechanism.  HTTP State Management is
   not designed with such use in mind, and safeguards for protection of
   authentication credentials are lacking in both the protocol
   specification and in widely deployed HTTP clients and servers.  Most
   HTTP sessions are not encrypted and "cookies" may therefore be
   exposed to passive eavesdroppers.  Furthermore, HTTP clients and
   servers typically store "cookies" in cleartext with little or no
   protection against exposure.  HTTP State Management therefore SHOULD
   NOT be used as an authentication mechanism to protect information
   from being exposed to unauthorized parties, even if the HTTP sessions
   are encrypted.

===================================

The above says MUST NOT leak the cookies, so that means the cookies MUST be encrypted, otherwise they can be leaked to a virus.

Then it says cookies SHOULD NOT be used for logins (authentication), but the stated reason is that cookies are not encrypted.  That is illogical.  Also, MUST is a stronger word than SHOULD in standards.  And the entire web is already using cookies for authentication, so the SHOULD is already moot.  Thus it is clear that we MUST encrypt authentication cookies.
What bug 19184 wants is an encryption of the entire profile (including the cookie-store). How it is done is another matter. Current code uses a master password for the password-database, which might be extended for the other files and databases.

Your comment 24 is about an OS mechanism to encrypt data on behalf of the application, using the login password. That has nothing to do with the browser itself (although, if the API is available, then it can be used).
(In reply to comment #27)
> What bug 19184 wants is an encryption of the entire profile (including the
> cookie-store). How it is done is another matter. Current code uses a master
> password for the password-database, which might be extended for the other files
> and databases.


Assuming the OS encryption I proposed in (comment #24) is not available, then the master-password should be provided by the user each time the browser runs.  It should be verified against a one-way hash by the browser.

> 
> Your (comment #24) is about a OS mechanism to encrypt data om behalf of the
> application, using the login password. That has nothing to do with the browser
> itself (although, if the API is available, then it can be used).

The last paragraph of (comment #24) also explains that the browser could do a similar sort of encryption without any API from the OS, by requiring the user to log in to the browser (once per browser session; the login could be delayed until encryption is needed by a visited website).  Do you need me to detail the steps of how the browser can do it without the OS?
Proposed browser encryption algorithm, if the preferred OS encryption APIs from comment #24 are not available:

1. User runs the browser and navigates to websites.
2. User navigates to a website that requires our encryption or decryption.
3. Browser prompts the user, and the user logs on to the browser with his/her password.
4. The login password is verified against a one-way hash of the password.
5. Browser creates a hash (cryptographically distinct from the hash in #4) of the user's password and stores it in memory. If the OS has any API to protect memory locations, use it.
6. Browser processes a request to encrypt data.
7. Browser encrypts the data with a hash of its own binary.
8. Browser encrypts the result of #7 with the hash from #5.
9. Browser stores the result of #8 and the hash from #7.
10. Browser processes a request to decrypt data, along with the hash from #9.
11. Browser verifies that the hash from #10 matches the hash of the current browser's binary, or a prior one saved in step #13.
12. Browser reverses step #8, then #7, using the hash from #10.
13. When the browser is upgraded, it uses the hash from #5 to encrypt and save to disk a tuple of the old and new hashes of the browser's binary. The disk location may be publicly readable, but only writeable by the browser. If write protection is not possible and the tuple is overwritten/erased, then step #11 cannot be done, so step #12 cannot be done, and the encrypted data must be discarded.
14. Browser processes a request from the user to change the password. The old and new password hashes have to be stored in an encrypted tuple as in #13, and handled similarly (details omitted because they are obvious).

* all the hashes must be cryptographically secure
* application = browser (or any other application)
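Steps 4 and 5 (verifying the master password against a one-way hash, then deriving a distinct in-memory key) can be sketched like this. The function names make_verifier and verify_and_derive are hypothetical, and PBKDF2 stands in for whatever vetted key-derivation function the browser would actually use:

```python
import hashlib
import hmac
import os

def make_verifier(password: str):
    # Step 4's one-way copy: store only salt + hash, never the password.
    salt = os.urandom(16)
    return salt, hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def verify_and_derive(password: str, salt: bytes, stored: bytes) -> bytes:
    # Verify the entered password in constant time, then derive a
    # separate encryption key (cryptographically distinct from the
    # verifier, as step 5 requires) by changing the salt input.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    if not hmac.compare_digest(candidate, stored):
        raise ValueError("wrong master password")
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt + b"enc-key", 100_000)

salt, stored = make_verifier("hunter2")
key = verify_and_derive("hunter2", salt, stored)
assert key != stored  # the encryption key is distinct from the verifier
```

Deriving the verifier and the encryption key from different inputs keeps the hash on disk useless for decryption, which is the point of the "cryptographically distinct" requirement in step 5.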
Jo, I appreciate you sharing a key insight.  I just had an epiphany, stated below...

(In reply to comment #22)
> (In reply to comment #21)
> > (In reply to comment #20)
> > > Apparently this Bugzilla site is using the IP address method?  Which is a
> > > security hole as I had described-- the hacker can masquerade as the user if he
> > > behind the same IP (due to NAT, etc).
> > 
> > If it sees a cookie from a different ip-address, it forces you to provide the
> > password.
> 
> But to repeat what I wrote in prior comments above, that does not stop a hacker
> who has the cookie and is behind the same IP address.
> 
> Thus it is the same security hole as this bug report, and could be fixed
> properly by fixing this bug report.
> 
> > You can switch it off if you keep changing ipaddresses. But it's not
> > used an an identifying mark for the user (wouldn't work behind a proxy).
> 
> Of course you can turn it off, because it isn't security, and it causes
> breakage for some users.
> 
> It is just a heuristic that the hacker can go around. If the virus has your
> cookie, then it is also can use your IP address.

There is no costly breakage if we use the IP address heuristic to fall back to not honoring "remember me".

In other words, we can check the IP address at the server to make sure the unencrypted authentication cookie is not being used by a hacker at another IP address.  This will not stop the virus on your computer, but it will stop the use of the unencrypted authentication cookie at another IP address.

And it doesn't really break anything costly for the user; it just makes them log in again every time their IP address changes. It doesn't even do that if they haven't closed my site's window, assuming we store the authentication key client side in a "non-vulnerable" way (i.e. holding it in script memory as I do, which isn't perfect security but much better than unencrypted in a disk file).  Thus in my method, I can refuse to honor "remember me" if they restart the browser from a different IP address, but I don't have to log them out if their IP address changes while they haven't closed my site's window.

This does not remove the need to fix this bug with proper encryption, but it does provide a heuristic that adds some security, without any significant breakage cost to the user (assuming my method is employed).
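The fallback can be sketched server-side in a few lines (hypothetical names; this is the heuristic only, not real security, since a virus on the victim's machine shares the victim's IP address):

```python
# Hypothetical in-memory session table: session_id -> IP seen at login.
sessions = {}

def login(session_id: str, ip: str) -> None:
    sessions[session_id] = ip

def honor_remember_me(session_id: str, request_ip: str) -> bool:
    # Refuse the persistent cookie when the IP changed; the user simply
    # logs in again. A match proves nothing (NAT, local malware), so
    # this is a cheap fallback, not a substitute for cookie encryption.
    return sessions.get(session_id) == request_ip

login("abc123", "203.0.113.5")
assert honor_remember_me("abc123", "203.0.113.5")       # same IP: honored
assert not honor_remember_me("abc123", "198.51.100.9")  # changed IP: re-login
```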
However, the prior heuristic is still pretty much useless if the primary way a hacker can read your unencrypted cookie file is to get a virus onto your computer, because the virus can surely use your IP address to send a request too.

And the above will require capturing the onunload event so as to store the unencrypted cookie only when page exits.

In short, we really need the proper encryption as this bug report requests.
(In reply to comment #30)
> In other words, we can check the IP address at the server to make sure
> unencrypted authentication cookie is not being used by a hacker at another IP
> address.  This is will not stop the virus on your computer, but it will stop
> the use of the unencrypted authentication cookie at another IP address.

Doesn't work. The server doesn't know the (real) ipaddress of the browser (NAT in your modem, CG-NAT in the network, proxy-servers, firewalls, NAT46 or NAT64, ...). And besides, ipaddresses are still easy to fake.

If you use Bugzilla behind a company firewall or NAT device, then your problem is that the ipaddress doesn't change, and the cookie is accepted, even though it could come from an attacker.
This bug is useless in my opinion. You're looking for a way to protect the cookies of your website, but you're proposing all kinds of stuff which have nothing to do with your website anymore.

- what can be done is to encrypt the cookie-file (or the entire profile), see bug 19184. That hides it a bit better for the most trivial attack : someone or something reading your cookie-file and reusing the cookie. Note that if someone can do that, there are tons of other stuff that can be done.

- but you leave one huge gaping hole in your security : the cookie can just be read from the network. Use SSL at least (and the secure flag for every cookie). No issue for the browser.

- a better protection against replay attacks (I mean, trying to detect old or fake cookies), is something for the webserver to do, not the client. No issue here for the browser.

- the remarks about a different authentication and encryption algorithm are for the OS. No issue for the browser. If an OS offers such an API, then it can be used. 

I find it really strange that you talk about security all the time, but you insist on using cookies as an authentication mechanism?
(In reply to comment #32)
> (In reply to comment #30)
> > In other words, we can check the IP address at the server to make sure
> > unencrypted authentication cookie is not being used by a hacker at another IP
> > address.  This is will not stop the virus on your computer, but it will stop
> > the use of the unencrypted authentication cookie at another IP address.
> 
> Doesn't work. The server doesn't know the (real) ipaddress of the browser (NAT
> in your modem, CG-NAT in the network, proxy-servers, firewalls, NAT46 or NAT64,

That was my point too.  It is a useless heuristic. So why is Bugzilla bothering to do it?

And that is why we need this bug fix.
> ...). And besides, ipaddresses are still easy to fake.
> 
> If you use Bugzilla behind a company firewall or NAT device, then your problem
> is that the ipaddress doesn't change, and the cookie is accepted, even though
> it could come from an attacker.
Jo, you keep repeating the same misunderstandings, and I keep having to repeat the same logic to you over and over again.

Let me try to explain to you again...

(In reply to comment #33)
> This bug is useless in my opinion. You're looking for a way to protect the
> cookies of your website, but you're proposing all kinds of stuff which have
> nothing to do with your website anymore.


I am proposing to encrypt the cookie file.  That will fix the problem identified in this bug report, which affects 1000s if not millions of websites that provide the "remember me", aka "keep me logged in", feature.

There is no other way to provide the "remember me" feature securely; we must encrypt the cookie file.  Period.


> - what can be done is to encrypt the cookie-file (or the entire profile), see
> bug 19184. That hides it a bit better for the most trivial attack : someone of
> something reading your cookie-file and reusing the cookie. Note that if someone
> can do that, there are tons of other stuff that can be done.


No, there are not tons of other things that can be trivially done.

If in fact, we use SSL and encrypt the cookie file, then there is nothing trivial that can be done to hack into the "remember me" feature.


> - but you leave one huge gaping hole in your security : the cookie can just be
> read from the network. Use SSL at at least (and the secure flag for every
> cookie). No issue for the browser.


I will use SSL, once we allow secure cookies to be encrypted and saved between browser sessions.  No need to keep repeating this.  I agree we must use SSL.

 
> - a better protection against replay attacks (I mean, trying to detect old or
> fake cookies), is something for the webserver to do, not the client. No issue
> here for the browser.

That is incorrect.  We already discussed that in the prior comments.  There is no way to offer the secure "remember me" feature, which persists between browser sessions, without using a cookie that is encrypted by the browser.  You've already agreed with that.

This bug report is about the "remember me" feature, because that is a very popular feature at 1000s if not millions of websites already.

If you want to continue to argue that 1000s if not millions of websites need to remove the "remember me" feature, then this bug report is not the proper forum, because this bug report is specifically about the security hole with respect to the "remember me" feature.


> - the remarks about a different authentication and encryption algorithm, are
> for the OS. No issue for the browser. If an OS offer suck an API, then it can
> be used. 

Disagree, that is an issue for the browser.

That OS encryption method was detailed in comment #24.

However, in comment #29, I also offered a way for the browser to do the encryption without the OS.

> I find it really strange that you talk about security all the time, but you
> insist of using cookies as an authentication mechanism ????

Because the "remember me" feature is very popular.  Users demand it.

And I find it very strange that you don't want to help make the "remember me" feature secure?  Could you please explain why you want the "remember me" to remain insecure?  You want a popular feature to be a big security hole?
The prior comment #35 is more than sufficient to justify this bug report and can stand on its own.  I suggest you review comment #13 too.

Orthogonally, I can also give even more reasons we need encryption in the browser.

We need the browser encryption not just for cookies.

We need the encryption to guard:

1) passwords, when the browser offers to "remember your passwords"
2) browser data caches, especially those requested over SSL
3) session ids stored in memory by scripts

And as I explained in comment #20 (and read comment #18 too), #3 means that the lack of encryption is a security hole not just for "remember me", but ALSO FOR ANY SESSION!!!, and replay protection from the server won't close this hole.

So to argue that 1000s if not millions of websites should remove the "remember me" feature ignores that millions more websites use sessions without the "remember me" feature, and replay heuristics will not protect them if the session id sits unencrypted in the browser's memory (even with no data cache).

It seems to me, Jo, that you are arguing that security for sessions is not important.  Your only security solution so far (replay heuristics are still a security hole) is that everyone must have a serial number (i.e. get an SSL certificate for their browser) as banks do.  That would destroy anonymity.  At least the solution I am offering will work for everyone and retain anonymity, as I explained in comment #24.

In short, this bug report applies in a very big way to everything about security and sessions.

I can not fathom why you don't want to implement at least the encryption detailed in comment #29, so you can close the big security hole that applies to items listed above, that affect every SSL website in the universe (regardless of whether they use "remember me" or not).

Your logic is that the only way to close the security hole is to use SSL and give each user an SSL certificate (serial number) on the client side, because that is the only solution you have offered that can protect the session.  The IP address and replay heuristics do not protect the session, as I explained above.

I find it really strange.  Why are you trying to stop proper encryption in the browser?
(In reply to comment #36)
> ...Your only security solution (replay heuristics is still a security
> hole) so far is that everyone must have a serial number (i.e. get an SSL
> certificate for their browser) as banks do.  That would destroy anonymity.  At
> least the solution I am offering will work for everyone and retain anonymity,
> as I explained in comment #24
...
> You logic is that the only way to close the security hole is to use SSL and
> give each user an SSL certificate (serial number) on the client side.  Because
> that is the only solution you have offered that can protect the session.  The
> IP address and replay heuristics do not protect the session, as I explained
> above.

I want to clarify that I am also proposing to use SSL to guard the network transmission of the session.  What I meant above is that the only secure proposal Jo has offered is to require an SSL certificate for each user, in addition to the SSL certificate for the server.  The SSL most of us use every day only requires an SSL certificate for the server.  The problems with requiring an SSL certificate for each user are manifold:

1) It eliminates anonymity, because each user then has a "serial number", which is his/her SSL certificate.

2) It is impractical, because most users do not have SSL certificates, and it would be impractical to issue every user one.

3) SSL user certificates cannot easily follow the user when they use different computers.

4) And actually the SSL user certificate STILL REQUIRES THIS BUG FIX!!! We would still need to encrypt the SSL certificate and require the user to log in to the browser or OS with their password, because otherwise a virus on your computer can use your SSL user certificate.

So actually Jo has offered no solution which is an alternative to fixing this bug report.  Jo, sorry to be so adamant; I appreciate your help in illuminating the issues.  I just hope you understand that this bug report is critical for client-side security.
Let me summarize what I think Jo's points were and my position on each of them, because I think he does have some valid points, but they do not refute the need to fix this bug.

1) Jo made the point that man-in-middle attacks on the network can defeat SSL:

http://en.wikipedia.org/wiki/Man-in-the-middle_attack#Defenses_against_the_attack
http://en.wikipedia.org/wiki/SSL_certificate#Security
http://en.wikipedia.org/wiki/X.509

This is correct. When the SSL certificate is only on the server, then the data coming from the server will not be corrupted/changed by the man-in-middle, but it will be read by the man-in-middle. The data coming from the client can be both read and changed by the man-in-the-middle.

Using SSL certificates on both ends (server and client/user) makes the attack very, very unlikely. Although it would be possible to use self-signed SSL certificates on the client side, these would need to be registered with each website (currently manually because there is no automated protocol).

However, as I explained in the prior post, if the private key for a certificate is not encrypted, then the virus inside the computer can steal and use the certificate (this is true both on server and client), thus masquerading as the owner of the certificate. Thus we still need proper encryption in the browser, as this bug report requests. This is analogous to the need for encryption for the session cookie. The SSL client side certificate just adds protection against man-in-middle attacks, and I would welcome a new automated standard protocol for registering these on signup for new user account at a SSL website.

We still need the proper encryption requested by this bug report, both for the SSL private key and for the session key (whether stored in a cookie for persistence across browser sessions or stored in memory for short-term persistence).

2) Please be clear that man-in-middle attacks compromise any data that can be sent over SSL, not just authorization keys or cookies. So if man-in-middle attack is your concern, then STOP using the internet for private data. So this point is really irrelevant to this bug report. Yes we should make an automated standard protocol for registering self-signed client SSL certificates with SSL websites on account signup, as discussed in #1 above, to defeat man-in-middle attacks. But we still need the proper encryption requested by this bug report. Virus attacks on personal computers are an orthogonal attack vector to man-in-middle. We have to defend against both, one does not entirely defend against the other. Thus we still need the proper encryption requested by this bug report.

3) Jo also stated that we should guard against replay attacks by changing the session id or timestamp/count on each server request. I think I sufficiently explained in prior comment that with regard to security, that is a useless heuristic. If anyone needs me to clarify that further, just ask or present your logic.

4) Jo also mentioned IP address heuristic, but we both agreed in comment #34 that it is useless.
This comment is off-topic from this bug report, because it discusses the utility of SSL certificates on the client side.  However, I want to correct some things I wrote in a prior comment.

(In reply to comment #37)
> [snip]...The SSL most of us use every day
> only requires an SSL certificate for the server.  The problems with requiring
> an SSL certificate for each user are manifold:
> 
> 1) It eliminates anonymity, because each user then has a "serial number", which
> is his/her SSL certificate.

That is not really true if we use self-signed SSL certificates and register them with each website when we sign up for our account (hopefully via some new standard automated protocol, so the user isn't even aware or bothered).

The reason it doesn't cost any anonymity is that it simply associates you with that website account; a self-signed certificate does not prove who you are, only that you are the same person who originally signed up for the account at that website.

> 2) It is impractical, because most users do not have SSL certificates, and it
> would be impractical to issue every user one.

It would not be impractical if browsers automatically generated self-signed certificates for each website new account signup.  We would need some automated protocol standard for this.  The users wouldn't even know or be bothered.

> 3) SSL user certificates can not easily follow the user when the use different
> computers.

This is the big problem I see with self-signed SSL user certificates.  However, this could be solved.

One solution would be to sign up for a new account at your website from your other browser/computer, then log in to the former account at the former browser/computer; the website would then have a feature where you could merge the two accounts.

Another solution would be to have centralized webserver for self-signed certificates, which would do the same as the above (merging accounts), and then tell your browser what the former certificate was.

I prefer the first solution, because I prefer decentralized solutions.

> 4) And actually the SSL user certificate STILL REQUIRES THIS BUG FIX!!! We
> would still need to encrypt the SSL certificate require the user to login to
> the browser or OS with their password, because otherwise a virus on your
> computer can use your SSL user certificate.

This is true that the virus can steal the unencrypted private key.  We still need this bug fix.
The Zlob.Trojan virus steals passwords, login data, and will render secure (even bank) sites insecure, as this bug report predicts:

http://www.ehow.com/how_6829475_delete-zlob-downloader.html
http://srnmicro.com/virusinfo/zlob.htm

Even though anti-virus software claims to detect and remove this virus, new variants keep appearing and new computers keep getting infected (verified this from a friend today who removes viruses as a consultant in the VA area).

Some variants of Zlob will infect the DNS:

http://en.wikipedia.org/w/index.php?title=Zlob_trojan&oldid=379330995

This means that the virus can trick (re-direct) your computer into loading a website from the wrong server, and the browser user won't know, because the website name in the browser will be as expected (and the page may look exactly the same too). However, this DNS cache poisoning man-in-the-middle attack will not work against an SSL (i.e. TLS, https://) website, if the user's browser also has an SSL certificate:

http://en.wikipedia.org/wiki/DNS_cache_poisoning#Prevention_and_mitigation

However, if the virus can read the unencrypted private key of the SSL certificate on the user's computer, then the secure website is compromised.

This again confirms that we need this bug report fixed. It will help protect secure websites against a Zlob type of virus.
The encryption proposed by this bug report (whether it be done exclusively by the browser as described in comment #29 or with the assistance of the OS as described in comment #24) is a partial step towards the ideal of Full disk encryption:

http://en.wikipedia.org/w/index.php?title=Full_disk_encryption&oldid=379237606

Full disk encryption if done correctly is never susceptible to a virus, and is only susceptible to a person who has PHYSICAL access to the computer:

http://en.wikipedia.org/w/index.php?title=Cold_boot_attack&oldid=369329269

As I had mentioned in prior comments, any encryption we do that is not a complete Full disk encryption, could be susceptible to rootkit attacks if the OS is not properly secured against kernel mode level attacks:

http://en.wikipedia.org/w/index.php?title=Rootkit&oldid=379959708#User-mode

It is true that operating systems (OS) will need to improve too, but that is not a valid argument against fixing this bug. The reason is that until we fix this bug, the problem will not be isolated and focused on the OS vulnerabilities. Users won't know to choose a better OS if we don't push the responsibility onto the OS by closing the gaping hole that this bug report identifies.

Some have stated to me in private that the encryption cryptography must be secure and that it can also be attacked over time.  This is true; we must use the best cryptography we know of and continue to release improved versions. That is not a valid argument against using encryption. If it were, we would just stop using security completely. Note that the encryption algorithms I proposed must be properly proofed and analyzed, but it doesn't hurt for us to fix this bug report and get the browser infrastructure rolling. The exact encryption algorithm can change in the future without the user of the browser knowing.
People keep repeating to me in private the same replay heuristic that Jo mentioned in prior comments.

I want to make it clear that if the website has any persistence (meaning the user is not asked to enter his/her password on each page and/or request to the server), then there is some data stored on the user's computer (possibly only in memory, if you don't want persistence that spans browser sessions).  And thus encryption of that data is still needed. This bug report asks to prioritize the encryption of cookies, but that same encryption could also (later?) be used to encrypt other sensitive data on the user's computer, whether on disk and/or in memory.

Also, data can be encrypted or protected sufficiently in memory such that a virus cannot get it. To be 100% effective, that protection needs to derive from the operating system.  My prior comment #41 argued that we need to get the ball rolling and not wait for operating systems to be improved with more security features.
I continue to get feedback from experts. If the execution environment cannot be trusted, then there is no security. That is what I am saying by filing this bug report. The execution environment can be made secure, as I described in comment #41. The only point to add is that the executing process of the virus cannot read or corrupt the memory of the browser process, if the operating system properly implements memory protection:

http://en.wikipedia.org/w/index.php?title=Memory_protection&oldid=371668125

As for guarding against other applications masquerading as the browser and asking for the user's password (in case browser level encryption of comment #29 is being used, instead of the preferred OS encryption of comment #24), I had included a hash of the browser's executable binary to prevent decryption by other applications.

I have endured the feedback of numerous experts, and the conclusion is crystal clear to me by now.  We can secure the execution environment (i.e. harden behind the porous firewall + anti-virus).  Fixing this bug report is a step in that direction.

Fixing this bug report will not be useless even if the execution environment is still not fully secured. On the contrary, it will make theft of important data non-trivial. If the OS implements memory protection, it will make theft very difficult (requiring a rootkit of the OS). Any consumer OS that is routinely rootkitted will probably be forced by consumers to improve.
This should probably be moved to the newsgroup, bugs aren't exactly suited to such a discussion as this is turning out to be.
Which newsgroup?
(In reply to comment #37)
> The problems with requiring an SSL certificate for each user are manifold:
> 
> 1) It eliminates anonymity, because each user then has a "serial number",
> which is his/her SSL certificate.

No more than a cookie.

> 2) It is impractical, because most users do not have SSL certificates, and it
> would be impractical to issue every user one.

Servers can easily issue user certificates valid for that site. The UI is sucky, but if that's the way secure sessions should go then we should fix the user experience.

> 3) SSL user certificates can not easily follow the user when they use
> different computers.

Neither do cookies. In both cases the site reissues the identification when they log in from a new machine.

> 4) And actually the SSL user certificate STILL REQUIRES THIS BUG FIX!!! We
> would still need to encrypt the SSL certificate and require the user to log
> in to the browser or OS with their password, because otherwise a virus on
> your computer can use your SSL user certificate.

SSL certificates are more secure -- they force users to create a Master Password and then are stored encrypted (in Mozilla code, anyway. Don't know about other browsers).

Later on you mention Zlob. Zlob and infections like it hook into running processes and will have no problem getting your encrypted information whether you encrypt just the cookies or use full-disk encryption. If the running process can get it then so can Zlob.

Despite the easy pickings that plain-text cookies appear to represent, infections like Zlob have not bothered with them. They don't want session IDs (which might be IP-address-locked), they want passwords. Even if they couldn't read the cookies, they could erase them and wait for the user to log in again.

As an expression of "work to be done" this issue is covered by bug 19184. The task is quite clear. Arguments for why a particular bug should be prioritized over other bugs belong in the discussion groups.

Which one, you ask? If you insist on restricting the argument to cookies rather than the profile as a whole then try the mozilla.dev.tech.network group otherwise mozilla.dev.platform or mozilla.dev.security perhaps.
Status: UNCONFIRMED → RESOLVED
Closed: 10 years ago
Resolution: --- → DUPLICATE
Duplicate of bug: 19184
(In reply to comment #46)
> (In reply to comment #37)
> > The problems with requiring an SSL certificate for each user are manifold:
> > 
> > 1) It eliminates anonymity, because each user then has a "serial number",
> > which is his/her SSL certificate.
> 
> No more than a cookie.

I agree for self-signed (non-CA) certificates as I had written in comment #39.

> 
> > 2) It is impractical, because most users do not have SSL certificates, and it
> > would be impractical to issue every user one.
> 
> Servers can easily issue user certificates valid for that site. The UI is
> sucky, but if that's the way secure sessions should go then we should fix the
> user experience.

Yes, that would be a wonderful step forward!

I will support it if you do.

So refreshing to read something positive about movement forward.  You made me happier today.

Maybe I can hire someone to work on it for Mozilla? (Email me privately if so.)

> > 3) SSL user certificates can not easily follow the user when they use
> > different computers.
> 
> Neither do cookies. In both cases the site reissues the identification when
> they log in from a new machine.

Agreed as I wrote in comment #39.  That is definitely the way to go.

> > 4) And actually the SSL user certificate STILL REQUIRES THIS BUG FIX!!! We
> > would still need to encrypt the SSL certificate and require the user to log
> > in to the browser or OS with their password, because otherwise a virus on
> > your computer can use your SSL user certificate.
> 
> SSL certificates are more secure -- they force users to create a Master
> Password and then are stored encrypted (in Mozilla code, anyway. Don't know
> about other browsers).

I did not know that! Great! So perhaps we only need to harden the Master Password?

> Later on you mention Zlob. Zlob and infections like it hook into running
> processes and will have no problem getting your encrypted information whether
> you encrypt just the cookies or use full-disk encryption. If the running
> process can get it then so can Zlob.

Agreed, that is an operating-system problem. The OS should be using memory protection, as I explained in comment #43.

> Despite the easy pickings that plain-text cookies appear to represent,
> infections like Zlob have not bothered with them. They don't want session ID's
> (that might be ip-address-locked), they want passwords.

But that may just be because lower-hanging fruit is available. We shouldn't leave this hole open.

> Even if they couldn't
> read the cookies they could erase them and wait for the user to log in again.

Not if the OS implements proper memory protection (and of course defeats key loggers too, which should be an OS function).

> As an expression of "work to be done" this issue is covered by bug 19184. The
> task is quite clear. Arguments for why a particular bug should be prioritized
> over other bugs belong in the discussion groups.
> 
> Which one, you ask? If you insist on restricting the argument to cookies rather
> than the profile as a whole then try the mozilla.dev.tech.network group
> otherwise mozilla.dev.platform or mozilla.dev.security perhaps.

Thank you so much.  God bless.

I will be making a comment at the bottom of the following webpage, to announce this good news!

http://www.marketoracle.co.uk/Article22098.html

840 reads already (may get syndicated too):
http://www.marketoracle.co.uk/UserInfo-Shelby_H_Moore.html
Almost-Better-Than-Nothing security decreases security due to conflation of the true holes with the non-holes, and points us towards a white-listed hell in the future:

http://www.ietf.org/mail-archive/web/http-state/current/msg00938.html
http://www.ietf.org/mail-archive/web/http-state/current/msg00939.html
Excuse me, I forgot to mention that I also made the prior comment #49 because it contains a great idea for encrypting the cookies on the client disk, using a password provided by the server over HTTPS. I hope it can become a recommended Internet-Draft. Please help support that idea.
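To sketch that idea (all names here are hypothetical, and the cipher below is a toy SHA-256 counter-mode construction for illustration only; a real implementation would use a vetted AEAD such as AES-GCM): the server delivers a random key over HTTPS, and the browser uses it to seal the cookie on disk with an authenticity tag.

```python
import hashlib, hmac, secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy counter-mode keystream from SHA-256 -- illustration only.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal_cookie(server_key: bytes, cookie: bytes) -> bytes:
    """Encrypt-then-MAC the cookie before it is written to disk."""
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in
               zip(cookie, _keystream(server_key, nonce, len(cookie))))
    tag = hmac.new(server_key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def open_cookie(server_key: bytes, blob: bytes) -> bytes:
    """Verify the tag, then decrypt; reject a tampered cookie file."""
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(server_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("cookie file tampered with or wrong key")
    return bytes(a ^ b for a, b in
                 zip(ct, _keystream(server_key, nonce, len(ct))))

server_key = secrets.token_bytes(32)  # delivered by the server over HTTPS
blob = seal_cookie(server_key, b"SESSIONID=deadbeef")
assert open_cookie(server_key, blob) == b"SESSIONID=deadbeef"
```

The point of the tag is the erase-and-wait attack mentioned in comment #46: the browser can at least detect that the stored cookie was tampered with, rather than silently accepting a modified one.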
Ah, I forgot that I was supposed to go make comments in the suggested discussion lists. Sorry, I have been so swamped that I totally forgot about the suggestion in comment #44 and comment #46. Will do if I have time.