Closed Bug 873349 Opened 11 years ago Closed 10 years ago

Add a whitelist for mixed content blocking

Categories

(Firefox :: Security, enhancement)

23 Branch
enhancement
Not set
normal

Tracking

VERIFIED WONTFIX

People

(Reporter: raysatiro, Unassigned)

References

(Blocks 1 open bug)

Details

Attachments

(2 files)

User Agent: Mozilla/5.0 (Windows NT 6.1; rv:17.0) Gecko/20100101 Firefox/17.0 (Beta/Release)
Build ID: 20130505034501

Steps to reproduce:

I'm loading pages with mixed content.

gecko.mstone = 23.0a2
gecko.buildID = 20130514004016



Actual results:

The mixed content is blocked, even if on past occasions I had unblocked it. That may be what is intended:
tvyas wrote on May 16th, 2013 at 6:13 am:

    We don’t have a whitelist mechanism. Mixed Content is blocked on a per-page load basis. “Disabling Protection” and allowing the content is hence also per-page load.



Expected results:

Feature request: The mixed content should remain unblocked, or there should be a whitelist or something similar. It's a nuisance to have to unblock the content on each load. Thanks
Component: Untriaged → Security
OS: Windows 7 → All
Hardware: x86 → All
Maybe it's more of a "remember my choice" feature than a "whitelist" type of feature.
Ray can you please attach a couple of pages with mixed content that provide the issue mentioned in Comment 0?
Here are some websites with mixed content in them:
https://www.nytimes.com/
https://okfn.org/
https://yibis.com/
https://proximize.me/
I have given my vote for this, here's my particular use-case:

I am developing a web application[0] that has a UI and management features that I want to be under SSL, while providing an iframe to arbitrary websites (mostly insecure, with few exceptions). In this case, Active Mixed Content is an expected side-effect, and the Blocker's inability to remember the user's choice and/or whitelist a website is a serious impediment.

[0]: https://jeb.io
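To make that scenario concrete, here is a minimal sketch (example.com and the element setup are illustrative assumptions, not part of the actual application): a page served over https:// embedding a plain-http site in an iframe, which Firefox 23+ treats as active mixed content and blocks until protection is disabled for that page load.

// Sketch only: on an https:// page, an http:// iframe is mixed active content.
const frame = document.createElement("iframe");
frame.src = "http://example.com/";       // arbitrary, often HTTP-only, target site
frame.width = "800";
frame.height = "600";
document.body.appendChild(frame);        // blocked by the Mixed Content Blocker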
Please consider this.
I use HTTPS-Everywhere and many sites are still mixed content (that's why they're not the default). But the alternative is to disable the https rules, and that's worse than what we started with...
Severity: normal → enhancement
Status: UNCONFIRMED → NEW
Ever confirmed: true
(In reply to Ivo Anjo from comment #5)
> Please consider this.
> I use HTTPS-Everywhere and many sites are still mixed content (that's why
> they're not the default). But the alternative is to disable the https rules,
> and that's worse than what we started with...

HTTPS Everywhere is working on a fix to make the add-on compatible with Firefox 23:
https://trac.torproject.org/projects/tor/ticket/9196
I have mixed feelings about implementing a whitelist for Mixed Content Blocker.

When a user decides to "Disable Protection", they are making a decision to allow an HTTP request on an HTTPS page that can expose sensitive data and alter the behavior of the web page.  To elaborate, the request (along with the user's cookies) is sent over HTTP, so an eavesdropper can retrieve the cookies and may now know what site the user is visiting.  Moreover, a man-in-the-middle can change the server response and inject malicious content into the user's browser.  By blocking Mixed Active Content, we are trying to protect users from man-in-the-middle attackers and eavesdroppers.

We are not trying to protect users from the website itself, and hence the safety or trustworthiness the user feels in the website is not relevant to whether or not they should disable protection.  The threat we are trying to protect against is a network attack.  It is not an attack against the webservers of a specific website.  The sensitivity of data that the website has about the user is relevant though (since stolen cookies can compromise this data).  In conclusion, when deciding to disable protection, an advanced user should consider:
1) How confident they are in the network connection.  Are they at home or using a vpn?  Or are they at an internet cafe?
2) What data the website has about the user?  If the user is logged out and browsing a news website, perhaps the website does not have any sensitive data about the user.
3) Are they viewing a website that they don't want others to know they are viewing?

The answers to questions 2 and 3 probably don't change from one page load on a website to another.  But the answer to question 1 does.  I may not mind "disabling protection" on a news website that loads a Mixed Content Java object when I'm at home.  But if I'm at an internet cafe, the Mixed Content Java object could be turned into malware by a MITM attacker.  Hence adding that news site to a whitelist and allowing Mixed Content permanently on that site is risky.  Unfortunately, Firefox doesn't have any features that detect your network and make decisions based on trusted vs. un-trusted networks (although that would be cool).

On the other hand, if a user is sufficiently annoyed by the Mixed Content Blocker on their favorite news site, they may just turn off the blocker altogether.  Overall, is this better than allowing them to whitelist one specific site?  Probably not.  It would be better to allow the user an option to avoid daily annoyance on their favorite site without turning the feature off entirely.

The reality of the situation is that the majority of users won't know to ask the above 3 questions when "disabling protection".  We can try and educate them (which we do with learn more links, etc.), but at the end of the day this is just too complicated for an average user to understand.

The next question to consider is whether a whitelist is really necessary.  Is it over-engineering the feature?  Is it worth the UX/development time?  Chrome and IE already have a Mixed Content Blocker (without a whitelist), so many websites have already removed their Mixed Content issues.  When Firefox introduces its blocker to stable users in Firefox 23, maybe the long tail of remaining websites (which perhaps have a mostly Firefox user base) will also fix themselves, making this whitelist less of a necessity.

If we do decide to implement a whitelist, the next step would be to sit down and figure out what the UX would be like (via the doorhanger, in permissions manager?).
Attached image Capture.PNG
Hey Tanvi thanks for sharing your thoughts.

I like the mixed content block feature and I think it's a great step forward. For a long time now I've used a browser profile for HTTPS only. It has HTTP and FTP set to bogus proxy addresses and plugins are blocked. I use it to mitigate risk when I'm accessing my finances.

I have a general profile that I use for daily internet browsing. When I'm accessing a website with mixed content through this profile I'd prefer if the default behavior could be 'always ask' or 'always notify' or something. Right now there's just a white icon and I don't recall seeing any notification. If 'always ask' is not the planned default behavior it would be helpful if there could be a preference to change the default behavior.

If a website's permission for mixed content is set to 'always ask' then for those websites where I don't care about the mixed content controls I could choose to 'allow always' or something. This could be exposed similar to the way popups are handled, where you get a bar at the top of the screen and you can make your choice.

I used the DOM inspector to edit chrome ui page info and added 'Mixed Content' to Permissions to help visualize a possibility for what I'm suggesting. A screenshot is attached.

Thanks
(In reply to Tanvi Vyas [:tanvi] from comment #7)
> If we do decide to implement a whitelist, the next step would be to sit down
> and figure out what the UX would be like (via the doorhanger, in permissions
> manager?).

If I may suggest, I think the plugin whitelist UX (used for Java, etc.) could be re-used, as you pretty much want the same things: warn the user and allow him/her to enable this behavior once, or always, for a certain site.
I am strongly against implementing any whitelist for mixed content, unless and until we've tried doing other things to improve the UX. In particular, we can make the user's choice to allow mixed content more "sticky" by making it persist across navigations on the same site, and we could make it easier to allow the mixed content within the doorhanger by giving the doorhanger two buttons ("Keep Blocking" and "Disable protection for this page") or similar. Or, we could convert the doorhanger to an infobar (perhaps for just a few releases, like Chrome did). We should try all of these things before implementing the whitelist. Even then, I'd try to avoid the whitelist.

So, I vote for RESOLVED WONTFIX/INVALID.
(In reply to Brian Smith (:briansmith), was bsmith@mozilla.com (:bsmith) from comment #10)
> we can make the user's choice to allow mixed content more "sticky" by making
> it persist across navigations on the same site

How is that different from a whitelist? That it has a TTL? So you're against a permanent whitelist, but not a temporary one?

> and we could make it easier
> to allow the mixed content within the doorhanger by giving the doorhanger
> two buttons ("Keep Blocking" and "Disable protection for this page") or
> similar.

Keeping in mind that this only works if there's a (temporary or permanent) whitelist. And I'm not sure it would even be a good thing then: the change from a two-click action to a one-click one reduces the bother, but with a whitelist there's no real need to make it easier to disable the behaviour.


I'm not sure what the Mozilla policy is like for hidden options; I didn't put much thought into this, either, but: what about having an about:config-only whitelist, or even just a hook in the process, and leaving addons the (dis)pleasure of managing the UX?
@Tanvi I'm a firm believer in 'repetition is the father of learning'. However, asking users about blocking "mixed content" every single time they visit a website, instead of only once, is not going to make the safety issue sink in for the user. And, like Unix (https://en.wikipedia.org/wiki/Unix_philosophy#Quotes), it's not really Firefox's place to stop people from doing things that may cause harm, as that would also stop people from doing clever things. That doesn't mean that Firefox can't warn people about stuff that may cause harm, but people should be able to ignore such warnings in the same way they can ignore an attack (virus-infected) site warning screen.
(In reply to Ray Satiro from comment #8)
> I have a general profile that I use for daily internet browsing. When I'm
> accessing a website with mixed content through this profile I'd prefer if
> the default behavior could be 'always ask' or 'always notify' or something.
> Right now there's just a white icon and I don't recall seeing any
> notification. If 'always ask' is not the planned default behavior it would
> be helpful if there could be a preference to change the default behavior.
> 

Hi Ray,

Right now, when Mixed Active Content exists on a page, the user will see a shield next to the location bar: https://people.mozilla.com/~tvyas/FigureA.jpg.  When they click on it, they will see the doorhanger pop open: https://people.mozilla.com/~tvyas/FigureB.jpg

By "always ask" do you mean that the doorhanger would pop open automatically?  Or do you mean something else?
I was using alert() today and I had an idea about UX for this: what if, instead of a "don't ask again, allow this site" option being provided the first time, it only appeared after someone has selected the "show mixed/unsafe content" option two or three times? That would provide the learning-by-repetition, and you'd only whitelist sites you access often.

After further reflection, I've also changed my mind about temporary vs. permanent whitelisting: while some sites do intentionally need to have HTTP content mixed in with HTTPS content, most don't. Sites that have their assets served over HTTP while their main page is HTTPS should be reminded (by a user whose browser periodically breaks the site) that this is not good... so they can fix it. The best way to get rid of the doorhanger is for the website to change. Anything else is inherently unsafe.
(In reply to Tanvi Vyas [:tanvi] from comment #13)
> By "always ask" do you mean that the doorhanger would pop open
> automatically?  Or do you mean something else?

I'd prefer some type of notification other than the white icon in the awesome bar. I've attached a screenshot to help visualize what I described. Thanks
Ah okay.  We have a bug open to make the Mixed Content Blocker more discoverable (https://bugzilla.mozilla.org/show_bug.cgi?id=834828) and I'm adding your comment there.
Please, please make this a feature. Yes, I understand the argument against it, but here's the reality of the situation. I'm going to disable this feature in about:config because it's a horrid annoyance to enable a site I visit daily every single time. Even if you make it persist across a session, that's still extra wasted clicks every single day for no good reason. And I'm sure I'm not alone in doing this.

The result will be that users won't use the feature at all, and it will be completely pointless. So what's really more secure?

I really don't care if someone sniffs my cookies and discovers I'm browsing the NYtimes website. And as for injection, add-ons like NoScript already take care of that sort of thing. If I'm browsing on public Wifi I'm using a VPN anyway, and if I'm at home, denying this feature is pointless.

Refusing to implement such a trivial thing because you're attempting to hold the hands of a bunch of users who don't even understand what this feature does is simply idiotic.
About this whitelist, I think that an "Add to Whitelist" button in the doorhanger is a good idea. When clicked, it adds an entry to the whitelist in the GUI. I think a whitelist would be a very good feature; by default it's empty, and it uses its own file.
I have a CMS supporting hundreds of users in pushing content to various platforms, and v23 has been giving them issues because of the mixed content being blocked. I have directed them to select "disable protection on this page", but it would be great if the whitelist were implemented so they could whitelist the CMS link and not have to worry about enabling scripts or content on the page every time they load it.
I second ca_aok. Please make this a feature. I love the idea of content blocking, and I would appreciate the extra security it provides in the vast majority of cases. However, I must use a couple of sites on a daily basis for work that have broken because of active content blocking. Disabling it on every page load is such a severe disruption to my workflow that I have disabled the feature in about:config. I was delighted to discover this was a possibility, and it is the only reason that I am still using Firefox. If I had not taken the time to look up this issue--and if I weren't the kind of user comfortable with making changes in about:config--my only reasonable solution would have been to start using a different browser. (The issue with the sites I use is one of treating frames as active rather than passive content; Chrome allows them at the moment.) Yes, the long tail of websites will probably catch up eventually, but until then I can't reasonably use the content blocker without a whitelist (or, I suppose, a change in the treatment of frames as active content).
(In reply to Mihai Morar, QA (:MihaiMorar) [ PTO 08/09 - 08/26] from comment #2)
> Ray can you please attach a couple of pages with mixed content that provide
> the issue mentioned in Comment 0?

Another place is the Cover Art Archive uploader on MusicBrainz.org. It's using an iframe for the uploading to CAA. Right now I have two options (other than having to reselect everything on the cover art upload form and re-initiate the upload after having told Firefox the mixed content is ok for the page - which isn't an option in the long run ("security warning fatigue")): to use musicbrainz.org via HTTP instead of HTTPS, or disable FF's Mixed Content Blocking.

Neither of those two options leave me more secure than adding musicbrainz.org to a whitelist and allowing mixed content on this one page. It also seems to be blocking the links to "http://127.0.0.1:8000/..." which is how MusicBrainz "communicates" with the Picard (and probably other) tagger(s) running locally.

So for now, I'll be disabling mixed content blocking until a whitelist is implemented.
(In reply to Mihai Morar, QA (:MihaiMorar) [ PTO 08/09 - 08/26] from comment #2)
> Ray can you please attach a couple of pages with mixed content that provide
> the issue mentioned in Comment 0?

https://www.newsblur.com 

It's an RSS reader which has the feature to load the original page directly from the site. When that site is HTTP the mixed content blocking prevents it loading.

At least with NewsBlur selecting "disable" keeps things working until you close the tab as there are no full page (re)loads. However this is still sufficiently annoying that without a whitelist I will be switching mixed content blocking off.
(In reply to Martin Barry from comment #23)
> https://www.newsblur.com 
> 
> It's an RSS reader which has the feature to load the original page directly
> from the site. When that site is HTTP the mixed content blocking prevents it
> loading.
> 
> At least with NewsBlur selecting "disable" keeps things working until you
> close the tab as there are no full page (re)loads. However this is still
> sufficiently annoying that without a whitelist I will be switching mixed
> content blocking off.

For newsblur specifically, we have a bug open and the developer is engaged: https://bugzilla.mozilla.org/show_bug.cgi?id=879075
The use case that the whitelist is desperately needed for is not where the users trust the site with their private information, it is where non-private information is being blocked. 

For example, the navigation popups gadget on Wikimedia wikis appears to transfer http content from Wikipedia, even when used in places like Wiktionary. This is being blocked even though no private information is being, or could be, transferred.
(In reply to Frederik 'Freso' S. Olesen from comment #22)
> (In reply to Mihai Morar, QA (:MihaiMorar) [ PTO 08/09 - 08/26] from comment
> #2)
> > Ray can you please attach a couple of pages with mixed content that provide
> > the issue mentioned in Comment 0?
> 
(...)
> 
> Neither of those two options leave me more secure than adding
> musicbrainz.org to a whitelist and allowing mixed content on this one page.
> It also seems to be blocking the links to "http://127.0.0.1:8000/..." which
> is how MusicBrainz "communicates" with the Picard (and probably other)
> tagger(s) running locally.
> 
> So for now, I'll be disabling mixed content blocking until a whitelist is
> implemented.

Speaking with http://localhost through AJAX requests is a popular way of doing inter-process communication between a website and a small daemon running on someone's computer (driving machine tools, card readers, among others).

Could we (at least) have a small "allow communications between Firefox and my computer" checkbox in the preferences?

If not we will have to disable the mixed_content blocks for all websites, which does not sound like an optimal solution.
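As a minimal sketch of that localhost pattern (the /status path and the response handling are assumptions for illustration): an AJAX-style request from a page served over https:// to a daemon on http://127.0.0.1:8000 is treated as active mixed content, so the blocker stops it before it ever reaches the daemon.

// Sketch only: issued from an https:// page, this plain-http request is blocked.
async function queryLocalDaemon(): Promise<string> {
  const response = await fetch("http://127.0.0.1:8000/status"); // hypothetical endpoint
  return response.text();
}

queryLocalDaemon()
  .then(status => console.log("daemon status:", status))
  .catch(err => console.error("blocked or failed:", err));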
Yeah, I already disabled this in about:config.  It'd be great if I didn't have to do that and the various web sites I use would make all their content https, but that's not the reality.  Even big companies like Yahoo have their fantasy sports pages with mixed content, and this feature is way too much hassle.

I think a good solution would be to allow whitelisting (session or permanent) and make the grey shield "mixed content" icon into a yellow warning triangle on any site you visit that is whitelisted but displaying mixed content.  You should also be able to click on the warning icon and de-whitelist (session or permanent).

Then, if you rarely go on public networks, you could have pages whitelisted, and easily preserve your security when you're in public.  As it is, with the filter turned off (like it now is on my machine, and certainly many others'), I don't even get the warning when I'm seeing mixed content.
(In reply to Chris McKenna from comment #25)
> The use case that the whitelist is desperately needed for is not where the
> users trust the site with their private information, it is where non-private
> information is being blocked. 

Exactly!  There seem to be sites where some of the pages are moving to https, but not all of the content is encrypted (i.e. "mixed content", some of it "active", is present).  

But just because there is some "mixed content" doesn't always mean that security is a prime concern on that specific site (not all https sites are banks or hold other equally sensitive info).  With some sites, functionality may be more of a concern than maximum "security" on that site.

For example, what about SPDY-enabled sites?  If you want to get the performance advantages of SPDY, the top-level site will be https (because SPDY isn't implemented in Firefox for plain http sites).  Which means that a simple SPDY site may very well have mixed content (even if/when the site is not "secure", per se).

And FWIW that is exactly the "use case" I tripped over within hours of upgrading to FF-23.  I was trying to access "https://webcache.googleusercontent.com" URLs, i.e. pages from the Google web cache, and noticed that a lot of those "cached pages" were less functional than before.  After digging into the matter, I discovered the "problem" was the new "mixed content blocker" that wasn't previously enabled in my earlier versions of Firefox.  And, as a result, it was the same day I installed FF-23 that I also disabled this feature in about:config.

Now I really like what this blocking is trying to do, and would happily use it if it worked for me (I already routinely use NoScript, for example).  However, the current implementation, without a "whitelist", means that if a user frequently visits even one site that has this "mixed content", their only (current) options are:

1) Put up with the feature, with a lot of "click fatigue"
or
2) Disable this security feature globally for the entire web (as I reluctantly did).  

There really is a need for a third option, where the security feature remains on generally, but can be turned off for specific web site domains.  Being able to make specific "whitelist" choices (for just the sites that need them) has to be much more secure than turning off the feature totally (which is the moral equivalent of whitelisting the entire web)!
I will add my two cents and say that this feature is necessary if I am going to continue using Firefox. 

Security aside - my company uses certain internal tools that require mixed content. I'd rather they use a more elegant solution, but these sites are designed to transfer information. 

My only solutions right now are to use another browser; turn off the blocking entirely; or click an extra step multiple times every day. 

What I love about Firefox is that it gives the user control.
(In reply to Florian Le Goff from comment #26)
> Speaking with http://localhost through AJAX requests is a popular way of
> doing inter-process communication between a website and a small
> daemon running on someone's computer (driving machine tools, card
> readers, among others).

Really? That sounds horribly insecure to expose those things to the random internet (the user's browser is the bridge). You're only safe as long as security by obscurity holds up.

> If not we will have to disable the mixed_content blocks for all
> websites, which does not sound like an optimal solution.

You're right, that's not great either. I guess we're stuck deciding which bad solution is the least bad.
(In reply to Daniel Veditz [:dveditz] from comment #30)
> (In reply to Florian Le Goff from comment #26)
> > Speaking with http://localhost through AJAX requests is a popular way of
> > doing inter-process communication between a website and a small
> > daemon running on someone's computer (driving machine tools, card
> > readers, among others).
> 
> Really? That sounds horribly insecure to expose those things to the random
> internet (the user's browser is the bridge). You're only safe as long as
> security by obscurity holds up.

Well, if you have connected objects, they will be exposed in one way or another to the random internet. The browser is only a bridge, but a quite convenient bridge, a bridge only launched when the user wants it on. You have to add appropriate authentication of the requests & security on the local daemon (i.e., for us: signing requests server-side with a private key and checking the signature on the daemon).
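As a rough sketch of that request-signing idea (Node.js crypto with Ed25519 keys; the payload shape, key handling, and names are illustrative assumptions, not the commenter's actual implementation): the server signs each command with its private key, and the local daemon verifies the signature with the matching public key before acting on anything relayed through the browser.

import { generateKeyPairSync, sign, verify } from "crypto";

// Keys are generated inline for brevity; in practice the daemon would only be
// provisioned with the server's public key.
const { privateKey, publicKey } = generateKeyPairSync("ed25519");

// Server side: sign the command before it is relayed through the browser.
const payload = Buffer.from(JSON.stringify({ action: "read-card", ts: Date.now() }));
const signature = sign(null, payload, privateKey);

// Daemon side: only act on the command if the signature verifies.
const accepted = verify(null, payload, publicKey, signature);
console.log(accepted ? "command accepted" : "command rejected");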
Looks like I'm going to have to open up about:config and disable security.mixed_content.block_active_content, hopefully I remember to turn it back on when this gets fixed.
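For reference, the pref named above can be flipped from about:config or via a user.js entry; a minimal sketch (setting it back to true, or removing the line, restores blocking):

// user.js sketch: disables blocking of mixed *active* content globally.
// This is exactly the all-or-nothing trade-off this bug is complaining about.
user_pref("security.mixed_content.block_active_content", false);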
In today's world of mashups you cannot guarantee that the content you are going to be displaying is https. Although this is a good security measure you've broken the web for a lot of users who do not understand what's changed. At least in Internet Explorer the user is displayed a message saying that "Only secure content is displayed". The small shield display is inadequate. I didn't even realize it was displaying the first time I saw it.

Also, a whitelist option is needed. I should have the choice to whitelist a web application and not go through click-fatigue to disable protection on a page-by-page basis. My other choices are to turn it completely off or install yet another plug-in to help me manage core functionality that should be built into the browser. Neither of those choices are good options for myself or my users.
(In reply to Scott Taylor from comment #33)
> In today's world of mashups you cannot guarantee that the content you are
> going to be displaying is https. Although this is a good security measure
> you've broken the web for a lot of users who do not understand what's
> changed. At least in Internet Explorer the user is displayed a message
> saying that "Only secure content is displayed". The small shield display is
> inadequate. I didn't even realize it was displaying the first time I saw it.

I have to say that this is very true. It took me several days to notice the shield on broken sites, and then to click on it to find out what the hell it was about.
(In reply to Tanvi Vyas [:tanvi] from comment #6)
> (In reply to Ivo Anjo from comment #5)
> > Please consider this.
> > I use HTTPS-Everywhere and many sites are still mixed content (that's why
> > they're not the default). But the alternative is to disable the https rules,
> > and that's worse than what we started with...
> 
> HTTPS Everywhere is working on a fix to make the add-on compatible with
> Firefox 23:
> https://trac.torproject.org/projects/tor/ticket/9196

Yes, just disable that website's rule altogether. So it's better that I don't have the security offered by HTTPS-Everywhere, unless that site happens to be perfect?

Maybe some functionality could be added where plugins -- specifically HTTPS-Everywhere -- can tell the page to load the Http-only content anyway? But the problem is not with HTTPS-Everywhere.

(In reply to Félix Saparelli [:passcod] from comment #14)
> After further reflection, I've also changed my mind about temporary vs.
> permanent whitelisting: while some sites do intentionally need to have HTTP
> content mixed in with HTTPS content, most don't. Sites that have their assets
> served over HTTP while their main page is HTTPS should be reminded (by a
> user whose browser periodically breaks the site) that this is not good... so
> they can fix it. The best way to get rid of the doorhanger is for the
> website to change. Anything else is inherently unsafe.

If I allow the page once, I will by definition allow it on every subsequent pageload, so either make it sticky or don't allow me to load the insecure content at all. But make up your mind, because it makes no sense as it is to allow me to load unsafe content but then bug me about it, as if in punishment for daring to defy the Wise New Rule.
(In reply to Jack Harper from comment #32)
> Looks like I'm going to have to open up about:config and disable
> security.mixed_content.block_active_content, hopefully I remember to turn it
> back on when this gets fixed.

Agreed, I've also disabled it completely.  This "security feature" is not useful when I'm forced to disable it.  Just like the pop-up blocker, there are some sites where I want pop-ups... there are some sites where I definitely want mixed content.
I'm an about:config abuser and have disabled the mixed-content blocking mechanism as well.

I have two questions....

- Although I could test this on my own server: if the primary page is https://mysite.com, will content loaded from http://mysite.com be blocked as mixed content?
- In a completely paranoid world, is HTTPS going to block bad JavaScript, bad Flash, bad "anything" from happening from an HTTPS site?

AFAIK, HTTPS is an encryption mechanism that includes information about where the data is coming from, includes encryption details, etc., etc.  The entire role of HTTPS is to take unencrypted data stored at point A, get that data to point B in an encrypted fashion in a kind of tunnel, and remove the possibility of someone listening in on my conversation with another machine.  Pretty much one step away from being on a VPN.  So once the data gets decrypted on my computer, I'm vulnerable to whatever content I received from point A.  If I can get to https://a-good-site.com and for whatever reason that site has information that the browser is to download from https://a-bad-site.com, the information isn't blocked because it is all encrypted information.  But the payload that https://a-bad-site.com has "could" affect my machine in whatever way it wants and is designed to do.

The absolute truth is that I LOVE the idea of blocking potentially malicious sites, but hiding behind the pretense that "all mixed-content is bad" and should be handled in a blanket fashion is absolutely bogus.  If you're going to get nailed with something malicious via web services, you're going to get nailed via HTTP or HTTPS.  The HTTP/HTTPS protocol isn't at fault, the content isn't at fault; it's the individuals/groups who write the content who are at fault.

Now, is blocking mixed-content a BAD thing?  In terms of banking, absolutely not.  I don't need any other web service to know what I'm doing on my banking site.  I would actually prefer to block ALL content that DOESN'T come from my banking site with matching certificates, but from what I understand, mixed-content will allow https://mybank.com to pull info from https://a-bad.site.com.  I'd NEVER whitelist my bank site.  I'd be an idiot to do so.  However, my home page, on the other hand, which pulls non-HTTPS content from other services and which I look at more often than my bank, is another matter: the extra clicks every time I REFRESH the page are what brought me to using about:config.
(In reply to Todd Trann from comment #36)
> (In reply to Jack Harper from comment #32)
> > Looks like I'm going to have to open up about:config and disable
> > security.mixed_content.block_active_content, hopefully I remember to turn it
> > back on when this gets fixed.
> 
> Agreed, I've also disabled it completely.  This "security feature" is not
> useful when I'm forced to disable it.  Just like the pop-up blocker, there
> are some sites where I want pop-ups... there are some sites where I
> definitely want mixed content.

This is my situation as well. An enterprise site at my institution uses mixed content, so I have had to disable the block. Accessing the site is required for doing my work, so this is the necessary option for now. It is not an acceptable situation!
Completely agree with the previous posters: The idea behind this is good, the implementation is a first step. We definitely need a way to permanently white-list.

It would be sad to see Firefox turn into an internet nanny...
I am the lead developer of the Copyright Review Management System for the HathiTrust digital library, and the Firefox and Chrome MCBs have been an enormous headache. We embed (in iframes) a number of information resources (Stanford renewal DB, VIAF, Wikipedia, etc.) used in making copyright determinations. Our main review page is served over https because access is tightly restricted, but we have no control over whether these info sources are accessible through https. A "fire and forget" whitelist is badly needed, because in some scenarios a review page is opened in a new tab, requiring users to perform several gestures per review just to get the iframes to load.

I have recommended users disable the MCB completely via about:config (one user replied to my instructions the single line "Oh god thank you!!!!") and I think this serves as a compelling argument for how badly botched this so-called security feature has been.
My use case is another CMS/web publication. Our content is accessible over http and our editorial team often embeds videos or other content that is only accessible over http. The backend server used by the editorial staff is accessible only via https. When they're previewing content, they run into the mixed content warnings and can't see all the content.

How about providing this functionality, but not immediately exposing an obvious/easy way to enable it? This would reduce the misuse of this feature but provide it for people that have a very specific (and, in my opinion, very valid) reason for needing it.
Well, I see that after over 8 months of discussing this issue, absolutely no progress has been made towards implementing a solution.  It seems that some commenters, who sound like they may be Mozilla developers, are digging their heels in about not allowing users to bypass the mixed content blocker on a permanent basis for a particular URL.  Allow me to submit my opinion on this.  I work in a school district with VERY effective firewalls and antivirus protection.  In order to accomplish our work, many of the more than 6,000 staff members must use an internal website that is essentially a gateway to an external database manager called Infinite Campus.  I run into this issue each and every work day where I have to bypass FF's mixed content blocker for each different page load for each particular login session.

I totally understand the need for the mixed content blocker, and in general, think it's a great idea.  But not allowing users to make their own informed choices as to which pages to whitelist is, I think, seriously bad business.  Probably 85% of our users would have no idea what issues are involved here, and would simply contact IT for assistance in accessing the pages that suddenly will not open now.  Most of the remaining 15% would be very cautious about whitelisting any sites because they do understand the risks and don't want to be blamed for a security breach.

If you were to allow a whitelist for the mixed content blocker, which I highly encourage you to do, you could certainly emblazon the UI with a huge warning note about the risks involved.

Bottom line: Not doing anything IS making a decision.  And the current decision is going to drive thousands of users into a) less secure solutions or b) abandoning FF as their favorite browser.
Besides all that (in my previous comment), any user can bypass the blocker by right-clicking on the blocked link and choosing "Open link in new tab" or "Open link in new window", or even by holding down Command-Shift or Option-Shift while left-clicking on the link.  So what's the big difference, in terms of "protection" provided by the mixed content blocker?

Of course, most of our users don't know about the above options.  They just know that with the advent of FF version 26, our Infinite Campus pages are "broken".  Time to call IT.  Thanks, Mozilla...
(In reply to Brian "Moses" Hall from comment #40)
> enormous headache. We embed (in iframes) a number of information resources
> (Stanford renewal DB, VIAF, Wikipedia, etc.) used in making copyright
> determinations.

Stanford renewal DB is available (AFAICT) over HTTPS: 
https://collections.stanford.edu/copyrightrenewals/bin/search/simple

VIAF is available (AFAICT) over HTTPS:
https://viaf.org/

Wikipedia is available over HTTPS (except sometimes in Iran and China):
https://en.wikipedia.org/

YouTube embedding is available over HTTPS. See this blog post:
http://apiblog.youtube.com/2011/02/https-support-for-youtube-embeds.html

Vimeo embedding is available over HTTPS. See this:
http://vimeo.com/help/faq/sharing-videos/embedding-videos#can-i-embed-my-video-on-an-https-domain

(In reply to dpbrick+bugzilla_mozilla from comment #42)
> Bottom line: Not doing anything IS making a decision.

I agree that it isn't helpful to keep this bug open when nobody is intending to actually work on the feature. The fact is that the further we get from the Firefox 23 release, the less pressure there is on us to implement this feature, because more sites are now providing HTTPS, which means that there is less need for the feature, because there is less of a compatibility impact.

If there is a public site that you depend on using in an <iframe> or through XHR that isn't HTTPS and is causing your HTTPS site to break, please email me (brian@briansmith.org) and I will help you by advocating to that website that they adopt HTTPS to help you. Our efforts in doing this so far have been surprisingly successful.

I understand it is common for end-users to try to embed YouTube videos or other things using http:// links instead of https:// links. I recommend that you contact your CMS vendor or open-source project and ask them to implement an "auto-fix" type feature that rewrites these links to HTTPS automatically for your users.
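As a hedged sketch of what such an "auto-fix" filter might look like inside a CMS (the host list and function name are hypothetical, not a reference to any particular product): rewrite http:// embed URLs to https:// whenever the embed host is known to serve the same content over HTTPS.

// Hypothetical CMS-side filter: upgrade http:// src/href attributes to https://
// for hosts known to support HTTPS embeds (e.g. the YouTube/Vimeo cases above).
const HTTPS_CAPABLE_HOSTS = new Set(["www.youtube.com", "player.vimeo.com"]);

function autoFixEmbeds(html: string): string {
  return html.replace(/\b(src|href)="http:\/\/([^"\/]+)([^"]*)"/g,
    (match, attr, host, rest) =>
      HTTPS_CAPABLE_HOSTS.has(host) ? `${attr}="https://${host}${rest}"` : match);
}

// Example: an editor pastes a plain-http YouTube embed.
console.log(autoFixEmbeds('<iframe src="http://www.youtube.com/embed/VIDEO_ID"></iframe>'));
// -> <iframe src="https://www.youtube.com/embed/VIDEO_ID"></iframe>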

FWIW, mozilla.org also had to do a lot of work to eliminate mixed content, so we do understand that it is a non-trivial task. However, we (at least I) think that it is a very worthwhile thing to do because it has real privacy and security benefits for the users of the sites.

I am going to resolve this as WONTFIX since it is unlikely to get implemented and I don't want the existence of an open bug to keep people waiting for us to implement it.
Status: NEW → RESOLVED
Closed: 10 years ago
Resolution: --- → WONTFIX
Brian--
I respect that mixed content is a difficult problem to solve, as it's hard to get various sites to conform to the requirement to offer HTTPS. That's a security issue that I understand you can't fix alone. There's also a User Experience problem here that you can solve. 

It is entirely within Mozilla's power to help the user know what's happening when their content is being blocked. Part of our problem is that there is almost no warning that mixed content has been blocked unless the user notices the little tiny gray shield in the URL bar. What they do notice is that their content is just GONE with no explanation. 

When Firefox encounters other security risks, it flashes a huge error message to the user to let them know there's a problem. Maybe it will prompt the user to allow mixed content once or always per site. Maybe it could suggest a workaround, like to visit the site directly or to tell the site admin. At very least it could explain what happened and why mixed content is a security risk. Right now it does none of that, leaving users wondering what happened to their content. 

Please reopen this ticket and consider providing a better user experience when content is blocked.
Flags: needinfo?(nobody)
(In reply to tedcurran from comment #45)
> Brian--
> I respect that mixed content is a difficult problem to solve, as it's hard
> to get various sites to conform to the requirement to offer HTTPS. That's a
> security issue that I understand you can't fix alone. There's also a User
> Experience problem here that you can solve. 
> 
> It is entirely within Mozilla's power to help the user know what's happening
> when their content is being blocked. Part of our problem is that there is
> almost no warning that mixed content has been blocked unless the user
> notices the little tiny gray shield in the URL bar. What they do notice is
> that their content is just GONE with no explanation. 
> 
> When Firefox encounters other security risks, it flashes a huge error
> message to the user to let them know there's a problem. Maybe it will prompt
> the user to allow mixed content once or always per site. Maybe it could
> suggest a workaround, like to visit the site directly or to tell the site
> admin. At very least it could explain what happened and why mixed content is
> a security risk. Right now it does none of that, leaving users wondering
> what happened to their content. 
> 
> Please reopen this ticket and consider providing a better user experience
> when content is blocked.

I'd go +1 on asking that at least FF shows the yellow warning toolbar or something. When this came out at first I was puzzled for a day or two until I discovered that little icon. Common users won't ever notice it is there.
(In reply to tedcurran from comment #45)
> Brian--
> I respect that mixed content is a difficult problem to solve, as it's hard
> to get various sites to conform to the requirement to offer HTTPS. That's a
> security issue that I understand you can't fix alone. There's also a User
> Experience problem here that you can solve. 
>
> Please reopen this ticket and consider providing a better user experience
> when content is blocked.

See https://bugzilla.mozilla.org/showdependencytree.cgi?id=815321&hide_resolved=1

That is a list of the open issues for mixed content blocker. Several of those issues are about making it easier to understand that content was blocked, why it was blocked, and how to unblock it.

This particular bug is about a particular proposed feature, and "better user experience when content is blocked" is out of scope for it. If none of the existing bugs linked above cover that, then feel free to add a new bug with a *specific* request.
Flags: needinfo?(nobody)
(In reply to Brian Smith (:briansmith, :bsmith; NEEDINFO? for response) from comment #47)

> This particular bug is about a particular proposed feature, and "better user
> experience when content is blocked" is out of scope for it. If none of the
> existing bugs linked above cover that, then feel free to add a new bug with
> a *specific* request.

There are still definitely situations where a whitelisting feature would be extremely helpful to both users (for ease of use) and developers (who have to support non-tech-savvy users) alike. In my case, our customer solution conglomerates a lot of data from various sources, some of which is not available via an API, so we must provide an iframe to easily show that data. The second part of the problem for us (and sorry I can't give much detail) is that the websites we are gathering data from are resistant to change, non-public-facing, and we have no power over whether they open a https port to us or not (and the Firefox team contacting them wouldn't make any difference).

Our plan is to hopefully use our software to warn our users about using non-https sites in an iframe, but we will still have issues when they view our page in Firefox vs IE, which blocks mixed content, but provides much more adaptability around showing vs hiding mixed content and also has a more visible alerting system (see http://msdn.microsoft.com/en-us/library/ee264315%28v=vs.85%29.aspx ).

If I had the time, I'd work on this issue in Firefox on my own, but I don't, so any support from the Firefox dev team to expand the browser's abilities in the future would be greatly appreciated.
If someone can convince Microsoft to go all HTTPS for TechNet, then I will have only a little problem with not implementing a white-list.  As it stands without a white-list, I have to allow mixed content every time I load/reload a TechNet page (quite often), or turn off the mixed content all together.  Both of which I am loath to do.  

How can this be "resolved wontfix"???  Are the developers really so closed-minded that they can't imagine a situation for which users would benefit with a few sites with mixed content allowed while the user remains protected for most of the web.  

At least allow a white-list as a hidden setting so those knowledgeable of the implications can remain sane.  Give it a hard limit of 3, 5 or 10 sites to prevent people from simply adding every site they come across.  Isn't that a reasonable compromise?
I agree that closing this as "wontfix" seems awfully close-minded.

Please reconsider.
We're using Firefox in a corporate intranet and lots of sites offer https/http mixed content. While in an ideal world, this should not happen, the reality cannot be changed short term. Having some kind of whitelist (e.g. "Always allow mixed content on this site") would be very helpful indeed.

Also, I don't see how one user can close this as WONTFIX when so many people clearly need more flexibility here. As so often, security is not a binary choice. Please reconsider.
I learned today that my company's editorial team is just disabling the mixed content blocking altogether to get around this (see comment #41 for our need for the whitelist).

Awesome.
I run into this problem often with my simple.tv. It is a DVR that uses an HTML5-based web application to manage the recording of shows. Disabling the blocking of active content for the page is necessary to make this work. Since Mozilla added this feature for a reason, I don't believe that disabling it altogether to fix this annoyance is acceptable.

http://community.simple.tv/index.php?/topic/114-firefox-problems-with-simpletv/?p=440
Every single time I load a page from the MSDN forums <https://social.msdn.microsoft.com/Forums/whatever> I have to do this in order to get the content to display properly. This is incredibly annoying when I'm trying to research something and I get a lot of hits in the forums. Please reconsider this!
(In reply to Jon Baumgartner from comment #56)
> Every single time I load a page from the MSDN forums
> <https://social.msdn.microsoft.com/Forums/whatever> I have to do this in
> order to get the content to display properly. 

I suspect you have an extension which is taking you to the HTTPS version in the first place. You should disable the rule in that extension to avoid the problem. In HTTPS Everywhere you can disable the "Microsoft (partial)" rule.
Quite unfortunate that the solution appears to be forcing people to use plain HTTP without any security instead of providing the possibility to have at least some security.
I have to use a website regularly that I do not have control over, but has mixed content.  The site is not one that really needs to be secure, but they have opted for their pages to redirect to https, while not updating the includes.  Because of the inability to whitelist certain sites, I had to disable mixed content protection entirely.

The irony is that with https everywhere I wrote a rule for this site so it is actually redirecting the mixed content requests to https, and never sending http requests, but the mixed content blocker comes before extensions get to rewrite requests.  So to get firefox to allow https everywhere to redirect the requests I still had to disable the mixed content filter anyways.  If firefox put the mixed content gate later in the process, just prior to sending the request, then I could re-enable the mixed content protection and effectively satisfy this requested feature by utilizing https everywhere.
(In reply to Aaron from comment #59)
> The irony is that with https everywhere I wrote a rule for this site so it
> is actually redirecting the mixed content requests to https, and never
> sending http requests, but the mixed content blocker comes before extensions
> get to rewrite requests.  So to get firefox to allow https everywhere to
> redirect the requests I still had to disable the mixed content filter
> anyways.  If firefox put the mixed content gate later in the process, just
> prior to sending the request, then I could re-enable the mixed content
> protection and effectively satisfy this requested feature by utilizing https
> everywhere.

We are well aware of this problem and have been working on a solution for months. Unfortunately, it requires a lot of code refactoring.  The data that the Mixed Content Blocker needs to determine whether to allow or block a connection is no longer available by the time HTTPS Everywhere is invoked.  We're fixing that in the dependencies of bugs 1006868 and 1006881.  And there is a bug specifically about fixing the HTTPS Everywhere problem - bug 878890.
Thank you for the reply and information.
A whitelist or allow function is necessary for me.  I'm a web designer and server admin, so when I migrate a site from one server to another, or am testing sites before the DNS has propagated (on a Plesk server, for example), I can use the preview-site function through Plesk.  This works over https, however, so any JavaScript or database functions are broken until the DNS has propagated, almost negating the whole point of testing.
I need a whitelist too, to use MY bookmarklet. If I wanted to use a standard NON-CUSTOMIZABLE browser I would have used Chrome... I use FF because it is customizable for power users.

In meantime for other like me I found this experimental add-on: Toggle Mixed Active Content https://addons.mozilla.org/en-US/firefox/addon/toggle-mixed-active-content/?src=ss
I forgot to include this other add-on in my previous post to enable mixed content: 

Toggle Mixed Display Content: https://addons.mozilla.org/en-US/firefox/addon/toggle-mixed-display-conten/?src=search
Whitelist! I discover another user, at least weekly, who has disabled the mixed content warning because of a single site they visit daily. It leaves them exposed on the road and I do not have the authority to enforce any policy to the contrary. 
It's a lesser of evils situation, and the whitelist is the lesser, BY FAR!
It's absurd that the developers are so dead set against this. There are definitely legitimate reasons to disable mixed content blocking for specific URLs. My particular use case, which is similar to that which others have posted, is for my RSS reader. I run my own Tiny Tiny RSS site which I access over https. Many of the items in my feeds have embedded data (images, videos, etc.) which are loaded over http. These things do not load unless I go through the annoying task of disabling mixed content blocking for the session. I should be able to disable it just for my RSS reader page once and for all, leaving it intact for everything else. Instead, I'm about to disable it entirely because I'm sick of digging into the certificate dialog constantly to disable it temporarily.
Just want to mention the "internet of things". Some older, expensive devices aren't accessible via https. The problem is sure to evanesce over time but right now it exists.
@Ryan DeShone: The reason Firefox developers are determined to punish users for the mistakes of webmasters here is that this is part of a growing policy of disempowering users, even and *especially* power users, so as to undermine the principle and expectation that you are in charge of your own computer. Not only is the latter no longer an accepted argument, it's a view the current torchbearers of what used to be the flame of freedom are actively working to suppress and relegate to obscurity. Esprit de corps? More like esprit de corporate, amirite?