Closed Bug 1251229 Opened 4 years ago Closed 4 years ago

request.url includes the hash of the loaded URL

Categories: Core :: DOM: Service Workers, defect
Version: 46 Branch
Status: RESOLVED FIXED
Target Milestone: mozilla47

Tracking Status:
firefox44 --- unaffected
firefox45 --- unaffected
firefox46 + fixed
firefox47 + fixed

People: (Reporter: valentin, Assigned: bkelly)
References: (Blocks 1 open bug)
Keywords: regression
Whiteboard: btpp-fixnow

Attachments: (2 files)

Load https://awesome-sw.github.io/hello-world/#/2 (repeatedly to make sure the resources are put into the cache).

In the console you'll see that the intercepted URL includes the hash.
Moreover, in the devtools Cache storage, multiple entries for the same URL with different hashes are available.

There should only be one entry for each URL, and it should not have a hash set.
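The expected behavior can be sketched as follows. This is illustrative only; `stripFragment` is an invented helper showing the fragment-less URL a service worker's console should report, not Gecko code:

```javascript
// Hypothetical helper: per spec, the URL a FetchEvent reports is the
// intercepted URL serialized without its fragment.
function stripFragment(url) {
  const u = new URL(url);
  u.hash = ""; // "#/2" must not appear in request.url
  return u.href;
}

// In the page's service worker, a handler like this should therefore
// never log a "#..." suffix:
// self.addEventListener("fetch", (event) => {
//   console.log(event.request.url);
// });

console.log(stripFragment("https://awesome-sw.github.io/hello-world/#/2"));
// → "https://awesome-sw.github.io/hello-world/"
```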
Pretty sure we try to implement fragment stripping.

Ehsan, any thoughts?  I think you were just looking at this code.

Does our parser just get confused if there is a slash inside of the fragment?

Valentin, does this work if you use URLs like "hello-world/#2"?
Flags: needinfo?(valentin.gosu)
Flags: needinfo?(ehsan)
(In reply to Ben Kelly [PTO, back Feb 29][:bkelly] from comment #1)
> Pretty sure we try to implement fragment stripping.
> 
> Ehsan, any thoughts?  I think you were just looking at this code.
> 
> Does our parser just get confused if there is a slash inside of the fragment?
> Valentin, does this work if you use URLs like "hello-world/#2"?

It doesn't seem to be the slash. It still caches hello-world/#2.
Flags: needinfo?(valentin.gosu)
This might be fallout from bug 1201664.  We used to run the Request constructor which stripped the fragment, but are no longer doing that.  Maybe we miss stripping the fragment now.
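The mechanics of the regression can be sketched with the standard URL API: plain URL parsing preserves the fragment, so once the Request-constructor round trip was removed, nothing stripped it before the URL was exposed. (The explicit `hash = ""` step below stands in for the stripping the fix reinstates; the real change is in Gecko's FetchEvent construction.)

```javascript
// URL parsing alone keeps the fragment, which is why dropping the
// Request round trip let "#/2" leak into FetchEvent.request.url.
const parsed = new URL("https://awesome-sw.github.io/hello-world/#/2");
console.log(parsed.hash); // "#/2" — survives parsing

parsed.hash = ""; // the explicit strip the fix reinstates
console.log(parsed.href); // "https://awesome-sw.github.io/hello-world/"
```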
Yea, this does not reproduce in FF44.  Bug 1201664 was landed in FF46.  I think that's probably the culprit.  We should fix before it hits release and write a test for this.

Ehsan, can you take?  Or I can do it next week.
Blocks: 1201664
Actually, I may have time for this today while the movers load the truck.
Assignee: nobody → bkelly
Status: NEW → ASSIGNED
Flags: needinfo?(ehsan)
This fixes the problem on the comment 0 site for me.  I'll write a mochitest next.
Attachment #8723585 - Flags: review?(ehsan)
Attachment #8723585 - Flags: review?(ehsan) → review+
This test fails without the P1 patch.  It succeeds with the P1 patch.

Ehsan, do you think I need to worry about a Cache migration?  I could delete request entries with a fragment.  Alternatively we just tell people on dev edition and nightly to wipe any Cache objects affected by this.  (They will trip on the MOZ_ASSERT in debug builds.)
Attachment #8724087 - Flags: review?(ehsan)
Regarding the possibility of writing a migration, the only observable case will be if someone does cache.keys().  Those resources will not be found by match() any more because the URL will be different.  So if script or devtools does a .keys() the urls-with-fragments will show up.  In debug builds they will trigger the MOZ_ASSERT.

I'm inclined not to write a migration for this.
Whiteboard: btpp-fixnow
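The keys()/match() mismatch described above can be modeled with a toy cache. `FakeCache` is a simplified stand-in for the real Cache API, for illustration only: entries written by a pre-fix build were keyed by the fragment-bearing URL, while a fixed build looks them up by the stripped URL.

```javascript
// Toy model of the observable breakage without a migration.
class FakeCache {
  constructor() { this.store = new Map(); }
  put(url, response) { this.store.set(url, response); }
  keys() { return [...this.store.keys()]; }
  match(url) {
    const u = new URL(url);
    u.hash = "";                    // post-fix lookups strip the fragment
    return this.store.get(u.href);  // undefined if stored with a fragment
  }
}

const cache = new FakeCache();
// A pre-fix build stored the entry with its fragment intact:
cache.put("https://awesome-sw.github.io/hello-world/#/2", "cached body");

console.log(cache.keys()); // fragment URL still visible via keys()
console.log(cache.match("https://awesome-sw.github.io/hello-world/#/2"));
// → undefined: match() can no longer find the old entry
```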
(In reply to Ben Kelly [PTO, back Feb 29][:bkelly] from comment #7)
> Created attachment 8724087 [details] [diff] [review]
> P2 Add wpt test verifying FetchEvent.request.url does not include fragments.
> r=ehsan
> 
> This test fails without the P1 patch.  It succeeds with the P1 patch.
> 
> Ehsan, do you think I need to worry about a Cache migration?  I could delete
> request entries with a fragment.  Alternatively we just tell people on dev
> edition and nightly to wipe any Cache objects affected by this.  (They will
> trip on the MOZ_ASSERT in debug builds.)

I think it's better to do a migration anyway.  It should be fairly straightforward.
(Assuming that you're not planning to backport this to 46, of course.  If you do, then I'm fine without having a migration.)
Attachment #8724087 - Flags: review?(ehsan) → review+
I am going to uplift to 46.
Comment on attachment 8723585 [details] [diff] [review]
P1 Strip fragment from request URL when creating FetchEvent. r=ehsan

Approval Request Comment
[Feature/regressing bug #]: Service worker regression caused by bug 1201664.
[User impact if declined]: Sites can end up with multiple copies of the same resource in Cache and could fail to work offline in some cases.
[Describe test coverage new/current, TreeHerder]:  Tests included.
[Risks and why]: Minimal.  Only affects service workers.
[String/UUID change made/needed]: None
Attachment #8723585 - Flags: approval-mozilla-aurora?
Comment on attachment 8724087 [details] [diff] [review]
P2 Add wpt test verifying FetchEvent.request.url does not include fragments. r=ehsan

See comment 14.
Attachment #8724087 - Flags: approval-mozilla-aurora?
https://hg.mozilla.org/mozilla-central/rev/982cf5b26ae1
https://hg.mozilla.org/mozilla-central/rev/d6df13193dc8
Status: ASSIGNED → RESOLVED
Closed: 4 years ago
Resolution: --- → FIXED
Target Milestone: --- → mozilla47
Tracking since this is a regression from 46.
Comment on attachment 8723585 [details] [diff] [review]
P1 Strip fragment from request URL when creating FetchEvent. r=ehsan

Fix for SW regression, includes tests. Please uplift to aurora.
Attachment #8723585 - Flags: approval-mozilla-aurora? → approval-mozilla-aurora+
Comment on attachment 8724087 [details] [diff] [review]
P2 Add wpt test verifying FetchEvent.request.url does not include fragments. r=ehsan

Test-only changes, fine to uplift to aurora.
Attachment #8724087 - Flags: approval-mozilla-aurora? → approval-mozilla-aurora+
Version: unspecified → 46 Branch