Bug 1494848 (Open): Opened 6 years ago, Updated 2 years ago

Don't try to backfill records that fail more than twice

Categories: Firefox :: Sync, enhancement, P3

Reporter: lina; Assignee: Unassigned

https://searchfox.org/mozilla-central/rev/6d1ab84b4b39fbfb9505d4399857239bc15202ef/services/sync/modules/engines.js#1251-1254 suggests that we should only retry failed records once, then forget about them after the next backfill.

I don't think that works, though; looking at ckarlof's logs for an unrelated issue, I see the same bogus record show up multiple times. This is particularly inefficient for the old bookmarks engine, where we'll end up building the GUID map on _every_ sync, only to fail to apply the record again.

Walking through the code:

* Let's say `previousFailed` is empty, `toFetch` is empty, and A fails in the current sync. We'll record A in https://searchfox.org/mozilla-central/rev/6d1ab84b4b39fbfb9505d4399857239bc15202ef/services/sync/modules/engines.js#1201,1246-1249,1255. Since `toFetch` and `previousFailed` are empty, there's nothing to backfill.
* On the next sync, B syncs successfully; `previousFailed` contains A and `toFetch` is empty. Since B succeeded, we clear `previousFailed`, and `idsToBackfill` contains A.
* We try to backfill A, which fails again. So we add A to `failedInBackfill`, and eventually add that list to `previousFailed`: https://searchfox.org/mozilla-central/rev/6d1ab84b4b39fbfb9505d4399857239bc15202ef/services/sync/modules/engines.js#1282,1293,1299. So A is back in `previousFailed`, and we'll keep trying to backfill it on subsequent syncs.
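The walkthrough above can be modeled with a few set operations. This is a hypothetical simulation, not the actual engines.js code; the names (`previousFailed`, `toFetch`, `idsToBackfill`, `failedInBackfill`) mirror the identifiers linked above, and it shows how A ends up back in `previousFailed` after every backfill:

```javascript
// Sync 1: previousFailed and toFetch start empty; record A fails to apply.
let previousFailed = new Set();
let toFetch = new Set();

const failedThisSync = new Set(["A"]); // A fails in the current sync
previousFailed = new Set(failedThisSync); // recorded for the next sync

// Sync 2: B applies successfully, so previousFailed is cleared and its
// old contents become the backfill list.
const idsToBackfill = new Set([...previousFailed, ...toFetch]); // { A }
previousFailed = new Set(); // cleared because B succeeded

// Backfilling A fails again, so it lands right back in previousFailed.
const failedInBackfill = ["A"];
previousFailed = new Set([...previousFailed, ...failedInBackfill]);

console.log([...previousFailed]); // A will be backfilled again next sync
```

The cycle then repeats: nothing ever distinguishes "failed once" from "failed on a retry", so A is backfilled on every subsequent sync.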

I _think_ the fix is to remove `failedInPreviousSync` from `previousFailed` after the backfill. That way, we'll still retry records that fail to backfill on the next backfill, too, and ignore ones we already tried.
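A sketch of that proposed fix, under the assumption that we can compute the next `previousFailed` from the records that just failed in backfill minus the ones we already retried (`failedInPreviousSync` here is an illustrative name, not necessarily the real variable):

```javascript
// Hypothetical: after a backfill, keep only records whose failure is new.
// Records in failedInPreviousSync were already retried once, so drop them
// instead of re-adding them to previousFailed forever.
function updatePreviousFailed(failedInPreviousSync, failedInBackfill) {
  const next = new Set(failedInBackfill);
  for (const id of failedInPreviousSync) {
    next.delete(id); // already retried once: give up on it
  }
  return next;
}

// A failed last sync and failed again during this backfill;
// B failed for the first time during this backfill.
const result = updatePreviousFailed(new Set(["A"]), new Set(["A", "B"]));
console.log([...result]); // only B is retried on the next sync
```

With this shape, a record gets exactly one backfill retry: newly failing records (B) are still retried on the next backfill, while records that have already failed a retry (A) are forgotten.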
Priority: -- → P2
Priority: P2 → P3
Severity: normal → S3