Closed Bug 1185031 Opened 10 years ago Closed 10 years ago

duplicates in Socorro processing queues for the processor result in cache collisions

Categories

(Socorro :: Backend, task)

task
Not set
normal

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: lars, Assigned: lars)

Details

When duplicate crash_ids appear in the queues and the duplicates are picked up by the same processor, both the job-acknowledgement cache and the dumps on disk experience a collision. This results in failed queue ack'ing (and therefore stuck jobs) as well as failures when running the stackwalker. The QueuingThread should detect this situation and decline to accept the duplicate crash jobs.
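The fix described above amounts to tracking which crash_ids are already in flight and refusing to hand out a duplicate. A minimal sketch of that idea, assuming a hypothetical `QueuingThread` class with illustrative `offer`/`done` methods (not Socorro's actual API):

```python
# Hypothetical sketch of duplicate suppression in the queuing thread.
# Class and method names here are illustrative, not Socorro's real interface.
import threading

class QueuingThread:
    """Hands crash jobs to processor workers, refusing any crash_id
    that is already in progress on this processor."""

    def __init__(self):
        self._lock = threading.Lock()
        self._in_progress = set()  # crash_ids currently being processed

    def offer(self, crash_id):
        """Return True if the job may be accepted, False if it duplicates
        a job already in flight (which would collide in the ack cache
        and in the on-disk dump files)."""
        with self._lock:
            if crash_id in self._in_progress:
                return False  # decline the duplicate; leave it queued
            self._in_progress.add(crash_id)
            return True

    def done(self, crash_id):
        """Called after the job is acknowledged; the crash_id may then
        be processed again if it reappears in the queue."""
        with self._lock:
            self._in_progress.discard(crash_id)
```

The set-plus-lock design keeps the check atomic across the queuing thread and worker threads, so two copies of the same crash_id can never be in flight at once on one processor.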
Commit pushed to master at https://github.com/mozilla/socorro

https://github.com/mozilla/socorro/commit/3585d11ec8f1b3c5a1b95f39c6ca76982a9fc08a
Merge pull request #2903 from twobraids/queue-dupes-break-cache

Fixes Bug 1185031 - make Rabbit refuse to offer crashes already in progress
Status: NEW → RESOLVED
Closed: 10 years ago
Resolution: --- → FIXED