Closed Bug 1337148 Opened 7 years ago Closed 4 years ago

regularly validate release URLs

Categories

(Release Engineering Graveyard :: Applications: Balrog (backend), defect, P3)

Tracking

(Not tracked)

RESOLVED MOVED

People

(Reporter: bhearsum, Unassigned)

References

Details

(Whiteboard: [lang=python])

We recently discovered an issue where we cleaned up data on S3 that was thought to be unused, and doing so broke updates for a significant number of users.

One thing that would help here is to regularly check all of the mapped-to Releases and make sure that every URL they point to returns a 200. I wrote a hacky script to do this as a one-off: https://github.com/mozilla/balrog/compare/master...mozbhearsum:find-bad-mars?expand=1

find-active-mar-urls2.py finds every URL that is pointed at and writes them to a JSON file. check-urls.py reads that JSON file and issues a HEAD request for each URL.
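The check-urls.py half could be sketched roughly like this. This is not the script from the branch above, just a minimal illustration; the input filename and the assumption that the JSON file is a flat list of URL strings are both hypothetical, and the status lookup is injectable so the logic can be exercised without network access.

```python
import json
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError


def head_status(url, timeout=10):
    """Return the HTTP status code for a HEAD request, or None on a
    connection-level error."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        # HTTP errors (404, 403, ...) still carry a status code.
        return e.code
    except URLError:
        return None


def find_bad_urls(urls, status_fn=head_status):
    """Return the subset of urls whose HEAD request did not come back 200."""
    return [u for u in urls if status_fn(u) != 200]


if __name__ == "__main__":
    # Assumed input format: a JSON file containing a flat list of URL strings
    # (the filename here is illustrative).
    with open("active-mar-urls.json") as f:
        urls = json.load(f)
    for url in find_bad_urls(urls):
        print("BAD:", url)
```

Injecting `status_fn` also makes it easy to swap in a session-based or parallel fetcher later without touching the filtering logic.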

This needs to be polished and probably enhanced before we can run it in automation or anything.
Whiteboard: [lang=python]
Priority: -- → P2
Aki suggested that we should be checking for active MARs before doing any non-automated cleanup on S3 or other places that Balrog may point to. He also suggested that regularly publishing the list of active URLs would be helpful - it would let other systems or tools perform this check, or other checks we haven't conceived of yet.
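Producing that published list amounts to walking each Release blob and collecting every URL it references. A minimal sketch, assuming blobs are plain nested dicts/lists as stored in Balrog and treating any http(s) string value as a candidate URL (real blobs keep them under specific keys like fileUrl, but matching on the value is more robust for a safety check):

```python
def collect_urls(blob):
    """Recursively collect every string value that looks like an HTTP(S)
    URL from a nested release blob (dicts and lists of arbitrary depth)."""
    urls = []
    if isinstance(blob, dict):
        for value in blob.values():
            urls.extend(collect_urls(value))
    elif isinstance(blob, list):
        for item in blob:
            urls.extend(collect_urls(item))
    elif isinstance(blob, str) and blob.startswith(("http://", "https://")):
        urls.append(blob)
    return urls
```

Dumping the deduplicated union of `collect_urls` over all mapped-to Releases to a public JSON file would give other tools the list Aki described.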
When we have the script, we might be able to run it as a Taskcluster hook (https://tools.taskcluster.net/hooks/).
Priority: P2 → P3
Status: NEW → RESOLVED
Closed: 4 years ago
Resolution: --- → MOVED
Product: Release Engineering → Release Engineering Graveyard