Bug 1337148 (Closed)
Opened 7 years ago
Closed 4 years ago
regularly validate release URLs
Categories: Release Engineering Graveyard :: Applications: Balrog (backend), defect, P3
Tracking: (Not tracked)
Status: RESOLVED MOVED
People: (Reporter: bhearsum, Unassigned)
Details
(Whiteboard: [lang=python])
We recently discovered an issue where we cleaned up some stuff on S3 that was thought to be unused, and it broke updates for a significant number of users. One thing that would help here is to regularly check all of the mapped-to Releases and make sure that all of the URLs they point to return 200s.

I wrote a hacky script to do this as a one-off: https://github.com/mozilla/balrog/compare/master...mozbhearsum:find-bad-mars?expand=1

find-active-mar-urls2.py finds everything that is pointed at and outputs it to a JSON file. check-urls.py goes through a JSON file and does a HEAD request on each URL. This needs to be polished and probably enhanced before we can run it in automation or anything.
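The check-urls.py half of this could be sketched roughly as below. This is an illustrative, stdlib-only sketch, not the actual script from the branch: the function names (`head_status`, `find_bad_urls`) and the injectable `status_fn` parameter are assumptions made here so the classification logic can be exercised without the network.

```python
# Hedged sketch of a URL liveness checker: issue a HEAD request per URL
# and collect everything that does not come back as a 200.
import urllib.request
from urllib.error import HTTPError, URLError


def head_status(url, timeout=10):
    """Return the HTTP status of a HEAD request, or None on a network error."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        # HTTP-level failures (404, 403, ...) still carry a status code.
        return e.code
    except URLError:
        # DNS failures, timeouts, connection refused, etc.
        return None


def find_bad_urls(urls, status_fn=head_status):
    """Return {url: status} for every URL whose HEAD is not a 200.

    status_fn is injectable so the classification can be tested
    without touching the network.
    """
    bad = {}
    for url in urls:
        status = status_fn(url)
        if status != 200:
            bad[url] = status
    return bad
```

In the real workflow, the input URL list would come from the JSON file that find-active-mar-urls2.py writes, and the non-200 entries would be the ones flagged for investigation before any S3 cleanup.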
Reporter
Updated 7 years ago
Whiteboard: [lang=python]
Reporter
Updated 7 years ago
Priority: -- → P2
Reporter
Comment 1 • 7 years ago
Aki suggested that we should be checking for active MARs before doing any non-automated cleanup on S3 or other places that Balrog may point. He also suggested that publishing the list of active URLs regularly would be helpful - it should let other systems or tools do this or other checks we haven't conceived of yet.
Reporter
Comment 2 • 7 years ago
When we have the script, we might be able to run it as a Taskcluster hook (https://tools.taskcluster.net/hooks/).
Reporter
Updated 7 years ago
Priority: P2 → P3
Reporter
Comment 3 • 4 years ago
Status: NEW → RESOLVED
Closed: 4 years ago
Resolution: --- → MOVED
Updated 4 years ago
Product: Release Engineering → Release Engineering Graveyard