Missing builds under https://archive
We seem to be missing all builds (Beta/RC, and possibly also some Dot releases) for versions 46 to 49 in the folder https://archive.mozilla.org/pub/firefox/candidates/:
- the folder displays 45.3.0esr, 45.4.0esr, 49.0, 49.0.1, and all 50 Beta builds
- the "archived" folder has versions between 10 and 45

The builds are available in the "releases" folder, but I'm not sure whether the "candidates" folder is also used by some other tools (ondemand update tests use the "releases" folder).
:rail, do you know what is going on here?
No idea. Let's ask Nick, he knows everything! :)
Flags: needinfo?(rail) → needinfo?(nthomas)
You jest, sir! I did indeed remove them; I should have left a bug trail, sorry. Is there any known bustage?
I don't think releng hit anything here. AFAIK, we use those directories in release sanity to verify partial build numbers, that's it.
Urgh, we might hit a problem with a release build then. https://dxr.mozilla.org/build-central/source/tools/lib/python/kickoff/sanity/partials.py#23 says I should put the SHA512SUMS back?
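For reference, the check that fails here boils down to an HTTP existence test on the candidate build's SHA512SUMS file. A minimal sketch of that (the URL pattern comes from the 404 error below; the helper names are mine, not the actual functions in partials.py):

```python
# Sketch of the release-sanity existence check for a candidate SHA512SUMS file.
# candidate_sha512sums_url() and sha512sums_exists() are hypothetical helpers;
# the real check lives in tools/lib/python/kickoff/sanity/partials.py.
import urllib.error
import urllib.request

CANDIDATES_BASE = "https://archive.mozilla.org/pub/firefox/candidates"

def candidate_sha512sums_url(version, build_number):
    """Build the URL that release sanity fetches for a partial's candidate build."""
    return "%s/%s-candidates/build%d/SHA512SUMS" % (
        CANDIDATES_BASE, version, build_number)

def sha512sums_exists(version, build_number, timeout=30):
    """Return True if the SHA512SUMS file is reachable, False on 404 etc."""
    req = urllib.request.Request(
        candidate_sha512sums_url(version, build_number), method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout):
            return True
    except urllib.error.HTTPError:
        return False
```

With the candidates objects deleted, sha512sums_exists("48.0.2", 1) would hit the 404 reported in the ship-it error below.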
Nothing for oremj to do, moving bug over to RelEng.
Assignee: oremj → nthomas
Component: Operations: Product Delivery → Releases
Product: Cloud Services → Release Engineering
QA Contact: oremj → rail
Looks like we hit an issue in ship-it:

Status: Sanity checks failed. Errors: Issues on release sanity
* Broken build - hash https://archive.mozilla.org/pub/firefox/candidates/48.0.2-candidates/build1/SHA512SUMS not found: 404 Client Error: Not Found
* 48.0.2-build1 is a good candidate build, b
We can probably copy 48.0.2 from the releases directory to the candidates directory, even though not all files are there.
Severity: normal → blocker
and probably 47.0.1-build1
This is holding up our dot release, which isn't an emergency but I'd call it somewhat urgent. I'd like to start the build today if possible. If we don't figure it out today, we can put off the 49.0.2 dot release by another day, and aim for Wed. build and Thursday release.
Sorry for the bustage. I've restored these S3 objects (from the versioned bucket, rather than copying back from releases):

pub/firefox/candidates/48.0.2-candidates/build1/SHA512SUMS
pub/firefox/candidates/48.0.2-candidates/build1/SHA512SUMS.asc
pub/firefox/candidates/48.0.2-candidates/build1/KEY
pub/firefox/candidates/47.0.1-candidates/build1/SHA512SUMS
pub/firefox/candidates/47.0.1-candidates/build1/SHA512SUMS.asc
pub/firefox/candidates/47.0.1-candidates/build1/KEY
I don't think it's likely we'll need any releases older than 47.0.1, so I'll close this out.
Status: NEW → RESOLVED
Last Resolved: 2 years ago
Resolution: --- → FIXED
We hit issues testing release-localtest today, 47.0 needed 47.0.1 from the candidates directory.
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
Apparently the easiest way to bulk undelete things in an S3 bucket is to delete the delete marker. It even gives you back the original modification time.

DELETE_DATE = '2016-10-06'
for v in bucket.list_versions('pub/firefox/candidates/47.0.1-candidates/build1/update/'):
    if (isinstance(v, boto.s3.deletemarker.DeleteMarker) and
            v.is_latest and DELETE_DATE in v.last_modified):
        bucket.delete_key(v.name, version_id=v.version_id)

Did that for 48.0.2-build1 as well.
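For anyone repeating this with boto3 rather than the boto2 API used above, the same delete-marker removal can be sketched roughly like this. The bucket name is a placeholder, and I've split the marker selection into a pure helper so it can be checked without touching AWS:

```python
# boto3 sketch of the bulk undelete above: remove the latest delete markers
# created on the deletion date, so the previous object versions become the
# live ones again. Bucket/prefix arguments are the caller's to supply.
DELETE_DATE = "2016-10-06"

def markers_to_remove(delete_markers, delete_date=DELETE_DATE):
    """Pick the latest delete markers stamped on the deletion date.

    `delete_markers` is the 'DeleteMarkers' list from a
    list_object_versions() response page.
    """
    return [
        {"Key": m["Key"], "VersionId": m["VersionId"]}
        for m in delete_markers
        if m["IsLatest"] and m["LastModified"].strftime("%Y-%m-%d") == delete_date
    ]

def undelete_prefix(bucket, prefix):
    """Delete the matching delete markers under `prefix` in `bucket`."""
    import boto3  # imported here so the pure helper above needs no AWS deps

    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_object_versions")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for marker in markers_to_remove(page.get("DeleteMarkers", [])):
            s3.delete_object(Bucket=bucket, Key=marker["Key"],
                             VersionId=marker["VersionId"])
```

Note this only works because the bucket is versioned; on an unversioned bucket there are no delete markers and the objects really are gone.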
Status: REOPENED → RESOLVED
Last Resolved: 2 years ago → 2 years ago
Resolution: --- → FIXED