Closed Bug 1093486 — Opened 10 years ago; Closed 10 years ago

Deploy s3 caches for primary repositories

Categories

(Taskcluster :: General, defect)

Platform: x86, macOS
Priority: Not set
Severity: normal
Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: jlal, Assigned: rwood)

References

Details

Attachments

(1 file)

Cloning from hg from AWS is fairly slow compared to untarring a copy from S3 (seconds to untar vs. minutes to clone). We should push copies of the most common repos to S3 and fall back to cloning only if we cannot find a cached copy (a rough sketch of that fallback is below the list).

Things to clone at a minimum:

  - mozilla-central
  - b2g-inbound
  - fx-team
  - mozilla-inbound
  - gaia (hg)
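A rough sketch of the fetch-with-fallback idea, assuming a hypothetical bucket/key layout and tarball structure (the names here are placeholders, not what is actually deployed):

    #!/bin/bash -e
    # Sketch only: bucket name, key layout, and tarball structure are assumptions.
    REPO=mozilla-central
    BUCKET=test-caching
    WORKDIR=/home/worker

    # Try the S3 cache first; the caches are public, so plain HTTPS works.
    if curl -f -o /tmp/$REPO.tar.gz "https://$BUCKET.s3.amazonaws.com/$REPO.tar.gz"; then
        # Assumes the tarball contains a top-level $REPO directory.
        tar -xzf /tmp/$REPO.tar.gz -C "$WORKDIR"
    else
        # No cache found: fall back to a full clone.
        hg clone https://hg.mozilla.org/$REPO "$WORKDIR/$REPO"
    fi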
More notes: I have been using the test-caching bucket so far and all the caches have been public.

I wrote some code that does something similar here: https://github.com/mozilla-b2g/gaia/blob/master/build/docker/gaia-taskenv/bin/entrypoint#L21 (no strong opinion on my part about which language to use, etc.; bash was just easy for this).
Summary: Deploy s3 caches for primary branches → Deploy s3 caches for primary repositories
OK, so inside the docker builder containers we should pull the source down from the S3 test-caching bucket instead of doing an hg clone, since it is much faster. Makes sense.

I'm guessing the .tar.gz for each repo on S3 is just 'manually' made one time and placed on S3 (maybe updated once in a while?). Or is there code somewhere else that does that?

The code to pull the .tar.gz down from S3 and untar it into the container in /home/worker/x would go here (or invoke a new .sh from here):

https://github.com/lightsofapollo/gecko-dev/blob/taskcluster/testing/docker/builder/bin/build-setup.sh

And the existing build.sh script will already take care of pulling the latest / updating what we grabbed from S3:

https://github.com/lightsofapollo/gecko-dev/blob/taskcluster/testing/docker/builder/bin/build.sh#L15
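For reference, I'd expect that update step to amount to the usual incremental pull once the cached copy is unpacked (a sketch, not the actual build.sh contents):

    cd /home/worker/mozilla-central   # path assumed, matching the unpacked cache
    hg pull
    hg update --clean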

Am I on the right track? Thanks!
Flags: needinfo?(jlal)
Yup! It's manual right now... pulling differences is pretty fast but the full clone is brutal.
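Roughly speaking, making the cache by hand looks something like this (a sketch; the bucket name and exact flags are illustrative, and it needs the aws CLI with write access):

    # Fresh clone, tar it up, push it to the cache bucket as a public object.
    hg clone https://hg.mozilla.org/mozilla-central
    tar -czf mozilla-central.tar.gz mozilla-central
    aws s3 cp mozilla-central.tar.gz s3://test-caching/mozilla-central.tar.gz --acl public-read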
Flags: needinfo?(jlal)
Assignee: nobody → rwood
PR to use S3 cache for mozilla-central and gaia. As the other builders are created they should do the same (I'm assuming b2g-inbound, fx-team, mozilla-inbound will have their own builders?).
Attachment #8518454 - Flags: review?(jlal)
Comment on attachment 8518454 [details] [review]
https://github.com/lightsofapollo/gecko-dev/pull/15

we talked on irc - please reflag when ready
Attachment #8518454 - Flags: review?(jlal)
Comment on attachment 8518454 [details] [review]
https://github.com/lightsofapollo/gecko-dev/pull/15

PR updated as discussed. Added a CLI so builders can check a repo-cache.json to see whether a repo is cached. If a repo is cached it is downloaded and unpacked; otherwise it is cloned. Added the CLI to the base image and the .json to the builder image, and updated build-setup and the b2g-desktop builder accordingly. This increments both the base and builder image versions (a rough sketch of the lookup flow is below).

My local testing of this PR included building the updated base and builder images; inside the updated builder I ran ./build.sh and ./build-b2g-desktop.sh just far enough to verify that the sources were pulled from S3.
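For illustration, the lookup flow is along these lines (a sketch only; the actual file format and CLI in the PR may differ):

    # Sketch: repo-cache.json maps a repo name to its S3 tarball (format assumed), e.g.
    #   { "mozilla-central": "https://test-caching.s3.amazonaws.com/mozilla-central.tar.gz" }
    REPO=mozilla-central
    URL=$(python -c "import json; print(json.load(open('repo-cache.json')).get('$REPO', ''))")
    if [ -n "$URL" ]; then
        echo "cached: $URL"    # download and unpack, as in the earlier sketch
    else
        echo "not cached"      # fall back to hg clone
    fi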
Attachment #8518454 - Flags: review?(jlal)
Attachment #8518454 - Flags: review?(wcosta)
Comment on attachment 8518454 [details] [review]
https://github.com/lightsofapollo/gecko-dev/pull/15

lgtm. I think we can do more in this area, but I like this start.
Attachment #8518454 - Flags: review?(jlal) → review+
Attachment #8518454 - Flags: review?(wcosta) → review+
bah, we did not land this earlier ... rebasing this onto my current work
rolled into the changes in bug 1104504 (eventually to become tc-vcs)
Status: NEW → RESOLVED
Closed: 10 years ago
Resolution: --- → FIXED
Component: TaskCluster → General
Product: Testing → Taskcluster
Target Milestone: --- → mozilla41
Version: unspecified → Trunk
Resetting Version and Target Milestone that accidentally got changed...
Target Milestone: mozilla41 → ---
Version: Trunk → unspecified