Closed Bug 1477021 Opened 6 years ago Closed 6 years ago

figure out how to regularly update python and dependencies for in-tree docker images

Categories

(Release Engineering :: General, enhancement)

Priority: Not set
Severity: normal

Tracking

Status: RESOLVED FIXED
firefox-esr60 --- fixed
firefox63 --- fixed
People

(Reporter: bhearsum, Assigned: bhearsum)

Details

Attachments

(6 files)

We recently did a one-off update of our in-tree Docker images to get Python and the dependencies we use up-to-date. We need to make sure they stay up to date, which is difficult for a couple of reasons:

1) The images are only rebuilt when the Dockerfile (or something it references) changes. This means that when a new Python version becomes available, we won't pick it up until the Dockerfile changes for some other reason.
2) Dependencies are version pinned (on purpose), so we need something that bumps those pinned versions periodically.

I'm wondering if the periodic update script (or something like it) might be able to help us here? Maybe force a rebuild with a comment change once per week, and have it use the pyup script to bump dependencies?

Simon, you've worked a lot with periodic file updates - does that seem viable to you?
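The "force a rebuild with a comment change" idea above could be sketched roughly like this; the marker comment and the helper are hypothetical, but the mechanism (any byte change to the Dockerfile invalidates the image hash) is real:

```python
# Sketch only: bump a dated marker comment in a Dockerfile so that the
# file's hash changes and the in-tree Docker image gets rebuilt.
# The "# Last forced rebuild:" marker is an invented convention.
import datetime
import re

def bump_rebuild_comment(dockerfile_text, today=None):
    """Rewrite the marker line with today's date, leaving the rest intact."""
    today = today or datetime.date.today().isoformat()
    return re.sub(
        r"^# Last forced rebuild: .*$",
        "# Last forced rebuild: %s" % today,
        dockerfile_text,
        flags=re.M,
    )

original = "# Last forced rebuild: 1970-01-01\nFROM python:3\n"
print(bump_rebuild_comment(original, today="2018-07-20"))
```

A weekly cron task could apply this to each affected Dockerfile and push the resulting one-line diff for review.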
Flags: needinfo?(sfoster)
Whoops, wrong person needinfo'ed!
Flags: needinfo?(sfoster) → needinfo?(sfraser)
`./mach python-safety` should be the thing to use here. I need to adjust it slightly to produce output compatible with Treeherder, and then we can set up a cron task to check all the files. Also on the todo list is making the results of the task easier to manage. I think this will be a case of having a mapping of requirements file to owner, in a similar manner to the histograms JSON file.
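The requirements-file-to-owner mapping could be as simple as a dict keyed on in-tree paths; everything below (paths, owner names, fallback) is made up for illustration:

```python
# Illustrative sketch of a requirements-file -> owner mapping, in the
# spirit of the owners recorded in the histograms JSON file.
# All paths and owner names here are hypothetical examples.
REQUIREMENTS_OWNERS = {
    "taskcluster/docker/funsize-update-generator/requirements.txt": "bhearsum",
    "testing/mozharness/requirements.txt": "releng",
}

def owner_for(path):
    # Unmapped files fall back to a catch-all owner so a cron task
    # always has someone to notify.
    return REQUIREMENTS_OWNERS.get(path, "nobody")
```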
Flags: needinfo?(sfraser)
(In reply to Simon Fraser [:sfraser] ⌚️GMT from comment #2)
> `./mach python-safety` should be the thing to use here. I need to adjust it
> slightly to produce output compatible with treeherder, and then we can set
> up a cron task to check all the files.
>
> Also on the todo list is to make managing the results of the task easier. I
> think this will be a case of having a mapping of requirements file to owner,
> in a similar manner to histograms' json file.

This looks like it will get us part of the way there, but I don't think safety will care about outdated dependencies that don't have vulnerabilities. Maybe we can have a similar command that runs pyup, and run that from a cron job?
> This looks like it will get us part of the way there, but I don't think
> safety will care about outdated dependencies that don't have
> vulnerabilities. Maybe we can have a similar command that runs pyup, and run
> that from a cron job?

*grumble* 'pip3 search pyup' didn't return 'pyupio'. Yeah, we should add the regular check, too, although I don't know if it should be the same level of error as the others?
(In reply to Simon Fraser [:sfraser] ⌚️GMT from comment #4)
> > This looks like it will get us part of the way there, but I don't think
> > safety will care about outdated dependencies that don't have
> > vulnerabilities. Maybe we can have a similar command that runs pyup, and
> > run that from a cron job?
>
> *grumble* 'pip3 search pyup' didn't return 'pyupio'. Yeah, we should add
> the regular check, too, although I don't know if it should be the same
> level of error as the others?

Hmmm, that's a good question. I've been thinking of it more like the periodic file update jobs than a test (because our primary goal here is to make sure that dependencies are regularly bumped for the in-tree requirements files). We could do both, though. I agree that as a test, the error level is lower than a safety error. Perhaps the same level as the WARNINGs that python-safety generates about unpinned dependencies?
And argh, I just realized that pyup.io doesn't actually work unless you give it a GitHub or GitLab repo to operate on -- you can't feed it a requirements file and have it spit out an updated requirements file. So, that's going to make dependency updates tougher. We might need to look at a different strategy for doing them (eg: Pipfile/Pipfile.lock, or some other method to get the complete pinned set of dependencies from a short list of unpinned ones).
The current 'safety' just scans a requirements.txt file that you give it. Hopefully the 'pyup' CLI works the same way, and we can wrap it inside a command with FileFinder, and so on.
(In reply to Simon Fraser [:sfraser] ⌚️GMT from comment #7)
> The current 'safety' just scans a requirements.txt file that you give it,
> hopefully the 'pyup' cli works the same way and we can wrap it inside a
> command and FileFinder, and so on

Unfortunately, 'pyup' doesn't know how to operate on local files - it only operates on repos. We might be able to use some of its internals to make something that operates locally, though...
Simon and I spoke about this today. Here's the path I'm going to pursue for dependencies:
- Switch our in-tree requirements files to Pipfile/Pipfile.lock. The Pipfile will specify unpinned first-level dependencies.
- Write a new job, similar to the current periodic file update jobs, that runs "pipenv update" and attaches patches to Phabricator.

I still don't have a better idea for ensuring we have up-to-date Python other than regularly forcing rebuilds of affected images.
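The key property that makes the second bullet work: Pipfile.lock is plain JSON with "default" and "develop" sections, so a job can read the fully pinned transitive set that "pipenv update" regenerates. The lock content and helper below are a trimmed, hypothetical sketch:

```python
# Sketch: read the pinned dependency set out of a Pipfile.lock.
# The lock data here is a made-up, trimmed example; a real lock also
# carries a "_meta" section with hashes and source info.
import json

LOCK = json.loads("""
{
    "default": {
        "aiohttp": {"version": "==3.2.0"},
        "yarl": {"version": "==1.2.6"}
    },
    "develop": {}
}
""")

def pinned_versions(lock):
    """Map package name -> pinned version from the lock's default section."""
    return {
        name: spec["version"].lstrip("=")
        for name, spec in lock["default"].items()
    }

print(pinned_versions(LOCK))
```

A weekly task could diff this set before and after "pipenv update" to describe the bump in the Phabricator patch summary.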
Assignee: nobody → bhearsum
This is prep for setting up automatic dependency updates for it. I've put what I believe are the first-level dependencies into the Pipfile, and generated a Pipfile.lock with pipenv. I purposely downgraded aiohttp to an older version, which will help verify that the automatic updates work when they land.
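For illustration, a minimal Pipfile along these lines might look like the fragment below; the package list is hypothetical apart from aiohttp, and only the lock file carries the fully pinned transitive set:

```toml
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true

[packages]
# First-level dependencies, unpinned on purpose; Pipfile.lock pins the
# complete transitive set. Entries other than aiohttp are examples.
aiohttp = "*"
awscli = "*"
```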
This is part 2 of the work for automatic dependency updates, which sets up a Docker image that is capable of updating in-tree Pipfile.lock files and attaching patches for them to Phabricator. It's largely based on the periodic file update work, with unnecessary features removed. You can see it in action at https://tools.taskcluster.net/groups/PI8R8DqBS7ePULstalFY_Q/tasks/PI8R8DqBS7ePULstalFY_Q/details.
This creates a task that actually does Pipfile updates for funsize-update-generator. Like the periodic file updates, it does not run on check-in. I intend for it to run once a week through cron eventually (for now, I'd just like to get the code checked in so it's easier to test).
Comment on attachment 8994991 [details]
switch funsize-update-generator to a Pipfile

Simon Fraser [:sfraser] ⌚️GMT has approved the revision.

https://phabricator.services.mozilla.com/D2373
Attachment #8994991 - Flags: review+
Comment on attachment 8994992 [details]
create a docker image that can update Pipfile.lock, and attach diffs to phabricator

Simon Fraser [:sfraser] ⌚️GMT has approved the revision.

https://phabricator.services.mozilla.com/D2374
Attachment #8994992 - Flags: review+
Comment on attachment 8994994 [details]
create pipfile-update task for funsize-update-generator

Simon Fraser [:sfraser] ⌚️GMT has approved the revision.

https://phabricator.services.mozilla.com/D2375
Attachment #8994994 - Flags: review+
Pushed by bhearsum@mozilla.com:
https://hg.mozilla.org/integration/mozilla-inbound/rev/d937887ddcb1
switch funsize-update-generator to a Pipfile. r=sfraser
https://hg.mozilla.org/integration/mozilla-inbound/rev/1ac1ba81c59c
create a docker image that can update Pipfile.lock, and attach diffs to phabricator. r=sfraser
https://hg.mozilla.org/integration/mozilla-inbound/rev/235b296cbd77
create pipfile-update task for funsize-update-generator. r=sfraser
I tried out the new funsize image by tweaking an existing job and discovered that it was failing to find aiohttp. We need to run it through "pipenv run" now, otherwise none of the dependencies are available.
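The fix amounts to wrapping the entrypoint so it runs inside the virtualenv pipenv created; a hypothetical Dockerfile excerpt (the script path is invented) would look like:

```dockerfile
# Without "pipenv run", the locked packages (aiohttp etc.) live in a
# virtualenv that the entrypoint never activates, so imports fail.
# Path below is an invented example, not the real funsize entrypoint.
CMD ["pipenv", "run", "python", "/home/worker/bin/funsize.py"]
```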
Comment on attachment 8995180 [details]
run funsize with "pipenv run"

Simon Fraser [:sfraser] ⌚️GMT has approved the revision.

https://phabricator.services.mozilla.com/D2426
Attachment #8995180 - Flags: review+
These patches landed successfully, but I still need to enable cron jobs to actually update Pipfile.lock.
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
Comment on attachment 8995524 [details]
Schedule pipfile updates to run once per week.

Simon Fraser [:sfraser] ⌚️GMT has approved the revision.

https://phabricator.services.mozilla.com/D2467
Attachment #8995524 - Flags: review+
Pushed by bhearsum@mozilla.com:
https://hg.mozilla.org/integration/autoland/rev/10dea25a944b
Schedule pipfile updates to run once per week. r=sfraser
Keywords: leave-open
Looks like this didn't quite work this week. It appears I didn't enable the taskcluster proxy, so the arc secret couldn't be accessed. Should be an easy fix.
Comment on attachment 8995964 [details]
Add taskcluster proxy to pipfile update, so arc secret can be accessed.

Johan Lorenzo [:jlorenzo] has approved the revision.

https://phabricator.services.mozilla.com/D2509
Attachment #8995964 - Flags: review+
Pushed by bhearsum@mozilla.com:
https://hg.mozilla.org/integration/autoland/rev/25b58eff68a8
Add taskcluster proxy to pipfile update, so arc secret can be accessed. r=jlorenzo
Everything went well with the new Pipfile job this time - we're done here!
Status: REOPENED → RESOLVED
Closed: 6 years ago
Resolution: --- → FIXED