Open Bug 1562226 Opened 5 years ago Updated 2 years ago

Doc upload fails if requests was previously loaded

Categories

(Developer Infrastructure :: Source Documentation, defect, P4)

Tracking

(Not tracked)

People

(Reporter: ahal, Unassigned)

Details

This was discovered in bug 1561261. To summarize, bug 1519598 added a new mach_commands.py that imported requests at the top level. For context, every mach command module gets imported on every invocation of mach, so running mach doc loaded requests into sys.modules. This somehow interacted badly with Amazon's boto3 library and resulted in the following failure when attempting to upload files:

[task 2019-06-25T09:53:52.320Z] SSLError: SSL validation failed for https://s3.us-west-2.amazonaws.com/gecko-docs.mozilla.org/main/latest/_images/AsyncPanZoomArchitecture.png [Errno 2] No such file or directory
[task 2019-06-25T09:53:52.320Z] 
[task 2019-06-25T09:53:52.320Z]   File "/builds/worker/checkouts/gecko/tools/docs/mach_commands.py", line 91, in build_docs
[task 2019-06-25T09:53:52.320Z]     self._s3_upload(savedir, self.project, self.version)
[task 2019-06-25T09:53:52.320Z]   File "/builds/worker/checkouts/gecko/tools/docs/mach_commands.py", line 190, in _s3_upload
[task 2019-06-25T09:53:52.320Z]     s3_upload(files, key_prefix='%s/latest' % project)
[task 2019-06-25T09:53:52.320Z]   File "/builds/worker/checkouts/gecko/tools/docs/moztreedocs/upload.py", line 85, in s3_upload
[task 2019-06-25T09:53:52.320Z]     f.result()
[task 2019-06-25T09:53:52.320Z]   File "/builds/worker/checkouts/gecko/third_party/python/futures/concurrent/futures/_base.py", line 398, in result
[task 2019-06-25T09:53:52.320Z]     return self.__get_result()
[task 2019-06-25T09:53:52.321Z]   File "/builds/worker/checkouts/gecko/third_party/python/futures/concurrent/futures/thread.py", line 55, in run
[task 2019-06-25T09:53:52.321Z]     result = self.fn(*self.args, **self.kwargs)
[task 2019-06-25T09:53:52.321Z]   File "/builds/worker/checkouts/gecko/tools/docs/moztreedocs/upload.py", line 61, in upload
[task 2019-06-25T09:53:52.321Z]     s3.upload_fileobj(f, bucket, key, ExtraArgs=extra_args)
[task 2019-06-25T09:53:52.321Z]   File "/builds/worker/checkouts/gecko/obj-x86_64-pc-linux-gnu/_virtualenvs/docs-y-qLfVwz/lib/python2.7/site-packages/boto3/s3/inject.py", line 539, in upload_fileobj
[task 2019-06-25T09:53:52.321Z]     return future.result()
[task 2019-06-25T09:53:52.321Z]   File "/builds/worker/checkouts/gecko/obj-x86_64-pc-linux-gnu/_virtualenvs/docs-y-qLfVwz/lib/python2.7/site-packages/s3transfer/futures.py", line 106, in result
[task 2019-06-25T09:53:52.321Z]     return self._coordinator.result()
[task 2019-06-25T09:53:52.321Z]   File "/builds/worker/checkouts/gecko/obj-x86_64-pc-linux-gnu/_virtualenvs/docs-y-qLfVwz/lib/python2.7/site-packages/s3transfer/futures.py", line 265, in result
[task 2019-06-25T09:53:52.321Z]     raise self._exception
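
When a failure like this is suspected, one quick check (a hypothetical diagnostic sketch, not part of the bug) is to see which of the suspect packages are already loaded before the upload starts:

```python
import sys

def preloaded_suspects(names=("requests", "boto3", "botocore", "urllib3")):
    """Return which of the given packages are already in sys.modules.

    If requests shows up as loaded before boto3's upload runs, the bad
    interaction described above is in play.
    """
    return {name: name in sys.modules for name in names}

# Example: call this just before s3_upload() and log the result.
```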

The easy workaround is to avoid importing requests at the top level of any mach command, but we should probably figure out why this happens so we don't waste a bunch of time debugging it the next time it comes up.
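
The workaround amounts to a deferred import: the heavy module only enters sys.modules when the command body actually runs, not at mach dispatch time. A minimal illustration (the function name is hypothetical, and a stdlib module stands in for requests):

```python
import importlib
import sys

def run_command(module_name):
    """Body of a hypothetical mach command that defers its import.

    Because the import happens here rather than at the top of
    mach_commands.py, merely dispatching mach never loads the module,
    so unrelated commands (like `mach doc`) are unaffected.
    """
    mod = importlib.import_module(module_name)
    return mod
```

With this pattern, sys.modules stays clean until the command runs; a top-level `import requests` would instead load it on every mach invocation.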

Product: Firefox Build System → Developer Infrastructure
Severity: normal → S3