Closed
Bug 914912
Opened 12 years ago
Closed 12 years ago
Improve timeouts on crontabber ftpscraper
Categories
(Socorro :: General, task)
Tracking
(Not tracked)
Status: RESOLVED FIXED
Target Milestone: 59
People
(Reporter: bburton, Assigned: peterbe)
Details
(Whiteboard: [qa-])
We've been seeing intermittent ftpscraper errors, but we're not seeing any corresponding issues on the ftp.mo cluster, nor has releng mentioned anything, which is usually a good indicator of a problem with the cluster.
The logs look like:
2013-09-08 05:15:01,989 DEBUG - MainThread - about to run <class 'socorro.cron.jobs.ftpscraper.FTPScraperCronApp'>
2013-09-08 05:15:01,996 DEBUG - MainThread - scraping firefox releases for date 2013-09-08 11:30:54.742901+00:00
2013-09-08 05:16:12,468 DEBUG - MainThread - scraping mobile releases for date 2013-09-08 11:30:54.742901+00:00
2013-09-08 05:16:51,135 DEBUG - MainThread - scraping thunderbird releases for date 2013-09-08 11:30:54.742901+00:00
2013-09-08 05:18:58,823 DEBUG - MainThread - error when running <class 'socorro.cron.jobs.ftpscraper.FTPScraperCronApp'> on None
Traceback (most recent call last):
File "/data/socorro/application/socorro/cron/crontabber.py", line 705, in _run_one
for last_success in self._run_job(job_class, config, info):
File "/data/socorro/application/socorro/cron/base.py", line 174, in main
function(when)
File "/data/socorro/application/socorro/cron/base.py", line 213, in _run_proxy
self.run(connection, date)
File "/data/socorro/application/socorro/cron/jobs/ftpscraper.py", line 213, in run
self.scrapeReleases(connection, product_name)
File "/data/socorro/application/socorro/cron/jobs/ftpscraper.py", line 232, in scrapeReleases
for info in getRelease(release, url):
File "/data/socorro/application/socorro/cron/jobs/ftpscraper.py", line 117, in getRelease
kvpairs, bad_lines = parseInfoFile(info_url)
File "/data/socorro/application/socorro/cron/jobs/ftpscraper.py", line 56, in parseInfoFile
infotxt = urllib2.urlopen(url)
File "/usr/lib64/python2.6/urllib2.py", line 126, in urlopen
return _opener.open(url, data, timeout)
File "/usr/lib64/python2.6/urllib2.py", line 391, in open
response = self._open(req, data)
File "/usr/lib64/python2.6/urllib2.py", line 409, in _open
'_open', req)
File "/usr/lib64/python2.6/urllib2.py", line 369, in _call_chain
result = func(*args)
File "/usr/lib64/python2.6/urllib2.py", line 1190, in http_open
return self.do_open(httplib.HTTPConnection, req)
File "/usr/lib64/python2.6/urllib2.py", line 1165, in do_open
raise URLError(err)
URLError: <urlopen error timed out>
From the code in https://github.com/mozilla/socorro/blob/master/socorro/cron/jobs/ftpscraper.py?source=cc it seems like it only tries once. Could it be made to retry a couple of times?
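For illustration only (not the actual fix), a retry wrapper around `urllib2.urlopen`, matching the Python 2.6 code in the traceback above, could look roughly like this. The helper name, attempt count, timeout, and backoff values are assumptions, not part of the Socorro code:

```python
import socket
import time
import urllib2


def urlopen_with_retries(url, timeout=30, attempts=3, delay=5):
    """Hypothetical helper: retry transient timeouts before giving up."""
    for attempt in range(1, attempts + 1):
        try:
            return urllib2.urlopen(url, timeout=timeout)
        except (urllib2.URLError, socket.timeout):
            if attempt == attempts:
                raise  # out of attempts, re-raise the last error
            time.sleep(delay * attempt)  # back off a little more each time
```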
Assignee
Comment 1•12 years ago
A step forward would be to switch to using `requests` instead of `urllib2`, since it has built-in support for retries. /me needs to back that claim up with some documentation.
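(For reference: in `requests`, retry support generally comes from urllib3's `Retry` helper mounted on a `Session` through `HTTPAdapter`. A minimal sketch follows; the retry counts, backoff, and URL are illustrative assumptions only and not necessarily what the eventual patch does.)

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retries = Retry(total=3, backoff_factor=1, status_forcelist=(500, 502, 503))
session.mount('http://', HTTPAdapter(max_retries=retries))
session.mount('https://', HTTPAdapter(max_retries=retries))

# Fetch with a timeout; connection/read failures are retried by the adapter.
response = session.get('https://ftp.mozilla.org/pub/firefox/candidates/',
                       timeout=30)
response.raise_for_status()
```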
Assignee
Comment 2•12 years ago
Pull request: https://github.com/mozilla/socorro/pull/1493
Assignee: nobody → peterbe
Status: NEW → ASSIGNED
Comment 3•12 years ago
Commit pushed to master at https://github.com/mozilla/socorro
https://github.com/mozilla/socorro/commit/c238afe28609785d691de0ebb333a42fea524a42
fixes bug 914912 - Improve timeouts on crontabber ftpscraper, r=rhelmer
Updated•12 years ago
Status: ASSIGNED → RESOLVED
Closed: 12 years ago
Resolution: --- → FIXED
Assignee
Updated•12 years ago
Target Milestone: --- → 59
Updated•12 years ago
Whiteboard: [qa-]
Assignee
Updated•12 years ago
Summary: Improve timeouts on crontabber ftpscraper? → Improve timeouts on crontabber ftpscraper