Bug 1312435
Opened 8 years ago
Closed 8 years ago
Spark Scheduled Jobs Failing
Categories
(Cloud Services Graveyard :: Metrics: Pipeline, defect)
Tracking
(Not tracked)
RESOLVED
FIXED
People
(Reporter: frank, Assigned: frank)
References
Details
Attachments
(2 files)
Spark scheduled jobs are failing with:
[NbConvertApp] Writing 10503 bytes to ../getting-crash-stats-for-OOM-data-to-S3.ipynb
grep: getting-crash-stats-for-OOM-data-to-S3.ipynb: No such file or directory
Command exiting with ret '0'
It was initially confirmed this was an IOError due to disk space, but that should have been fixed in bug 1311708.
Comment 1•8 years ago
Discovered that --output in nbconvert now only changes the name of the output file; I am adding --output-dir so the notebook is written to the current directory.
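The path mismatch can be sketched without running nbconvert at all. This is a minimal simulation (all file and directory names are hypothetical, not taken from the actual job) of why the wrapper's grep could not find the converted notebook:

```shell
# Simulate the failure mode; plain file operations stand in for
# nbconvert's output behavior (names here are hypothetical).
set -e
workdir=$(mktemp -d)
mkdir "$workdir/job"
cd "$workdir/job"

# Newer nbconvert treats --output as a file *name* only, so the
# converted notebook lands relative to the input notebook's directory,
# simulated here as the parent directory:
touch ../report.ipynb

# The wrapper then looks for the file in the current directory and
# finds nothing:
if [ -f report.ipynb ]; then echo "found"; else echo "missing"; fi
```

The fix is to pin the destination directory explicitly, e.g. something like `jupyter nbconvert ... --output report.ipynb --output-dir .` (the exact invocation used by the job wrapper is not shown in this bug).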
Comment 2•8 years ago
mdoglio noted that the airflow.sh will need to be changed as well:
https://github.com/mozilla/telemetry-airflow/blob/master/ansible/files/spark/airflow.sh#L90
Comment 3•8 years ago
Just tested this using the previous version (Anaconda 4.0, which shipped nbconvert 4.1.0), and the old code did indeed work as expected. Looks like this was a backwards-incompatible upgrade.
Comment 4•8 years ago
Comment 5•8 years ago
Updated•8 years ago
Summary: Spark Schedule Jobs Failing → Spark Scheduled Jobs Failing
Comment 6•8 years ago
Looks like this fixed the issue.
Status: ASSIGNED → RESOLVED
Closed: 8 years ago
Resolution: --- → FIXED
Comment 7•8 years ago
Thanks a lot :frank
Updated•6 years ago
Product: Cloud Services → Cloud Services Graveyard