Closed
Bug 1306227
Opened 8 years ago
Closed 8 years ago
Airflow scheduled Jupyter notebooks are failing
Categories
(Cloud Services Graveyard :: Metrics: Pipeline, defect)
Cloud Services Graveyard
Metrics: Pipeline
Tracking
(Not tracked)
RESOLVED FIXED
People
(Reporter: rvitillo, Assigned: mdoglio)
References
Details
User Story
This is likely due to the recent changes we landed to telemetry-airflow and/or emr-bootstrap-spark. It looks like nbconvert times out if a cell takes longer than 30s, which is very likely to happen.

[NbConvertApp] ERROR | Timeout waiting for execute reply (30s).
If your cell should take longer than this, you can increase the timeout with:
    c.ExecutePreprocessor.timeout = SECONDS
in jupyter_nbconvert_config.py

Traceback (most recent call last):
  File "/home/hadoop/anaconda2/bin/jupyter-nbconvert", line 6, in <module>
    main()
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/jupyter_core/application.py", line 267, in launch_instance
    return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs)
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/traitlets/config/application.py", line 596, in launch_instance
    app.start()
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/nbconvert/nbconvertapp.py", line 289, in start
    self.convert_notebooks()
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/nbconvert/nbconvertapp.py", line 412, in convert_notebooks
    self.convert_single_notebook(notebook_filename)
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/nbconvert/nbconvertapp.py", line 383, in convert_single_notebook
    output, resources = self.export_single_notebook(notebook_filename, resources)
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/nbconvert/nbconvertapp.py", line 335, in export_single_notebook
    output, resources = self.exporter.from_filename(notebook_filename, resources=resources)
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/nbconvert/exporters/exporter.py", line 165, in from_filename
    return self.from_notebook_node(nbformat.read(f, as_version=4), resources=resources, **kw)
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/nbconvert/exporters/notebook.py", line 26, in from_notebook_node
    nb_copy, resources = super(NotebookExporter, self).from_notebook_node(nb, resources, **kw)
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/nbconvert/exporters/exporter.py", line 130, in from_notebook_node
    nb_copy, resources = self._preprocess(nb_copy, resources)
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/nbconvert/exporters/exporter.py", line 302, in _preprocess
    nbc, resc = preprocessor(nbc, resc)
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/nbconvert/preprocessors/base.py", line 47, in __call__
    return self.preprocess(nb, resources)
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/nbconvert/preprocessors/execute.py", line 83, in preprocess
    nb, resources = super(ExecutePreprocessor, self).preprocess(nb, resources)
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/nbconvert/preprocessors/base.py", line 70, in preprocess
    nb.cells[index], resources = self.preprocess_cell(cell, resources, index)
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/nbconvert/preprocessors/execute.py", line 97, in preprocess_cell
    outputs = self.run_cell(cell)
  File "/home/hadoop/anaconda2/lib/python2.7/site-packages/nbconvert/preprocessors/execute.py", line 140, in run_cell
    raise exception("Cell execution timed out, see log")
RuntimeError: Cell execution timed out, see log for details.
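For reference, the remediation the error message itself suggests would look roughly like the sketch below. This is an illustration only, not the actual patch that fixed this bug; the 600-second value is an assumption chosen for the example.

c = get_config()  # jupyter_nbconvert_config.py; get_config() is provided by Jupyter's config loader

# Raise the per-cell execution timeout from the 30s default seen in the
# error above. 600 is an illustrative value, not the one from the fix.
c.ExecutePreprocessor.timeout = 600

The same trait can be set for a single run on the command line via traitlets syntax, e.g. jupyter nbconvert --execute --ExecutePreprocessor.timeout=600 notebook.ipynb.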
Attachments
(1 file)
No description provided.
Reporter
Updated 8 years ago
Severity: normal → major
Assignee
Comment 1 • 8 years ago
Attachment #8796129 - Flags: review?(rvitillo)
Assignee
Comment 2 • 8 years ago
mreid r+'d this patch on GitHub. It is now deployed and working.
Status: NEW → RESOLVED
Closed: 8 years ago
Resolution: --- → FIXED
Reporter
Updated 8 years ago
Attachment #8796129 - Flags: review?(rvitillo) → review+
Updated 6 years ago
Product: Cloud Services → Cloud Services Graveyard