Closed
Bug 1315355
Opened 8 years ago
Closed 8 years ago
Add Arguments for Installing Dependencies to Spark Cluster
Categories
(Cloud Services Graveyard :: Metrics: Pipeline, defect, P3)
Tracking
(Not tracked)
RESOLVED
INVALID
People
(Reporter: frank, Unassigned)
References
Details
Right now, if a scheduled job wants to use external dependencies that we don't already include, the author has to either download them in the notebook and use them locally, or do some Spark magic to create a package egg and distribute it to the cluster.
We should have a simple argument for telemetry.sh that takes a comma-separated list of dependencies to install. Eventually this could even be added to ATMO, where dependencies could be specified before booting up a cluster or running a scheduled job.
It's unclear whether this is needed for Scala, but a first pass can cover just Python dependencies.
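A rough sketch of what that telemetry.sh argument could drive on each cluster node. The `--deps` flag name, the `parse_deps`/`install_deps` helpers, and the package names are all hypothetical, for illustration only; the actual flag and install mechanism would be decided when implementing.

```python
import subprocess

def parse_deps(dep_csv):
    """Split a comma-separated dependency string into package names.

    Tolerates stray whitespace and empty entries, e.g. from a
    hypothetical `telemetry.sh --deps "numpy, pandas"` argument.
    """
    return [d.strip() for d in dep_csv.split(",") if d.strip()]

def install_deps(dep_csv):
    """pip-install each requested package on the current node (sketch)."""
    for dep in parse_deps(dep_csv):
        subprocess.check_call(["pip", "install", dep])
```

Parsing is kept separate from installation so the same list can later be validated or surfaced in an ATMO form before a cluster boots.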
Updated•8 years ago
Summary: Add Arguments for Installing Dependencies → Add Arguments for Installing Dependencies to Spark Cluster
Updated•8 years ago
Priority: -- → P3
Comment 1•8 years ago
Status: NEW → RESOLVED
Closed: 8 years ago
Resolution: --- → INVALID
Updated•6 years ago
Product: Cloud Services → Cloud Services Graveyard