Closed Bug 1315355 Opened 8 years ago Closed 8 years ago

Add Arguments for Installing Dependencies to Spark Cluster

Categories

(Cloud Services Graveyard :: Metrics: Pipeline, defect, P3)


Tracking

(Not tracked)

RESOLVED INVALID

People

(Reporter: frank, Unassigned)

References

Details

Right now, if a scheduled job wants to include external dependencies that we don't already provide, the owner has to either download them in the notebook and use them locally, or do some Spark magic to build a package egg and distribute it to the cluster. We should add a simple argument to telemetry.sh that takes a comma-separated list of dependencies to install. Eventually this could even be surfaced in ATMO, so that dependencies can be specified before booting a cluster or scheduling a job. It's unclear whether this is needed for Scala, but a first pass can cover just Python dependencies.
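
For illustration, a minimal sketch of how telemetry.sh could accept and handle such an argument. The --python-deps flag name and the pip-based install loop are hypothetical assumptions, not part of the existing script:

    #!/bin/bash
    # Hypothetical usage: ./telemetry.sh --python-deps "pandas,scipy,requests"
    PYTHON_DEPS=""
    while [[ $# -gt 0 ]]; do
        case "$1" in
            --python-deps)
                PYTHON_DEPS="$2"   # comma-separated list of packages
                shift 2
                ;;
            *)
                shift
                ;;
        esac
    done

    # Install each requested package on this node before the job starts.
    if [[ -n "$PYTHON_DEPS" ]]; then
        IFS=',' read -ra DEPS <<< "$PYTHON_DEPS"
        for dep in "${DEPS[@]}"; do
            pip install --user "$dep"
        done
    fi

Note that for the packages to be visible to executors, the same install step would have to run on every worker node (e.g., as part of cluster bootstrap), not just on the driver.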
Summary: Add Arguments for Installing Dependencies → Add Arguments for Installing Dependencies to Spark Cluster
Blocks: 1248688
No longer blocks: 1284522
Priority: -- → P3
Depends on: 1312747
Status: NEW → RESOLVED
Closed: 8 years ago
Resolution: --- → INVALID
Product: Cloud Services → Cloud Services Graveyard