Closed
Bug 1580957
Opened 6 years ago
Closed 6 years ago
Allow databricks-ec2 access to telemetry-spark-emr-2 bucket
Categories
(Data Platform and Tools Graveyard :: Operations, task, P1)
Data Platform and Tools Graveyard
Operations
Tracking
(Not tracked)
RESOLVED
FIXED
People
(Reporter: amiyaguchi, Assigned: robotblake)
References
Details
Currently Databricks does not have access to telemetry-spark-emr-2. This bucket contains credentials for various services that run on EMR. I am porting mozaggregator to run on Databricks, as part of an eventual port of the service to GCP. Currently the job fails with the following message:
ClientError: An error occurred (AccessDenied) when calling the GetObject operation: Access Denied
This can be tested with the following code:
import boto3
import json

# Bucket and object key holding the aggregator dev database credentials
credentials_bucket = "telemetry-spark-emr-2"
credentials_prefix = "aggregator_dev_database_envvars.json"

s3 = boto3.resource("s3")
obj = s3.Object(credentials_bucket, credentials_prefix)
# Raises ClientError (AccessDenied) until the Databricks role is granted s3:GetObject
creds = json.loads(obj.get()["Body"].read().decode("utf-8"))
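The bug does not record how access was ultimately granted, but one common way is an S3 bucket policy allowing the Databricks instance role to read objects. The sketch below is illustrative only: the account ID and role name (`databricks-ec2`) are placeholders, and the actual fix may have been made via an IAM policy attached to the role instead.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowDatabricksReadCredentials",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::123456789012:role/databricks-ec2"
      },
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::telemetry-spark-emr-2/*"
    }
  ]
}
```

With a statement like this in place, the `GetObject` call in the repro above should succeed for principals assuming that role.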
Updated•6 years ago
Assignee: nobody → bimsland
Priority: -- → P1
Comment 1•6 years ago
This is working now.
Status: NEW → RESOLVED
Closed: 6 years ago
Resolution: --- → FIXED
Updated•3 years ago
|
Product: Data Platform and Tools → Data Platform and Tools Graveyard