Bug 1223045 (Closed): Parquet converter should emit dimensions with Spark-compatible syntax
Opened 9 years ago, closed 9 years ago
Categories: Cloud Services Graveyard :: Metrics: Pipeline, defect, P4
Tracking: Not tracked
Status: RESOLVED FIXED
People: Reporter: rvitillo, Assignee: rvitillo
Description
Dimensions in S3 should match the "column_name=value" syntax [1], which allows Spark to infer the partitioning scheme automatically.
[1] http://spark.apache.org/docs/latest/sql-programming-guide.html#partition-discovery
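For context, a minimal sketch of how Spark consumes this layout, assuming PySpark and a hypothetical bucket layout such as s3://telemetry-parquet/dataset/submission_date=20151110/ (the bucket, dataset, and column names are illustrative, not the actual pipeline paths):

    from pyspark.sql import SparkSession

    # SparkSession is the Spark 2.x+ entry point; earlier pipelines used SQLContext.
    spark = SparkSession.builder.appName("partition-discovery-example").getOrCreate()

    # Pointing at the dataset root is enough: Spark walks the directory tree and
    # turns each "column_name=value" path component into a partition column.
    df = spark.read.parquet("s3://telemetry-parquet/dataset/")

    # "submission_date" is now exposed as a regular column, and filtering on it
    # lets Spark prune the corresponding S3 partitions at read time.
    df.filter(df.submission_date == "20151110").show()

Without the "column_name=value" directories, Spark would still read the Parquet files but could not recover the dimensions as columns unless a schema were supplied explicitly.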
History
Updated 9 years ago: Priority: -- → P4
Updated 9 years ago: Assignee: nobody → rvitillo
Updated 9 years ago: Status: NEW → RESOLVED; Resolution: --- → FIXED; Closed: 9 years ago
Updated 6 years ago: Product: Cloud Services → Cloud Services Graveyard