spark-bigquery connector fails with 'INTERNAL: request failed: internal error' on payload_bytes datasets
Categories
(Data Platform and Tools :: General, defect, P1)
Tracking
(Not tracked)
People
(Reporter: amiyaguchi, Unassigned)
References
Details
The spark-bigquery connector uses the BigQuery Storage API to access tables within Spark. This is currently being used for mozaggregator and the prio-processor jobs. This is being tracked in the Google Cloud issue tracker here: https://issuetracker.google.com/issues/147113808
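The read path described above can be sketched as a minimal configuration. This is a hedged illustration, not the actual job code: the project, dataset, and table names are hypothetical placeholders, and `readDataFormat` is one of the connector's documented options for choosing the Storage API wire format.

```python
# Sketch: the option map a Spark job would pass to
# spark.read.format("bigquery"), which reads the table through the
# BigQuery Storage API. Names below are illustrative assumptions.
def bigquery_read_options(project, dataset, table):
    """Return connector options for reading project.dataset.table."""
    return {
        # Fully qualified table reference understood by the connector.
        "table": f"{project}.{dataset}.{table}",
        # Wire format for the Storage API read streams (AVRO or ARROW).
        "readDataFormat": "AVRO",
    }

opts = bigquery_read_options("my-project", "payload_bytes", "main_v4")
print(opts["table"])
```

In a real Dataproc job these options would be applied as `spark.read.format("bigquery").options(**opts).load()`, assuming the connector jar is on the driver and executor classpath.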
Comment 1•4 years ago
Looks like the fix should roll out this week.
Comment 2•4 years ago
From the linked issue, the fix went out on Feb 6. Anthony, are you still seeing the same behaviour?
Reporter
Comment 3•4 years ago
As of 2020-02-11, I still saw the same behavior as reported in case 21629086 in the airflow-dataproc-prod project. I received a followup on 2020-02-14 saying that another fix would be rolled out and announced on the public issue page, which has not happened yet.
Comment 4•4 years ago
From the linked issue, this seems to be fixed now. We talked about this last week, and IIRC there's another related problem now - can you add an update here?
Updated•4 years ago
Reporter
Comment 5•4 years ago
This has been resolved as of 2020-03-02.