Closed Bug 1609546 Opened 4 years ago Closed 4 years ago

spark-bigquery connector fails with 'INTERNAL: request failed: internal error' on payload_bytes datasets

Categories

(Data Platform and Tools :: General, defect, P1)

Points:
2

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: amiyaguchi, Unassigned)

References

Details

The spark-bigquery connector uses the BigQuery Storage API to access tables within Spark. This is currently being used for mozaggregator and the prio-processor jobs. This is being tracked in the Google Cloud issue tracker here: https://issuetracker.google.com/issues/147113808
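For context, a typical read through the spark-bigquery connector (and therefore through the BigQuery Storage API, where the error surfaced) looks roughly like the sketch below. The project and table names are illustrative placeholders, not taken from this bug, and the connector version in the comment is an example, not a pinned requirement:

```python
# Minimal sketch of reading a BigQuery table from Spark via the
# spark-bigquery connector. Requires the connector jar on the classpath,
# e.g. launched with:
#   --packages com.google.cloud.spark:spark-bigquery-with-dependencies_2.12:<version>
# and valid GCP credentials; it will not run standalone.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("payload-bytes-read").getOrCreate()

df = (
    spark.read.format("bigquery")
    # Placeholder table reference; the payload_bytes datasets named in the
    # bug title would be referenced the same way.
    .option("table", "my-gcp-project.payload_bytes_decoded.example_table")
    .load()
)
df.printSchema()
```

Reads like this go through the Storage API rather than a query job, which is why the failure appeared in the connector even when the same tables queried fine in the BigQuery console.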

Blocks: 1605442
Blocks: 1609548

Looks like the fix should roll out this week.

From the linked issue, the fix went out on Feb 6 - Anthony, are you still seeing the same behaviour?

Flags: needinfo?(amiyaguchi)

As of 2020-02-11, I was still seeing the same behavior as reported in case 21629086 in the airflow-dataproc-prod project. I received a followup on 2020-02-14 that another fix would be rolled out and announced on the public issue page, which has not happened yet.

Flags: needinfo?(amiyaguchi)

From the linked issue, this seems to be fixed now. We talked about this last week, and IIRC there's another related problem now - can you add an update here?

Flags: needinfo?(amiyaguchi)
Points: --- → 2
Priority: -- → P1

This has been resolved as of 2020-03-02.

Status: NEW → RESOLVED
Closed: 4 years ago
Flags: needinfo?(amiyaguchi)
Resolution: --- → FIXED