Closed
Bug 1448389
Opened 7 years ago
Closed 6 years ago
Consider expanding `spark.databricks.queryWatchdog.maxQueryTasks` on shared_serverless
Categories: Data Platform and Tools :: General (enhancement)
Tracking: Not tracked
Resolution: RESOLVED DUPLICATE of bug 1522682
People
(Reporter: frank, Unassigned)
Details
Shared serverless is great because it reduces the friction of using Databricks. However, many people have reported hitting the query watchdog task limit after launching large jobs.
We should evaluate the pros and cons of raising this limit. The error is the following:
```
Error in SQL statement: SparkException: The query is not executed because it tries to launch 54794 tasks in a single stage, while the maximum allowed tasks one query can launch is 20000; this limit can be modified with configuration parameter "spark.databricks.queryWatchdog.maxQueryTasks".
```
Temporary workaround for users: launch your own Spark cluster and set the limit in a notebook like so:
```
spark.conf.set("spark.databricks.queryWatchdog.maxQueryTasks", 100000)
```
Comment 1•7 years ago
I ran into this and discovered that I could just run `SET spark.databricks.queryWatchdog.maxQueryTasks = 100000` as Spark SQL to override the limit for the current session.
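For reference, that session-level override as a standalone Spark SQL cell would look like the following (the value 100000 simply mirrors the workaround above, not an endorsed ceiling):
```
SET spark.databricks.queryWatchdog.maxQueryTasks = 100000
```
This only applies to the current session, so it avoids changing the shared cluster's default for other users.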
Updated•6 years ago
Status: NEW → RESOLVED
Closed: 6 years ago
Resolution: --- → DUPLICATE
Updated•3 years ago
Component: Spark → General