Using a Spark cluster name with spaces triggers an internal server error


Status

RESOLVED FIXED

People

(Reporter: Dexter, Unassigned)

Tracking

Trunk

Firefox Tracking Flags

(firefox43 affected)


(Reporter)

Description

3 years ago
I tried to create an ad-hoc Spark cluster using the following name (yeah, I know):

"Duplicated subsessionIds (aborted-session and shutdown pings)analysis"

It didn't complain or report any error during validation (as it does for invalid public keys); instead, the request failed with an HTTP 500 Internal Server Error and the following message:

"The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application."

Using a cluster name with no spaces or dashes works.

Comment 1

3 years ago
Cluster names containing spaces should now be rejected in the new analysis service at
https://analysis.telemetry.mozilla.org

PR:
https://github.com/mozilla/telemetry-server/pull/126
Status: NEW → RESOLVED
Last Resolved: 3 years ago
Resolution: --- → FIXED
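
A minimal sketch of the kind of server-side check described in comment 1, assuming a Python helper with a hypothetical name and a hypothetical allowed character set (the actual rule in the linked PR may differ):

import re

# Hypothetical validation rule: allow only letters, digits, hyphens and
# underscores, so the name is safe to pass to the cluster provisioning step.
VALID_CLUSTER_NAME = re.compile(r'^[A-Za-z0-9_-]{1,100}$')

def validate_cluster_name(name):
    """Return an error message for an invalid name, or None if it is acceptable."""
    if not VALID_CLUSTER_NAME.match(name):
        return ("Invalid cluster name: use only letters, digits, "
                "hyphens and underscores (no spaces).")
    return None

# The name from the bug report is rejected up front with a clear message
# instead of surfacing as an HTTP 500 from the provisioning step.
print(validate_cluster_name(
    "Duplicated subsessionIds (aborted-session and shutdown pings)analysis"))
print(validate_cluster_name("duplicated-subsessionids-analysis"))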