[traceback] NumericValueOutOfRange: putting 2,644,960,066 in upload_upload.size field
Categories: Tecken :: Upload, defect, P1
Tracking: Not tracked
People: Reporter: willkg, Assigned: willkg
Attachments: 2 files
Sentry: https://sentry.io/organizations/mozilla/issues/3683605292/
Release: 2022.10.11:de6a883f
NumericValueOutOfRange: integer out of range
File "django/db/backends/utils.py", line 84, in _execute
return self.cursor.execute(sql, params)
DataError: integer out of range
File "django/core/handlers/exception.py", line 47, in inner
response = get_response(request)
File "django/core/handlers/base.py", line 181, in _get_response
response = wrapped_callback(request, *callback_args, **callback_kwargs)
File "markus/main.py", line 515, in _timer_decorator
return fun(*args, **kwargs)
File "tecken/base/decorators.py", line 127, in inner
return func(request, *args, **kwargs)
File "django/views/decorators/csrf.py", line 54, in wrapped_view
return view_func(*args, **kwargs)
File "tecken/base/decorators.py", line 29, in inner
return view_func(request, *args, **kwargs)
File "django/contrib/auth/decorators.py", line 21, in _wrapped_view
return view_func(request, *args, **kwargs)
File "tecken/base/decorators.py", line 200, in inner
return func(*args, **kwargs)
File "tecken/upload/views.py", line 353, in upload_archive
upload_obj = Upload.objects.create(
File "django/db/models/manager.py", line 85, in manager_method
return getattr(self.get_queryset(), name)(*args, **kwargs)
File "django/db/models/query.py", line 453, in create
obj.save(force_insert=True, using=self.db)
File "django/db/models/base.py", line 739, in save
self.save_base(using=using, force_insert=force_insert,
File "django/db/models/base.py", line 776, in save_base
updated = self._save_table(
File "django/db/models/base.py", line 881, in _save_table
results = self._do_insert(cls._base_manager, using, fields, returning_fields, raw)
File "django/db/models/base.py", line 919, in _do_insert
return manager._insert(
File "django/db/models/manager.py", line 85, in manager_method
return getattr(self.get_queryset(), name)(*args, **kwargs)
File "django/db/models/query.py", line 1270, in _insert
return query.get_compiler(using=using).execute_sql(returning_fields)
File "django/db/models/sql/compiler.py", line 1416, in execute_sql
cursor.execute(sql, params)
File "django/db/backends/utils.py", line 66, in execute
return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
File "django/db/backends/utils.py", line 75, in _execute_with_wrappers
return executor(sql, params, many, context)
File "django/db/backends/utils.py", line 84, in _execute
return self.cursor.execute(sql, params)
File "django/db/utils.py", line 90, in __exit__
raise dj_exc_value.with_traceback(traceback) from exc_value
File "django/db/backends/utils.py", line 84, in _execute
return self.cursor.execute(sql, params)
The number is 2,644,960,066, which is the size of the .zip file of symbols being uploaded.
Comment 1•2 years ago
I haven't seen this happen before, so I think this is new. There are 16 instances of the error in Sentry right now. Sym files generally keep getting bigger, so we were probably bound to hit this issue eventually. This particular upload uses upload-by-download, which isn't affected by connection timeouts, max payload size, and so on.
That field is defined as a PositiveIntegerField, which corresponds to a PostgreSQL integer, which maxes out at 2,147,483,647. I think we need to switch to a PositiveBigIntegerField and then we'll be OK.
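The limits work out like this (a quick arithmetic sketch, not Tecken code):

```python
# PostgreSQL column ranges backing the two Django field types.
INTEGER_MAX = 2**31 - 1  # "integer", used by PositiveIntegerField: 2,147,483,647
BIGINT_MAX = 2**63 - 1   # "bigint", used by PositiveBigIntegerField

upload_size = 2_644_960_066  # size of the .zip from the traceback above

# The failing insert: the value doesn't fit in a 4-byte integer...
print(upload_size > INTEGER_MAX)   # True -> NumericValueOutOfRange

# ...but fits comfortably in an 8-byte bigint.
print(upload_size <= BIGINT_MAX)   # True
```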
Unfortunately, that table is big (which causes other problems) so I suspect this is a major migration that will require a service outage to do.
While the size of the zip file is interesting, maybe we don't actually need the exact value: we could check for the max and insert a fake "very large" sentinel, or clamp to the max value. I don't know what effects that kind of lying would have on the rest of the system.
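If we went the clamping route instead of changing the schema, it would look something like this (a hypothetical helper, not in Tecken; `PG_INTEGER_MAX` and `clamp_size` are assumed names):

```python
PG_INTEGER_MAX = 2**31 - 1  # max for a PostgreSQL "integer" column


def clamp_size(size: int) -> int:
    """Clamp an upload size so it fits the existing integer column.

    This loses information for very large uploads, which is the
    downside discussed above.
    """
    return min(size, PG_INTEGER_MAX)


print(clamp_size(2_644_960_066))  # 2147483647 -- the real size is lost
print(clamp_size(1_000_000))      # 1000000 -- small sizes pass through
```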
Updated•2 years ago
Comment 2•2 years ago
Comment 3•2 years ago
This requires a database migration that alters the upload_upload table. That table has roughly 300k rows, so we need to plan an outage.
I wrote up a Jira ticket to coordinate that with SRE: https://mozilla-hub.atlassian.net/browse/DSRE-1078
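The schema change itself would look roughly like this (a hypothetical Django migration sketch; the app label, migration dependency, and model/field names are assumptions, not the actual Tecken migration):

```python
from django.db import migrations, models


class Migration(migrations.Migration):
    # Assumed dependency; the real previous migration name will differ.
    dependencies = [("upload", "0001_initial")]

    operations = [
        # Widens upload_upload.size from "integer" to "bigint" in
        # PostgreSQL. Altering the column type on a large table is
        # what makes the planned outage necessary.
        migrations.AlterField(
            model_name="upload",
            name="size",
            field=models.PositiveBigIntegerField(),
        ),
    ]
```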
Comment 4•2 years ago
Comment 5•2 years ago
We ran the migrations in stage today. Everything went fine.
We deployed the code to prod just now in bug #1798788, with the auto-running-migrations post-deploy step disabled.
We'll run the migrations tomorrow.
Comment 6•2 years ago
We ran the migrations in prod today. Everything went fine.
The migration took about 9 minutes to run.
Marking as FIXED.
Comment 7•2 years ago
Comment 8•2 years ago