Closed
Bug 59390
Opened 24 years ago
Closed 16 years ago
Abort large (>1MB) attachments before uploading and with a reasonable error message
Categories
(Bugzilla :: Attachments & Requests, defect, P3)
Tracking
RESOLVED
WONTFIX
People
(Reporter: sgifford+mozilla-old, Unassigned)
References
Details
I'm trying to attach the file at http://www.tir.com/~sgifford/mozilla.crash.out to a bug report (bug #59383). When I do, it uploads for a while, then I see:

insert into attachments (bug_id, filename, description, mimetype, ispatch, submitter_id, thedata)
values (59383, 'mozilla.crash.out', 'Console output of Mozilla before and during crash.', 'text/plain', 0, 13170,
'dist/bin/run-mozilla.sh dist/bin/mozilla-bin
MOZILLA_FIVE_HOME=/home/sgifford/src/CVS/mozilla/dist/bin
LD_LIBRARY_PATH=/home/sgifford/src/CVS/mozilla/dist/bin
[ ... ]
Error loading URL http://bugzilla.mozilla.org/process_bug.cgi: 804b001e
we don\'t handle eBorderStyle_close yet... please fix me
WEBSHELL+ = 20
nsWidget::~nsWidget() of toplevel: 197 widgets still exist.
WEBSHELL- = 19
'): MySQL server has gone away at globals.pl line 134, <STDIN> chunk 31284

The last line looks like it's probably the error. Is this a known limitation?
Comment 2•24 years ago
The upper limit for the attachment size is a local configuration issue of the MySQL database. So reassigning to mozilla.org / Misc.
Assignee: tara → mitchell
Component: Bugzilla → Miscellaneous
OS: Linux → All
Product: Webtools → mozilla.org
Hardware: PC → All
Comment 3•24 years ago
Reassigning to Dawn, current keeper of mozilla.org's Bugzilla implementation
Assignee: mitchell → endico
Comment 4•24 years ago
This sounds like a fine thing to me. The database would bloat without bounds if we allowed huge attachments. Making large files like this available on the web, as you did, seems like the best way to handle them.
Status: NEW → RESOLVED
Closed: 24 years ago
Resolution: --- → WONTFIX
Reporter
Comment 5•24 years ago
A more obvious error message would probably be a good thing, though... and it would be nice if the error fired before waiting for a 1MB file to upload. That should be straightforward: a Content-Length header arrives with the request first thing, so we could just emit an error immediately and drop the upload.
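The pre-check described above can be sketched in Perl (Bugzilla's implementation language). This is an illustrative sketch, not Bugzilla's actual code; the 1MB limit and the response text are assumptions. The web server exposes the request's Content-Length header as the CONTENT_LENGTH environment variable, so a CGI script can inspect it before reading any of the body:

```perl
#!/usr/bin/perl -w
use strict;

# Hypothetical limit for illustration only.
my $max_bytes = 1_000_000;

# CONTENT_LENGTH comes from the HTTP Content-Length header,
# so it is available before we touch STDIN at all.
my $len = $ENV{CONTENT_LENGTH} || 0;

if ($len > $max_bytes) {
    # Emit the error immediately and skip parsing the upload.
    print "Status: 413 Request Entity Too Large\n";
    print "Content-Type: text/plain\n\n";
    print "Attachment is $len bytes; the limit is $max_bytes bytes.\n";
    exit;
}

# ...otherwise proceed to parse the POST body as usual.
```

One caveat, which comment 18 below eventually runs into: this avoids parsing the oversized body, but the browser may still transmit it regardless, so the transfer itself is not necessarily saved.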
Comment 7•24 years ago
Reopening and morphing the bug according to the above suggestion.
Status: RESOLVED → REOPENED
Component: Miscellaneous → Bugzilla
Product: mozilla.org → Webtools
Resolution: WONTFIX → ---
Summary: Unable to attach a 1MB file to a bug → Abort large (>1MB) attachments before uploading and with a reasonable error message
Version: other → Bugzilla 2.11
Comment 8•23 years ago
Reassigning back to Tara, then... but it would probably be hard to read the MySQL setting, not to mention that doing so would add a further database dependence. I suggest we set up a parameter in Bugzilla and trust the admin to set it correctly.
Assignee: endico → tara
Status: REOPENED → NEW
Updated•23 years ago
Target Milestone: --- → Bugzilla 2.16
Comment 9•23 years ago
Oh, I moved this to 2.16 because it's a common problem.
Comment 10•23 years ago
... and once it's fixed we should put some sort of release note about it.
Comment 11•23 years ago
*** Bug 92405 has been marked as a duplicate of this bug. ***
Comment 12•23 years ago
-> Bugzilla product, Creating/Changing Bugs component; reassigning. There is also bug 57819, which I think is about truncating large error messages.
Assignee: tara → myk
Component: Bugzilla → Creating/Changing Bugs
Product: Webtools → Bugzilla
Version: Bugzilla 2.11 → 2.11
Comment 13•23 years ago
There's a patch to add the maximum size as a param to the DB. Can we tell the content size straight off from the HTTP headers? If so, we could produce a sensible error.

Gerv
Comment 14•23 years ago
We are currently trying to wrap up Bugzilla 2.16. We are now close enough to release time that anything that wasn't already ranked at P1 isn't going to make the cut, so this is being retargeted at 2.18. If you strongly disagree with this retargeting, please comment; however, be aware that we only have about two weeks left to review and test anything at this point, and we intend to devote that time to the remaining bugs that were designated as release blockers.
Target Milestone: Bugzilla 2.16 → Bugzilla 2.18
Component: Creating/Changing Bugs → attachment and request management
Comment 15•20 years ago
Unloved bugs targeted for 2.18 but untouched since 9-15-2003 are being retargeted to 2.20. If you plan to act on one immediately, go ahead and pull it back to 2.18.
Target Milestone: Bugzilla 2.18 → Bugzilla 2.20
Comment 16•19 years ago
This bug has not been touched by its owner in over six months, even though it is targeted to 2.20, for which the freeze is 10 days away. Unsetting the target milestone, on the assumption that nobody is actually working on it or has any plans to soon. If you are the owner, and you plan to work on the bug, please give it a real target milestone. If you are the owner, and you do *not* plan to work on it, please reassign it to nobody@bugzilla.org or a .bugs component owner. If you are *anybody*, and you get this comment, and *you* plan to work on the bug, please reassign it to yourself if you have the ability.
Target Milestone: Bugzilla 2.20 → ---
Updated•18 years ago
QA Contact: mattyt-bugzilla → default-qa
Updated•18 years ago
Assignee: myk → attach-and-request
Comment 17•18 years ago
Wow, this bug is old. :)

To update anyone reading this bug: yes, we now have a param for a maximum attachment size (and also a maximum patch size, which allows a different limit for patch files). We also provide a quite usable error message that details exactly what happened. Unfortunately, we still have to wait for the file to upload before we can throw the error.

I was just investigating it this afternoon, and it doesn't appear that Perl's CGI module supports cutting the upload off early at all. There is a POST_MAX variable that we can set to the maximum number of bytes to allow in a POST, but that affects the entire POST and not just the attached file. Looking at the source to CGI, it doesn't look like it aborts immediately when it hits that limit, either; it still reads the entire data being sent from the client and just discards it.

The best bet at fixing this now is probably to patch CGI to allow an immediate abort when Content-Length is too big, and get that patch upstream to the CGI maintainer. Once it's supported in CGI, we can do something about it in Bugzilla.
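For reference, the POST_MAX mechanism described above looks roughly like this in CGI.pm-based code. This is a minimal sketch, assuming a hypothetical 10K cap; it is not Bugzilla's actual attachment code:

```perl
#!/usr/bin/perl -w
use strict;
use CGI;

# Cap the total POST size. Note this applies to the whole POST body,
# not just the attachment field, and (per the comment above) CGI.pm
# still reads and discards the oversized body rather than aborting
# the transfer early.
$CGI::POST_MAX = 10_000;    # illustrative 10K cap

my $cgi = CGI->new;

# When POST_MAX is exceeded, CGI.pm records an error such as
# "413 Request entity too large" retrievable via cgi_error().
if (my $error = $cgi->cgi_error) {
    print $cgi->header(-status => $error);
    print "Upload rejected: $error\n";
    exit;
}

# ...otherwise handle the upload normally.
```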
Comment 18•16 years ago
(In reply to comment #17)
> There is a POST_MAX variable that we can set to the maximum number of bytes to
> allow in a POST, but that affects the entire POST and not just the attached
> file. Looking at the source to CGI, it doesn't look like it aborts immediately
> when it hits, either, it still reads the entire data being sent from the client
> and just discards it.

Right. I just tested this now, setting the limit to 10K: uploading a huge 540MB file still uploads the whole file before complaining that it exceeds the limit of 10K. So $CGI::POST_MAX doesn't help here. It would also depend on the web browser sending this information early, before uploading the file. So, as this is technically not doable (at least for now), I'm resolving this bug as WONTFIX.
Status: NEW → RESOLVED
Closed: 24 years ago → 16 years ago
Resolution: --- → WONTFIX