submit_dump_to_staging.sh is an old script on sp-admin01 (the Socorro prod admin box, where we run cron jobs and such) which sends crashes from production to staging for testing purposes. Our staging setup hadn't been fully working for a couple of months; we finally got everything going a few weeks ago, so this script has been running non-stop.

I discovered earlier this evening that we weren't getting any test crashes in stage, and I see that submit_dump_to_staging.sh stopped around 10AM today with this in the logs:

  xargs: `/dev/null': Too many open files in system
  xargs: /data/bin/submit_dump_to_staging.sh: Too many open files in system
  /data/bin/submit_dumps_to_staging.sh: line 22: /bin/rm: Too many open files in system

Pretty sure this is new, and I suspect it is something this script is doing. We should keep an eye on it, and it's probably time to rewrite the script anyway (the locking breaks all the time, and it's more complex than it needs to be).
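For the eventual rewrite, the fragile hand-rolled locking could be replaced with flock(1). This is only a hypothetical sketch, not the actual script: the dump directory, lock file path, and submit step are all assumed names, and the real submission (e.g. a curl POST to the staging collector) is elided.

```shell
#!/bin/sh
# Hypothetical sketch of a simpler submit loop using flock(1) for locking.
# LOCKFILE and DUMP_DIR are assumed names, not from the real script.
LOCKFILE="${TMPDIR:-/tmp}/submit_dump_to_staging.lock"
DUMP_DIR="${DUMP_DIR:-/data/socorro/dumps}"

(
  # Take an exclusive, non-blocking lock on fd 9; if another run already
  # holds it, exit quietly instead of piling up stale lock files.
  flock -n 9 || exit 0

  for dump in "$DUMP_DIR"/*.dump; do
    [ -e "$dump" ] || break        # glob matched nothing; nothing to do
    # ...submit "$dump" to the staging collector here..., then clean up:
    rm -f -- "$dump"
  done
) 9>"$LOCKFILE"
```

The lock is tied to an open file descriptor, so the kernel releases it automatically when the subshell exits, even if the script crashes mid-run.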
ashish investigated; from IRC:

  22:16 < ashish> also a fairly low file-max
  22:16 < ashish> # cat /proc/sys/fs/file-nr
  22:16 < ashish> 1760 0 32768
  22:25 < ashish> rhelmer: bumped it to 65536. ping me if you see any breakage

Will reopen if we see any further problems. I'll file a separate bug to rewrite that submit script.
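For reference, this is the standard Linux procedure behind what ashish did (not from the bug itself): /proc/sys/fs/file-nr reports three fields, and the limit it reflects is the fs.file-max sysctl.

```shell
# Inspect system-wide open-file usage. The three fields of
# /proc/sys/fs/file-nr are: allocated handles, free handles
# (always 0 on modern kernels), and the current maximum.
read allocated free max < /proc/sys/fs/file-nr
echo "allocated=${allocated} max=${max}"

# Raising the limit (as root) is a one-liner, e.g. the 32768 -> 65536 bump:
#   sysctl -w fs.file-max=65536
# To persist across reboots, add "fs.file-max = 65536" to /etc/sysctl.conf.
```

In the transcript above, 1760 handles allocated out of a 32768 maximum looks healthy at a glance, so a leak (such as the suspected fork bomb) that burns through the remaining headroom quickly fits the symptoms.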
Status: NEW → RESOLVED
Last Resolved: 6 years ago
Resolution: --- → FIXED
This was actually probably my fault. I had made a change to my .bashrc which inadvertently caused a fork bomb of sorts...
Product: mozilla.org → mozilla.org Graveyard