Bug 649254 (Closed) · Opened 13 years ago · Closed 13 years ago

Bugzilla DB servers in PHX need more RAM

Categories

(mozilla.org Graveyard :: Server Operations, task)

Platform: All
OS: Other
Type: task
Priority: Not set
Severity: normal

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: justdave, Assigned: phong)

References

Details

(Whiteboard: [FedEX: 005794545885363])

-------- Storage Engine Statistics -------------------------------------------
[--] Status: -Archive +BDB -Federated +InnoDB -ISAM -NDBCluster 
[--] Data in MyISAM tables: 1G (Tables: 1)
[--] Data in InnoDB tables: 25G (Tables: 87)

-------- Performance Metrics -------------------------------------------------
[--] Total buffers: 24.3G global + 10.5M per thread (1200 max threads)
[!!] Maximum possible memory usage: 36.6G (155% of installed RAM)

I can (and probably should) get the max memory usage under 32 GB by dropping the max clients to 600.
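The 36.6 G figure above is just the global buffers plus the worst case where every allowed connection allocates its full per-thread buffers. A minimal sketch of that arithmetic, using the numbers from the report (the helper function name is hypothetical, not part of MySQLTuner):

```python
MIB_PER_GIB = 1024  # MiB per GiB

def max_possible_memory_gib(global_buffers_gib, per_thread_mib, max_connections):
    """Worst-case MySQL memory: global buffers plus per-thread
    buffers for every allowed connection."""
    return global_buffers_gib + (per_thread_mib * max_connections) / MIB_PER_GIB

# Current settings: 24.3G global + 10.5M per thread, 1200 max threads
print(round(max_possible_memory_gib(24.3, 10.5, 1200), 1))  # → 36.6

# Dropping max clients to 600 brings the worst case under 32 GiB
print(round(max_possible_memory_gib(24.3, 10.5, 600), 1))   # → 30.5
```

This is why halving max clients to 600 is enough to fit under a 32 GB ceiling even before the RAM upgrade.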

However, all three machines currently have only 24 GB of RAM.  We should really up them to 32 GB.

Doing this before we switch over from San Jose would be nice, but at this point that's probably pushing it.

tp-bugs01-master01.phx.mozilla.com:    productname: ProLiant BL460c G6
tp-bugs01-slave01.phx.mozilla.com:    productname: ProLiant BL460c G6
tp-bugs01-slave02.phx.mozilla.com:    productname: ProLiant BL460c G6
Assignee: server-ops → phong
Flags: colo-trip+
RAM (9 x 8 GB DIMMs) is being ordered. I will follow up with tracking information when it is available.  ETA to Phoenix is Monday.
According to FedEx, the new RAM was delivered on Friday.
Blocks: 628372
Group: infra
Best to reach out to remote hands to handle this.
justdave: is it safe to assume all of these can be shut down at any time for the RAM upgrade?  I can email I/O to get someone to pick it up from shipping and install it.
Whiteboard: [FedEX: 005794545885363]
New RAM was installed this morning.  These are all at 36 GB now.
Status: NEW → RESOLVED
Closed: 13 years ago
Resolution: --- → FIXED
Product: mozilla.org → mozilla.org Graveyard