Closed
Bug 869749
Opened 11 years ago
Closed 11 years ago
Use local cache to speed up sql import
Categories: Release Engineering :: General, defect
Tracking: Not tracked
Status: RESOLVED FIXED
People: catlee (Reporter), Unassigned
References
Details
Attachments
(1 file)
8.08 KB, patch
nthomas: review+
catlee: checked-in+
The sql import of the build data into statusdb does a lot of SELECTs to find if objects exist first, and then INSERT/UPDATEs as required. Using a local cache (memcached? redis?) to cache the sql id for various types of objects could help speed up the import of build data quite a bit.
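The caching idea can be sketched as a small get-or-create helper that memoizes object-to-id lookups, so repeated imports skip the SELECT entirely. This is a minimal sketch of the pattern only; `lookup_id` and `insert_row` are hypothetical stand-ins, not the actual statusdb API.

```python
# Local cache mapping (table, key) -> sql id. A process-local dict stands in
# for memcached/redis here; the cache layer is the point, not the backend.
_id_cache = {}

def get_or_create_id(table, key, lookup_id, insert_row):
    """Return the sql id for (table, key), consulting the local cache first."""
    cache_key = (table, key)
    if cache_key in _id_cache:
        return _id_cache[cache_key]      # cache hit: no SELECT needed
    row_id = lookup_id(table, key)       # SELECT ... WHERE key = ?
    if row_id is None:
        row_id = insert_row(table, key)  # INSERT, returning the new id
    _id_cache[cache_key] = row_id
    return row_id
```

On a warm cache, each previously seen object costs a dict lookup instead of a database round trip.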
Comment 1•11 years ago
We've been hitting nagios alerts about Command Queues again, and it's usually from inserting builds with a large number of changed files, e.g. http://hg.mozilla.org/integration/mozilla-inbound/pushloghtml?fromchange=f79c7d545070&tochange=7e5522d403c1 The jobs time out after 2 minutes, but I haven't tried to confirm it's the issue mentioned here. It might be a DB perf problem, or just more jobs going through the system. Bug 882004 filed to add ganglia or some other kind of monitoring. How big a job is it to add a local cache?
Comment 2•11 years ago
May be isolated to use1 masters, or we just happen to run most of the compile jobs there.
Reporter
Comment 3•11 years ago
So if we suspect processing lots of changes is the problem, we should look at this code: http://hg.mozilla.org/build/buildbotcustom/file/default/status/db/model.py#l387 and http://hg.mozilla.org/build/buildbotcustom/file/default/status/db/model.py#l282
Reporter
Comment 4•11 years ago
It looks like http://hg.mozilla.org/build/buildbotcustom/file/4d263601be8c/status/db/model.py#l532 is taking a significant amount of time for some reason.
Reporter
Comment 5•11 years ago
Fetching the steps is actually fast, it's just that inserting new properties happens there instead of above due to how the ORM schedules work. http://hg.mozilla.org/build/buildbotcustom/file/4d263601be8c/status/db/model.py#l327 is probably what's causing us problems on changes that touch a lot of files. We do a select and maybe an insert per file.
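The per-file pattern described here can be sketched as follows; the helper names are hypothetical (the real code is in the model.py links above), and the query counter is only there to make the cost visible.

```python
# Hypothetical sketch of the slow path: one SELECT (and possibly one INSERT)
# per changed file, so a change touching N new files costs 2*N round trips.
def import_files_naive(file_exists, insert_file, filenames):
    queries = 0
    for name in filenames:
        queries += 1
        if not file_exists(name):  # SELECT per file
            queries += 1
            insert_file(name)      # INSERT per missing file
    return queries                 # total round trips, worst case 2*N
```

For a big merge touching thousands of files, that per-file round-trip cost dominates the import time.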
Reporter
Comment 6•11 years ago
For big merges, we end up touching a lot of files. For each one of those, we used to do a SELECT to look for the file, and an INSERT if the file wasn't there. For thousands of files, that's slow. This patch makes the file importing happen in chunks: we look for the pre-existing files in chunks of 100, then any files that are missing get added.
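The chunked approach described above can be sketched like this. The helper names are assumed, not the actual buildbotcustom API; the structure (batched existence lookups, then inserting only the misses) follows the patch description.

```python
CHUNK = 100  # look up existing files 100 at a time, as in the patch

def import_files_chunked(select_existing, insert_many, filenames):
    """select_existing(chunk) yields names already in the db;
    insert_many(names) bulk-inserts the rest. Returns the SELECT count."""
    selects = 0
    for i in range(0, len(filenames), CHUNK):
        chunk = filenames[i:i + CHUNK]
        selects += 1
        existing = set(select_existing(chunk))  # one SELECT ... IN (...) per chunk
        missing = [f for f in chunk if f not in existing]
        if missing:
            insert_many(missing)                # add only files not already present
    return selects
```

This turns N per-file SELECTs into ceil(N/100) batched ones, which is where the perf win comes from.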
Attachment #766871 -
Flags: review?(nthomas)
Comment 7•11 years ago
Comment on attachment 766871 [details] [diff] [review]: import change files by chunks
Looks good. Do you have any rough numbers on how much of a perf win we get?
Attachment #766871 -
Flags: review?(nthomas) → review+
Reporter
Updated•11 years ago
Attachment #766871 -
Flags: checked-in+
Comment 8•11 years ago
In production
Status: NEW → RESOLVED
Closed: 11 years ago
Resolution: --- → FIXED
Assignee
Updated•11 years ago
Product: mozilla.org → Release Engineering
Assignee
Updated•6 years ago
Component: General Automation → General