Closed Bug 151281 Opened 22 years ago Closed 22 years ago

duplicates.cgi performance work

Categories

(Bugzilla :: Reporting/Charting, defect)

x86
Linux
defect
Not set
normal

Tracking


RESOLVED FIXED
Bugzilla 2.18

People

(Reporter: gerv, Assigned: gerv)

References

Details

(Keywords: perf)

Attachments

(1 file)

User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.0.0) Gecko/20020529
BuildID:    20020529

duplicates.cgi now performs much worse than it did; this is part of the issue affecting b.m.o.

I'm also testing the Bugzilla Helper replacement, so this bug report will have some random fields in it.

Reproducible: Always

Steps to Reproduce:
1.
2.
3.
Attached patch: Patch v.1
This patch makes a single SQL call to find out the visibility status of all the bugs
in the list, instead of one call per bug. Because the "max on the page" limit
is handled in the template, b.m.o would have been doing one query for every bug
with more duplicates than the minimum limit (currently set to 2). That would
have been many thousands of bugs. :-(

It also removes a redundant variable, $loop, and moves the "restrict to
buglist" code to before the SendSQL call, for simplicity and performance
reasons.

However, we can't really see what difference this makes until it goes up on
b.m.o :-)

Gerv
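
For illustration only, here is a minimal sketch of the batching idea described in the patch comment above, written against plain DBI rather than Bugzilla's own SendSQL wrapper; the connection details, table, and column names are placeholders, not the actual duplicates.cgi code.

  use strict;
  use DBI;

  # Placeholder connection; the real code goes through Bugzilla's DB layer.
  my $dbh = DBI->connect("dbi:mysql:bugs", "bugs_user", "password",
                         { RaiseError => 1 });

  my @bug_ids = (1001, 1002, 1003);   # the full duplicates list

  # Old approach: one round trip per bug.
  #   foreach my $id (@bug_ids) {
  #       my ($product) = $dbh->selectrow_array(
  #           "SELECT product FROM bugs WHERE bug_id = ?", undef, $id);
  #   }

  # Batched approach: one round trip for the whole list.
  my $in  = join(",", ("?") x @bug_ids);
  my $sth = $dbh->prepare(
      "SELECT bug_id, product FROM bugs WHERE bug_id IN ($in)");
  $sth->execute(@bug_ids);

  my %info_for;
  while (my ($id, $product) = $sth->fetchrow_array) {
      $info_for{$id} = $product;
  }

Either way the data ends up in a hash keyed by bug id; the difference is that the number of queries no longer grows with the number of bugs over the threshold.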
Comment on attachment 87411 [details] [diff] [review]
Patch v.1

The theory sounds good and the code looks good to me, but it will take me a few
hours to reproduce the correct circumstances to actually test this.  Myk might
want to have a look at it since he probably has backup copies of bmo's database
he can throw it at.
Hmm, I meant to CC Myk there, but I midaired with Myk adding it as a dependency,
and forgot to put his name back when I hit submit.  Myk, have a look at comment
2 :-)
Keywords: perf
Comment on attachment 87411 [details] [diff] [review]
Patch v.1

This looks fine to me, but myk should test.

Can we move duplicates into the db, please? ;) That way we could move sorting
into the query, which would let us use LIMIT directly.
Attachment #87411 - Flags: review+
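
A rough sketch of the suggestion above, assuming a hypothetical duplicates table keyed by dupe_of; the schema is illustrative only, since at this point the duplicate counts do not live in the database.

  # Hypothetical schema: duplicates(dupe_of INT, dupe INT).
  my $top_dupes = $dbh->selectall_arrayref(q{
      SELECT dupe_of, COUNT(*) AS dupe_count
        FROM duplicates
       GROUP BY dupe_of
       ORDER BY dupe_count DESC
       LIMIT 20
  });
  # Each row is [dupe_of, dupe_count]; the database does the sorting and
  # the "max on the page" truncation instead of the template.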
"time ./duplicates.cgi"

old: 9.230u 0.460s 0:13.97 69.3%      0+0k 0+0io 489pf+0w
new: 4.690u 0.090s 0:07.95 60.1%      0+0k 0+0io 489pf+0w

Run the script five times simultaneously on an idle machine and watch the load average:

old: 0.01->4.49
new: 0.03->2.33

Comment on attachment 87411 [details] [diff] [review]
Patch v.1

>+$generic_query .= (" AND product = " . SqlQuote($product)) if $product;

Nit: "generic" doesn't make sense anymore.

r=myk
Attachment #87411 - Flags: review+
Whiteboard: [applied to b.m.o]
a) We need to reset the signal handlers back to their defaults.

b) Myk, can you run DProf over duplicates.cgi from the cmdline? (perldoc
Devel::DProf)
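
For reference, a typical way to run Devel::DProf against the script from the command line (standard DProf usage, nothing bmo-specific; the script may also need the usual CGI environment variables set up):

  perl -d:DProf ./duplicates.cgi
  dprofpp tmon.out    # summarize the profile DProf wrote to tmon.out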
duplicates.cgi on b.m.o now performs a lot better (a combination of the attached
single-query patch and upping the threshold). I've checked this patch in now, so
I'm marking this FIXED. But it would still be good to get some DProf data anyway.

Checking in duplicates.cgi;
/cvsroot/mozilla/webtools/bugzilla/duplicates.cgi,v  <--  duplicates.cgi
new revision: 1.20; previous revision: 1.19
done

Gerv
Status: NEW → RESOLVED
Closed: 22 years ago
Resolution: --- → FIXED
Target Milestone: --- → Bugzilla 2.18
Whiteboard: [applied to b.m.o]
QA Contact: matty_is_a_geek → default-qa