Closed Bug 186133 Opened 22 years ago Closed 22 years ago

DOM Performance Regression

Categories

(Core :: DOM: Core & HTML, defect, P1)

Tracking

RESOLVED FIXED
mozilla1.3beta

People

(Reporter: markushuebner, Assigned: jst)


Details

(Keywords: perf, regression, topembed, Whiteboard: [HAVE FIX])

Attachments

(1 file)

10 runs on each testcase on Win XP, 1.1 GHz, 512 MB RAM.
On average we are doing 20% worse, comparing these
two trunk builds.

TEST                    Trunk 2002121108  :  Trunk 2002082208  :  MSIE6.0 SP1
-----------------------------------------------------------------------------
getElementById()        313ms             :  287ms             :  270ms
getElementsByTagName()  370ms             :  305ms             :  281ms
createElement()         702ms             :  792ms             :  231ms
getAttribute()          442ms             :  378ms             :  140ms
setAttribute()          407ms             :  340ms             :  144ms
Same mozilla build?
Sorry - the second trunk build is 2002082208 in the table!
No longer blocks: 118933
Testing 1.2a and 1.2b gives essentially the same numbers as Trunk 2002121108.
Mozilla 1.1 was considerably faster, though.

TEST                    Trunk 2002121108  :  Mozilla 1.1 
-----------------------------------------------------------
getElementById()        313ms             :  284ms          
getElementsByTagName()  370ms             :  303ms         
createElement()         702ms             :  682ms         
getAttribute()          442ms             :  375ms            
setAttribute()          407ms             :  343ms           

So it seems the regression happened somewhere in August 2002.
Notice that performance can be further improved when you cache the document
object, using the following code:

var d = document;  // cache the document reference outside the loop
for (var i = 0; i <= 10000; i++) d.getElementById("test");

IE (5.5) dropped from 694ms to 497ms. (!!)
Moz (1.3a) dropped from 958ms to 914ms.

And Opera 7b2 swifts through it in 404ms.

Caching objects is a common way to improve performance; I believe it should be
taken into account when comparing.
Can anyone run Quantify or Jprof?
Blocks: 21762
More tests, with more browsers (Athlon XP 2100+ Win 2k SP3 512 Mb DDR, no other
programs running but the browser)

Opera 7.0 Beta 2
getElementById()         --> Average (10 runs):  110ms
getElementsByTagName()   --> Average (10 runs):  153ms
createElement()          --> Average (10 runs):  316ms
getAttribute()           --> Average (10 runs):  455ms
setAttribute()           --> Average (10 runs): 3299ms

IE 6 SP1
getElementById()         --> Average (10 runs):  125ms
getElementsByTagName()   --> Average (10 runs):  119ms
createElement()          --> Average (10 runs):  104ms
getAttribute()           --> Average (10 runs):   52ms
setAttribute()           --> Average (10 runs):   59ms

Phoenix 0.5 (nightly 20021219)
getElementById()         --> Average (10 runs):  127ms
getElementsByTagName()   --> Average (10 runs):  156ms
createElement()          --> Average (10 runs):  323ms
getAttribute()           --> Average (10 runs):  192ms
setAttribute()           --> Average (10 runs):  184ms

Mozilla (nightly 20021219)
getElementById()         --> Average (10 runs):  141ms
getElementsByTagName()   --> Average (10 runs):  166ms
createElement()          --> Average (10 runs):  466ms
getAttribute()           --> Average (10 runs):  203ms
setAttribute()           --> Average (10 runs):  186ms

Mozilla 1.1 (20020826)
getElementById()         --> Average (10 runs):  131ms
getElementsByTagName()   --> Average (10 runs):  156ms
createElement()          --> Average (10 runs):  406ms
getAttribute()           --> Average (10 runs):  197ms
setAttribute()           --> Average (10 runs):  175ms

K-meleon (Mozilla 1.2b 20021016)
getElementById()         --> Average (10 runs):  139ms
getElementsByTagName()   --> Average (10 runs):  171ms
createElement()          --> Average (10 runs):  352ms
getAttribute()           --> Average (10 runs):  206ms
setAttribute()           --> Average (10 runs):  190ms
All of these tests with random old builds are _not_ helpful.  If the regression 
occurred between 1.2b and 12/11, then please test builds in that range to 
narrow down the regression date.  That would be helpful indeed.
Blocks: 118933
Any way we can narrow this down to a smaller timeframe / set of builds for when
the regression happened?
Where can people find old builds? Maybe announce the problem somewhere and
see if someone has the time to test builds until finding the "guilty" checkin.
These are the results of my test. I only tested getElementById(). I didn't have
any release builds hanging around, so only nightlies.

2002080708	297	100.0%
2002082305	297	100.0%
2002082808	303	102.0%
2002091205	297	100.0%
2002092708	293	 98.7%
2002101508	316	106.4%
2002101808	308	103.7%
2002102508	295	 99.3%
2002110108	283	 95.3%
2002110808	305	102.7%
2002111408	302	101.7%
2002120322	295	 99.3%
2002121308	299	100.7%
2002122008	287	 97.6%

My conclusions: the variations measured are "noise". 1.1 happened to be kind of
lucky.
> On average we are doing 20% worse, comparing these two trunk builds.

actually, it's 10% worse (on average), even according to your numbers.

The getElementsByTagName() test (which, according to comment 0, took the biggest
hit), 10 runs each on Linux on a PII-450MHz:

1.0.0  5308
1.1a   2116
1.1b   2256
1.1    2255
1.2a   2302
1.2b   2257
1.2.1  2321
20021220 2208

Current trunk looks great. If there is a regression, it's not happening in
Andrew-ville.
According to the results I posted in comment #3 we are doing worse as follows:

getElementById()        10.21%
getElementsByTagName()  22.11%
createElement()          2.93%
getAttribute()          17.87%
setAttribute()          18.66%

I forgot to mention that I did my test on Linux on an AMD 2000XP. Each with 10
runs (which should not matter, as one test is already 10,000 calls of the function).
Could it be that this just regressed on Win32?
We won't know until someone on Windows tests more nightly builds. We need to
know what the variations are, and if there is some trend.
Only testing the releases is not enough. We need to know the amount of noise in
the measurements to know if the differences are significant.
My (Linux) tests suggest that the noise is quite large, and the variations are
not significant.
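To make the noise argument concrete, here is a minimal sketch of how one could quantify it (the function names and the two-sigma threshold are my own illustration, not from this bug): compute the mean and sample standard deviation of each build's run times, and treat a difference in means that is smaller than a couple of standard deviations as likely noise.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Mean of a series of timings (in ms).
double Mean(const std::vector<double>& v) {
  double sum = 0.0;
  for (double x : v) sum += x;
  return sum / v.size();
}

// Sample standard deviation (n-1 denominator).
double StdDev(const std::vector<double>& v) {
  double m = Mean(v), ss = 0.0;
  for (double x : v) ss += (x - m) * (x - m);
  return std::sqrt(ss / (v.size() - 1));
}

// True if the difference between two builds' mean timings is within
// `sigmas` standard deviations of the noisier series, i.e. likely noise.
bool LikelyNoise(const std::vector<double>& oldRuns,
                 const std::vector<double>& newRuns,
                 double sigmas = 2.0) {
  double diff = std::fabs(Mean(newRuns) - Mean(oldRuns));
  double noise = std::max(StdDev(oldRuns), StdDev(newRuns));
  return diff <= sigmas * noise;
}
```

Feeding per-run numbers (rather than only the averages) from two nightlies into something like this would show whether a 2-3% delta is meaningful or just jitter.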
You can find some old nightlies on ftp://ftp.mozilla.org/pub/mozilla/nightly/
but not very far back. Find someone who has some on his local disk.
this bug is only about one regression... the one Markus is seeing.  Markus sees
that current trunk (as of 12-11), 1.2a and 1.2b are significantly slower (~20%)
than 1.1 (comment 3).  So he has narrowed down the regression to occurring
between 1.1 and 1.2a.  If we test nightlies outside that range, we might find
another regression, but it would not be this bug.

Since I don't see a significant performance hit between 1.1 and 1.2a (only ~2%,
well within noise level), I don't see this regression. If other people running
linux have similar results (and people running windows have results similar to
Markus') then this bug is Windows only, regardless of what happens with the
nightlies.
Unfortunately ftp://ftp.mozilla.org/pub/mozilla/nightly/ only has builds going
back to the 10th of December.
Is there a way to get builds from around August?
The oldest trunk build I have so far is 2002091208 and there is no regression.
Oops, I should get more sleep - the trunk build 2002091208 of course shows the
regression.
Blocks: 117436
Win98, Athlon 750, 320MB - I don't know if I see this regression... 1.1 performs
about the same as 1.2a, so looking at that I'd say I don't, but 2002073108 _was_
significantly faster (based on the roadmap this was from before 1.1 branched -
markus: did you really test 1.1 itself or did you stick with that 20020822
build? It might be that a fix was checked in on the trunk between 2002073108 and
2002080604, which was then ported to the branch between 2002082208 and
2002082400). On the other hand, the fluctuations are large enough that the
differences can be explained solely by that... (I don't know how accurate these
javascript tests are, but I see the results of one test jumping up and down in
increments of 10 or 50ms...)
_If_ Markus did indeed use 2002082208 in his comparison in comment 3, rather
than the real 1.1, and if somebody can find a checkin on the 1.1-branch between
2002082208 and 2002082400 which occurred on the trunk between 2002073108 and
2002080604, then I'd say that'd bear investigating.

2002073108 (trunk):
getElementById()	395
getElementsByTagName()	455
createElement()		966
getAttribute()		572
setAttribute()		520

2002080604 (trunk):
getElementById()	389
getElementsByTagName()	499
createElement()		960
getAttribute()		639
setAttribute()		593

2002080704 (trunk):
getElementById()	428
getElementsByTagName()	505
createElement()		1000
getAttribute()		627
setAttribute()		588

2002081204 (1.1-branch):
getElementById()	397
getElementsByTagName()	478
createElement()		1000
getAttribute()		577
setAttribute()		533

2002081308 (trunk):
getElementById()	419
getElementsByTagName()	500
createElement()		934
getAttribute()		664
setAttribute()		577

2002081716 (1.1-branch):
getElementById()	401
getElementsByTagName()	462
createElement()		961
getAttribute()		591
setAttribute()		541

2002082400 (1.1-branch):
getElementById()	422
getElementsByTagName()	527
createElement()		1033
getAttribute()		660
setAttribute()		587

2002082611 (1.1):
getElementById()	439
getElementsByTagName()	509
createElement()		985
getAttribute()		620
setAttribute()		565

2002091014 (1.2a):
getElementById()	433
getElementsByTagName()	500
createElement()		1033
getAttribute()		616
setAttribute()		576

2002121704 (current trunk):
getElementById()	445
getElementsByTagName()	528
createElement()		1237
getAttribute()		617
setAttribute()		585
I really did test with 1.1 - user agent info:
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.1) Gecko/20020826
But it seems that something went wrong in that test (probably a mozilla.exe was
still in memory when I launched 1.1).
Here again the test with 1.1:

TEST                    Mozilla 1.1 
-----------------------------------------------------------
getElementById()        318ms
getElementsByTagName()  306ms
createElement()         671ms
getAttribute()          379ms
setAttribute()          350ms

Sander, it seems I see the same as you, and the timeframe you mentioned is the
one that needs analysis to track down the checkin.
Blocks: 140789
There is also good performance regression info in bug 140789.
Keywords: nsbeta1, topembed
OK, I found the cause of this regression: it's the change made by alecf to make
nsID::Equals() use memcmp() instead of manually comparing the nsID fields. If I
revert that change we get back the ~10% that was lost in performance since
Mozilla 1.1.
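For reference, the change in question can be sketched roughly like this (an illustrative mock-up, not the actual Mozilla patch; the struct layout mirrors nsID, but both function names and bodies here are my own):

```cpp
#include <cstdint>
#include <cstring>

// Illustrative stand-in for Mozilla's nsID (a 128-bit interface UUID).
// Layout is 4 + 2 + 2 + 8 = 16 bytes with no padding.
struct nsID {
  uint32_t m0;
  uint16_t m1;
  uint16_t m2;
  uint8_t  m3[8];
};

// The bug 164580 style: one memcmp() over the whole struct. Well-defined
// here since there are no padding bytes, but the call overhead reportedly
// showed up as a measurable DOM slowdown on non-unix platforms.
bool EqualsMemcmp(const nsID& a, const nsID& b) {
  return std::memcmp(&a, &b, sizeof(nsID)) == 0;
}

// The restored style: manual field-by-field comparison. Unequal IDs
// usually differ in the first 32-bit word, so this bails out early.
bool EqualsManual(const nsID& a, const nsID& b) {
  if (a.m0 != b.m0 || a.m1 != b.m1 || a.m2 != b.m2) return false;
  for (int i = 0; i < 8; ++i)
    if (a.m3[i] != b.m3[i]) return false;
  return true;
}
```

Both variants return the same results; the difference is purely in how fast the common "not equal" case is rejected on the affected compilers.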
Status: NEW → ASSIGNED
Whiteboard: [HAVE FIX]
Oh, and that bug was bug 164580.
see bug 164580 comment 6 -- this bug (regression) does not occur on Linux.
Right, bug 164580 only affected non-unix platforms.
This reverts us back to not using memcmp() in nsID::Equals(), and thus gets us
back to the same speed as we had in 1.1.
Comment on attachment 113436 [details] [diff] [review]
Back out the fix for bug 164580.

sr=alecf
sorry, my bad :(
Attachment #113436 - Flags: superreview+
Flags: blocking1.3b?
Priority: -- → P1
Target Milestone: --- → mozilla1.3beta
Comment on attachment 113436 [details] [diff] [review]
Back out the fix for bug 164580.

r=jag
Attachment #113436 - Flags: review+
Attachment #113436 - Flags: approval1.3b?
Attachment #113436 - Flags: approval1.3b? → approval1.3b+
This doesn't appear to have landed. Am I wrong?
moving blocking request out to 1.3final. We're done with beta. 
Flags: blocking1.3b? → blocking1.3?
Comment on attachment 113436 [details] [diff] [review]
Back out the fix for bug 164580.

You planning to land this?
Attachment #113436 - Flags: approval1.3+
Fixed. Sorry, I missed this approval :-(
Status: ASSIGNED → RESOLVED
Closed: 22 years ago
Resolution: --- → FIXED
Flags: blocking1.3?
No longer blocks: 21762
Blocks: 21762
Component: DOM: Core → DOM: Core & HTML
QA Contact: stummala → general