Bug 1556022 Comment 17 Edit History


It appears that we are indeed limited by the SVG filter.
Based on comment 8 and comment 10, I made `nsSVGIntegrationUtils::PaintFilter` a no-op, and the reported download bandwidth increased by roughly 70%.

Baseline gv_example (local release build, Moto G5)
```
Download (Mbps): 37.1, 37.1, 37.8
Upload (Mbps): 19.8, 19.8, 17.7
```

gv_example (local release build, Moto G5, but skipping `nsSVGIntegrationUtils::PaintFilter`):
https://searchfox.org/mozilla-central/rev/11712bd3ce7454923e5931fa92eaf9c01ef35a0a/layout/svg/nsSVGIntegrationUtils.cpp#1057
```
Download (Mbps): 62.9, 63.0, 62.4
Upload (Mbps): 20.4, 20.3, 20.2
```


My home network is 250 Mbps down and 20 Mbps up, so I'm near the upload limit.

On Chrome I'm seeing ~100 Mbps download.
I'd prefer independently verified results, so if someone else who can reproduce the issue would like to repeat the test, that would be great.

But this makes me think:
- Is this an isolated problem (an expensive filter running during their test), or could it impact pageload of real sites? I'll start a tp6m job to test this hypothesis.
- It's great that we may be getting decryption optimizations out of this, but is there anything else we can do on the graphics side prior to WebRender + SVG?