Closed Bug 1920660 Opened 2 months ago Closed 2 months ago

Cyclic drawImage leads to memory leak

Categories

(Core :: Graphics: Canvas2D, defect)

Firefox 130
defect

Tracking


VERIFIED FIXED
132 Branch
Tracking Status
firefox-esr115 --- unaffected
firefox-esr128 --- fixed
firefox130 --- wontfix
firefox131 --- fixed
firefox132 --- fixed

People

(Reporter: sdecker, Assigned: aosmond)

References

(Blocks 1 open bug, Regression)

Details

(Keywords: regression)

Attachments

(8 files)

Attached image resmon.png

Steps to reproduce:

We have an ASP.NET Core Angular application in which images are displayed cyclically. The images are drawn into a canvas with context.drawImage(...).
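To illustrate, the drawing pattern looks roughly like this (a simplified sketch, not the exact fiddle code; the image URLs and the interval are placeholders):

    const ctx = document.querySelector('canvas').getContext('2d');
    const urls = [/* image URLs served by the backend */];
    let i = 0;
    // Cycle through the images: load and draw the next one on every tick.
    setInterval(() => {
      const img = new Image();
      img.src = urls[i++ % urls.length];
      img.onload = () => ctx.drawImage(img, 0, 0);
    }, 100);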

You can reproduce this error with my fiddle:
https://jsfiddle.net/15vf97hz/
on the latest versions of:

  • Firefox x64
  • Firefox x64 ESR
  • Firefox x64 Nightly

How can we ensure that the memory consumption remains within the 'normal' range?
Are we missing something important in the script?
Do we have any influence on the garbage collector in Firefox?

Actual results:

The memory is completely consumed by Firefox and this results in OutOfMemory errors in other processes.
You can see in the resource monitor (resmon) that "Working set" and "Shareables" are increasing (see attachment).
The memory is only released after a few minutes of inactivity.

Expected results:

Normal memory consumption, as in other browsers (like Firefox 32-bit and Chrome).

The Bugbug bot thinks this bug should belong to the 'Core::Graphics: Canvas2D' component, and is moving the bug to that component. Please correct in case you think the bot is wrong.

Component: Untriaged → Graphics: Canvas2D
Product: Firefox → Core

I could not repro on my Win11 machine + Firefox x64 Nightly with the jsfiddle testcase when I scrolled on the demo pane with the touchpad; memory did not increase much. I noticed that images loaded from dummyjson take a very long time for me. Maybe that's why the testcase doesn't work for me.
Is there an alternate testcase?

Can you go to "about:support" in your Firefox browser and copy-paste its contents to this bug?

Flags: needinfo?(sdecker)
Attached file about_support_content

I have reworked the sample:
https://jsfiddle.net/ynLdcxaw/

Now the images are fully loaded first, and then a timer is started to show each image.
On my Win10 (x64) machine I still have the memory issue with Firefox x64 130.0.1.

With the newest Firefox x64 Nightly 132.0a1 (2024-09-23) (64-bit) I cannot reproduce this problem.

Flags: needinfo?(sdecker)

FWIW I can repro on the latest Nightly with both Canvas2D and the experimental GPU canvas. After all the images are loaded, memory increases quickly and the browser or the whole system OOMs.

@Steffen: The modified testcase is still very slow (because dummyjson is slow). Could you create a testcase that doesn't use dummyjson and uses images directly, or something similar?

Status: UNCONFIRMED → NEW
Ever confirmed: true
Flags: needinfo?(sdecker)
Attached file images.zip

demo images for testcase

Attached file testcase.html

OK, I created a local testcase.html and downloaded 100 images.

Please unzip the images and put the HTML file in the same folder, like this:
/testcase.html
/images/0.png
/images/1.png ...

Then you can open the HTML file in Firefox and reproduce this memory leak.

I will attach my PowerShell script in case you need more images.
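Roughly, the testcase does the following (a simplified sketch; the attached testcase.html is the authoritative version, and the canvas size and timer interval here are placeholders):

    <canvas id="c" width="1920" height="1080"></canvas>
    <script>
      const ctx = document.getElementById('c').getContext('2d');
      // Preload all 100 local images first, then start a timer that cycles
      // through them with drawImage.
      const loads = [];
      for (let i = 0; i < 100; i++) {
        const img = new Image();
        img.src = 'images/' + i + '.png';
        loads.push(new Promise(resolve => { img.onload = () => resolve(img); }));
      }
      Promise.all(loads).then(images => {
        let n = 0;
        setInterval(() => ctx.drawImage(images[n++ % images.length], 0, 0), 50);
      });
    </script>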

Flags: needinfo?(sdecker)
Attached file download_images.ps1

Profiles with graphics preset logging:
GPU-canvas: https://share.firefox.dev/3XX3acf
D2d-canvas: https://share.firefox.dev/3N0gTII

Assignee: nobody → aosmond
Severity: -- → S3
Status: NEW → ASSIGNED

If a canvas never stops drawing long enough to go dormant and free its buffers, and the caller never does a readback to create a checkpoint, then we may never release our hold on resources used earlier in the recording. This patch makes us check at the end of each transaction. Eventually the playback of the recording will catch up, or the recording will go dormant, and we will release the relevant surfaces.

Pushed by aosmond@mozilla.com:
https://hg.mozilla.org/integration/autoland/rev/a948d1c44ac4
Ensure we free external images/surfaces with recorded canvas sooner. r=gfx-reviewers,nical
Status: ASSIGNED → RESOLVED
Closed: 2 months ago
Resolution: --- → FIXED
Target Milestone: --- → 132 Branch

The patch landed in nightly and beta is affected.
:aosmond, is this bug important enough to require an uplift?

  • If yes, please nominate the patch for beta approval.
  • If no, please set status-firefox131 to wontfix.

For more information, please visit BugBot documentation.

Flags: needinfo?(aosmond)

Mayank, would you mind verifying the bug was fixed given you could readily reproduce it? Thanks!

Flags: needinfo?(aosmond) → needinfo?(mayankleoboy1)

With the latest Nightly, I don't crash. However, memory does increase from ~6 GB to 9.3 GB, then drops to ~8 GB and oscillates between 8 GB and 9.3 GB. (The increased memory is still attributed to the whole system, not to any Firefox process in Windows Task Manager.)

Profile: https://share.firefox.dev/4eiqLcO
So the behaviour has improved, but it is still not great (other browsers use 600 MB of RAM).

Hope this helps. Let me know if you need more profiles or something.

Edit:
Profile with graphics preset logging: https://share.firefox.dev/3MZWnYN
Profile with memory tracking: https://share.firefox.dev/3THVYxU

Flags: needinfo?(mayankleoboy1) → needinfo?(aosmond)
Attached file memory-report.json.gz

I captured the about:memory report on a lark, without expecting it to show anything (as the RAM increase is at the system level), but surprisingly the report shows 2 GB+ of RAM used for images in the file process.
I have no idea why Windows Task Manager doesn't show any Firefox process with 2 GB of RAM use; all it shows is an increase at the system level.

(But maybe thats a different bug)

From the memory report, it is clear that the only thing keeping the images alive now is the image cache. The cache is cleared automatically when its timer expires and/or when we hit memory pressure, whichever happens first. Prior to the patch, canvas would hold onto the images indefinitely.

Flags: needinfo?(aosmond)

Sounds like there is nothing more to do here.
Should I file a bug for the issue that the Firefox process in Task Manager's default view (active private working set) does not show the 2 GB+ memory use of the canvas demo?

(In reply to Mayank Bansal from comment #18)

With the latest Nightly, I don't crash. However, memory does increase from ~6 GB to 9.3 GB, then drops to ~8 GB and oscillates between 8 GB and 9.3 GB. (The increased memory is still attributed to the whole system, not to any Firefox process in Windows Task Manager.)

Profile: https://share.firefox.dev/4eiqLcO
So the behaviour has improved, but it is still not great (other browsers use 600 MB of RAM).

Hope this helps. Let me know if you need more profiles or something.

Edit:
Profile with graphics preset logging: https://share.firefox.dev/3MZWnYN
Profile with memory tracking: https://share.firefox.dev/3THVYxU

Can you file a separate bug about this issue, mark it against Core / Graphics: ImageLib, and mark this one as a dependency? I'm not sure what exactly the best thing to do here is, but it is no longer a canvas issue and is now more of a "should we clear the image cache more aggressively" decision.

(In reply to Mayank Bansal from comment #22)

Sounds like there is nothing more to do here.
Should I file a bug for the issue that the Firefox process in Task Manager's default view (active private working set) does not show the 2 GB+ memory use of the canvas demo?

As long as about:memory is accurate (and it appears so), I'm content to let it be. The shared memory is mapped into the GPU process, so it is possible Windows is unable to attribute which process is responsible.

Status: RESOLVED → VERIFIED

Comment on attachment 9427124 [details]
Bug 1920660 - Ensure we free external images/surfaces with recorded canvas sooner.

Beta/Release Uplift Approval Request

  • User impact if declined: May OOM due to canvas not allowing decoded image memory to be released
  • Is this code covered by automated tests?: Yes
  • Has the fix been verified in Nightly?: Yes
  • Needs manual test from QE?: No
  • If yes, steps to reproduce:
  • List of other uplifts needed: None
  • Risk to taking this patch: Low
  • Why is the change risky/not risky? (and alternatives if risky): Adds a check at the end of each canvas transaction for surfaces we can release. The performance cost is trivial, and the check is very safe to do; there are no threading issues.
  • String changes made/needed:
  • Is Android affected?: Yes

ESR Uplift Approval Request

  • If this is not a sec:{high,crit} bug, please state case for ESR consideration: Low-risk patch; OOM may lead to crashes and/or degraded performance
  • User impact if declined: May OOM due to canvas not allowing decoded image memory to be released
  • Fix Landed on Version: 132
  • Risk to taking this patch: Low
  • Why is the change risky/not risky? (and alternatives if risky): Adds a check at the end of each canvas transaction for surfaces we can release. The performance cost is trivial, and the check is very safe to do; there are no threading issues.
Attachment #9427124 - Flags: approval-mozilla-release?
Attachment #9427124 - Flags: approval-mozilla-esr128?

Additional note: this is worth taking if you respin the RC, but it isn't a release driver.

Blocks: 1921253

Comment on attachment 9427124 [details]
Bug 1920660 - Ensure we free external images/surfaces with recorded canvas sooner.

Approved for 128.4esr.

Attachment #9427124 - Flags: approval-mozilla-esr128? → approval-mozilla-esr128+

Comment on attachment 9427124 [details]
Bug 1920660 - Ensure we free external images/surfaces with recorded canvas sooner.

Approved for the 131.0.3 dot release.

Attachment #9427124 - Flags: approval-mozilla-release? → approval-mozilla-release+
