Closed
Bug 1729167
Opened 4 years ago
Closed 3 months ago
[Linux] Implement WebRTC VideoFrameBuffer dmabuf surfaces
Categories
(Core :: WebRTC: Audio/Video, enhancement, P3)
Tracking
Status: RESOLVED
Resolution: WONTFIX
People
(Reporter: stransky, Unassigned)
References
(Blocks 1 open bug)
Details
Description

Implement WebRTC VideoFrameBuffer dmabuf surfaces to pass dmabuf-backed video frames from the camera to WebRender and the encoder.

I think the path is to implement a VideoFrameBuffer of kNative type backed by dmabuf and then pass it in a VideoFrame.
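A rough sketch of what such a kNative buffer could look like is below. The class name DmabufVideoFrameBuffer and its fd/stride/fourcc/modifier members are hypothetical and only for illustration; the type()/width()/height()/ToI420() overrides are the upstream webrtc::VideoFrameBuffer interface, and the I420 fallback is stubbed out.

// Sketch only, assuming the upstream webrtc::VideoFrameBuffer interface.
// DmabufVideoFrameBuffer and its fd/stride/fourcc/modifier members are
// hypothetical names used for illustration.
#include <cstdint>

#include "api/video/video_frame_buffer.h"

class DmabufVideoFrameBuffer : public webrtc::VideoFrameBuffer {
 public:
  DmabufVideoFrameBuffer(int fd, int width, int height, uint32_t fourcc,
                         int stride, uint64_t modifier)
      : fd_(fd),
        width_(width),
        height_(height),
        fourcc_(fourcc),
        stride_(stride),
        modifier_(modifier) {}

  // kNative tells downstream consumers (compositor/encoder) to query the
  // buffer for its native representation instead of expecting raw YUV.
  Type type() const override { return Type::kNative; }
  int width() const override { return width_; }
  int height() const override { return height_; }

  // Software fallback for consumers that cannot import the dmabuf;
  // a real implementation would mmap() the fd and convert with libyuv.
  rtc::scoped_refptr<webrtc::I420BufferInterface> ToI420() override {
    return nullptr;  // stubbed in this sketch
  }

  // Accessors the WebRender/encoder side would use to import the surface.
  int fd() const { return fd_; }
  uint32_t fourcc() const { return fourcc_; }
  int stride() const { return stride_; }
  uint64_t modifier() const { return modifier_; }

 private:
  const int fd_;
  const int width_;
  const int height_;
  const uint32_t fourcc_;
  const int stride_;
  const uint64_t modifier_;
};

// Passing it on in a webrtc::VideoFrame (the ref-counting helper depends on
// the WebRTC revision, e.g. rtc::make_ref_counted<DmabufVideoFrameBuffer>):
//
//   webrtc::VideoFrame frame = webrtc::VideoFrame::Builder()
//                                  .set_video_frame_buffer(buffer)
//                                  .set_timestamp_us(capture_time_us)
//                                  .build();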
Comment 1•2 years ago
If I recall correctly, this bug is open to being fixed again. It looks like bug 1729743 has been resolved. Is there anything else still blocking this?
Flags: needinfo?(stransky)
Reporter
Comment 2•2 years ago
(In reply to Clayton Voges [:cvoges12] from comment #1)
> If I recall correctly, this bug is open to being fixed again. It looks like bug 1729743 has been resolved. Is there anything else still blocking this?
This is not related.
Flags: needinfo?(stransky)
Comment 3•2 years ago
Oh, I thought it was blocking as it was mentioned before. Never mind then.
Reporter
Updated•3 months ago
Status: NEW → RESOLVED
Closed: 3 months ago
Resolution: --- → WONTFIX