Closed
Bug 948295
Opened 11 years ago
Closed 10 years ago
Able to flood a webrtc datachannel and force channel close
Categories
(Core :: WebRTC: Networking, defect)
RESOLVED
WORKSFORME
People
(Reporter: damon.oehlman, Unassigned)
Details
(Whiteboard: [bugday-20131211])
Attachments
(3 files)
User Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1650.57 Safari/537.36
Steps to reproduce:
I'm currently working on a node-compatible streams interface for the WebRTC data channel, and have included some load tests which are able to force the data channel to close.
The tests are available on GitHub, so the issue should be easy to replicate for anyone with some familiarity with node.
Steps to reproduce:
First, you will need node and npm installed on the machine. You will also need browserify and beefy; if they aren't already installed, they can be installed via npm:
npm install -g browserify
npm install -g beefy
Next clone and prep the repo:
git clone https://github.com/rtc-io/rtc-dcstream.git
cd rtc-dcstream
npm install
Using beefy, run the load tests:
beefy test/test-writer.js
I then ran Firefox using some logging options I found in other bug reports. Both stable and Nightly were tried:
NSPR_LOG_MODULES=datachannel:5 /opt/browsers/firefox/nightly/firefox http://localhost:9966/
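For context on what the load test is doing: the flood boils down to two data channels in the same page writing small messages to each other as fast as possible. The sketch below is a rough standalone equivalent, not the actual rtc-dcstream test code; the payload size, message count, and use of the unprefixed promise-based API are assumptions chosen for illustration.

// Hypothetical loopback flood; not the actual rtc-dcstream test code.
var a = new RTCPeerConnection();
var b = new RTCPeerConnection();

// Trickle ICE candidates directly between the two local peers.
a.onicecandidate = function (e) { if (e.candidate) b.addIceCandidate(e.candidate); };
b.onicecandidate = function (e) { if (e.candidate) a.addIceCandidate(e.candidate); };

b.ondatachannel = function (e) {
  // Keep a reference to the incoming channel so it is not garbage collected.
  window.incoming = e.channel;
  e.channel.onmessage = function () { /* count/verify messages here */ };
};

var dc = a.createDataChannel('flood');
dc.onopen = function () {
  var payload = new Array(575).join('x'); // 574-character string, matching the message size in the log below
  for (var i = 0; i < 5000; i++) {
    dc.send(payload); // no backpressure: bufferedAmount is never checked
  }
};
dc.onclose = function () { console.log('data channel closed'); };

// Offer/answer exchange between the two local peer connections.
a.createOffer()
  .then(function (offer) { return a.setLocalDescription(offer); })
  .then(function () { return b.setRemoteDescription(a.localDescription); })
  .then(function () { return b.createAnswer(); })
  .then(function (answer) { return b.setLocalDescription(answer); })
  .then(function () { return a.setRemoteDescription(b.localDescription); });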
Actual results:
The test runs as expected for the first ~2000 writes to the channel, but then reaches a point where the channel closes. The last few lines of output from Firefox look like this when running the test in Nightly:
1088251776[7f503f84e370]: Sending to stream 1: 574 bytes
1088251776[7f503f84e370]: Sent buffer (len=574), result=574
1088251776[7f503f84e370]: Sending to stream 1: 574 bytes
1088251776[7f503f84e370]: Sent buffer (len=574), result=574
1088251776[7f503f84e370]: Sending to stream 1: 574 bytes
1088251776[7f503f84e370]: Sent buffer (len=574), result=574
717223680[7f503f84e9d0]: DataChannel: String message received of length 574 on channel 1
717223680[7f503f84e9d0]: HandleDataMessage: sending ON_DATA for 7f5016294b00
717223680[7f503f84e9d0]: DataChannel: String message received of length 574 on channel 1
717223680[7f503f84e9d0]: HandleDataMessage: sending ON_DATA for 7f5016294b00
717223680[7f503f84e9d0]: DataChannel: String message received of length 574 on channel 1
717223680[7f503f84e9d0]: HandleDataMessage: sending ON_DATA for 7f5016294b00
1088251776[7f503f84e370]: Sending to stream 1: 574 bytes
1088251776[7f503f84e370]: Sent buffer (len=574), result=574
1088251776[7f503f84e370]: Sending to stream 1: 574 bytes
1088251776[7f503f84e370]: Sent buffer (len=574), result=574
1088251776[7f503f84e370]: Sending to stream 1: 574 bytes
1088251776[7f503f84e370]: Sent buffer (len=574), result=574
717223680[7f503f84e9d0]: DataChannel: String message received of length 574 on channel 1
717223680[7f503f84e9d0]: HandleDataMessage: sending ON_DATA for 7f5016294b00
717223680[7f503f84e9d0]: DataChannel: String message received of length 574 on channel 1
717223680[7f503f84e9d0]: HandleDataMessage: sending ON_DATA for 7f5016294b00
717223680[7f503f84e9d0]: DataChannel: String message received of length 574 on channel 1
717223680[7f503f84e9d0]: HandleDataMessage: sending ON_DATA for 7f5016294b00
1088251776[7f503f84e370]: Sending to stream 1: 574 bytes
1088251776[7f503f84e370]: Sent buffer (len=574), result=574
1088251776[7f503f84e370]: Sending to stream 1: 574 bytes
1088251776[7f503f84e370]: Sent buffer (len=574), result=574
1088251776[7f503f84e370]: Sending to stream 1: 574 bytes
717223680[7f503f84e9d0]: DataChannel: String message received of length 574 on channel 1
717223680[7f503f84e9d0]: HandleDataMessage: sending ON_DATA for 7f5016294b00
1088251776[7f503f84e370]: Sent buffer (len=574), result=574
717223680[7f503f84e9d0]: DataChannel: String message received of length 574 on channel 1
717223680[7f503f84e9d0]: HandleDataMessage: sending ON_DATA for 7f5016294b00
717223680[7f503f84e9d0]: DataChannel: String message received of length 574 on channel 1
717223680[7f503f84e9d0]: HandleDataMessage: sending ON_DATA for 7f5016294b00
1088251776[7f503f84e370]: Destroying DataChannelConnection 7f5001136a80
1088251776[7f503f84e370]: Closing all channels (connection 7f5001136a80)
1088251776[7f503f84e370]: Connection 7f5001136a80/Channel 7f5016294b00: Closing stream 1
1088251776[7f503f84e370]: Connection 7f5001136a80: Resetting outgoing stream 1
1088251776[7f503f84e370]: Destroying Data channel 1
1088251776[7f503f84e370]: Connection 7f5001136a80: Sending outgoing stream reset for 1 streams
1088251776[7f503f84e370]: Deregistered 7f5001136a80 from the SCTP stack.
717223680[7f503f84e9d0]: Incoming: Channel 1 closed, state 1
717223680[7f503f84e9d0]: Connection 7f5002f52280: Resetting outgoing stream 1
717223680[7f503f84e9d0]: Connection 7f5002f52280: Sending outgoing stream reset for 1 streams
717223680[7f503f84e9d0]: Disconnected DataChannel 7f5002ff7e00 from connection 7f5002f52280
717223680[7f503f84e9d0]: Destroying Data channel 1
717223680[7f503f84e9d0]: Association change: SCTP_COMM_LOST
717223680[7f503f84e9d0]: Association change: streams (in/out) = (16/16)
717223680[7f503f84e9d0]: Association: ABORT =
717223680[7f503f84e9d0]: 0x06
717223680[7f503f84e9d0]: 0x00
717223680[7f503f84e9d0]: 0x00
717223680[7f503f84e9d0]: 0x08
717223680[7f503f84e9d0]: 0x00
717223680[7f503f84e9d0]: 0x0c
717223680[7f503f84e9d0]: 0x00
717223680[7f503f84e9d0]: 0x04
1088251776[7f503f84e370]: Destroying DataChannelConnection 7f5002f52280
1088251776[7f503f84e370]: Closing all channels (connection 7f5002f52280)
1088251776[7f503f84e370]: Deregistered 7f5002f52280 from the SCTP stack.
1088251776[7f503f84e370]: Deleting DataChannelConnection 7f5001136a80
1088251776[7f503f84e370]: Closing all channels (connection 7f5002f52280)
1088251776[7f503f84e370]: DataChannelOnMessageAvailable (1) with null Listener
1088251776[7f503f84e370]: Deleting DataChannelConnection 7f5002f52280
1088251776[7f503f84e370]: Close: 7f5016294b00 no connection!
1088251776[7f503f84e370]: Close: 7f5002ff7e00 no connection!
Expected results:
I would have expected it not to fail, but I could be wrong. Additionally, this test could be quite an edge case, as it creates two data channels communicating with each other in the same browser instance.
Regardless, I thought I probably should report it...
Reporter
Comment 1•11 years ago
One point of note: replicating this in the next couple of days could be frustrating, as I had to make some changes to a node module used by the "stream-tester" module, which helps simulate node stream load.
It's probably worth waiting until this PR is resolved before attempting to replicate the bug:
https://github.com/dominictarr/assertions/pull/2
Cheers,
Damon.
Updated•11 years ago
Component: Untriaged → WebRTC
Product: Firefox → Core
Whiteboard: [bugday-20131211]
Comment 2•11 years ago
There's no indication of why the data channel was closed. Either the JS closed it, the PeerConnection itself closed (timeout on an ICE connectivity/consent check?), or something else higher up went wrong.
Enable datachannel:5,sctp:5,signaling:5,mtransport:5 logging and retry.
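For example, mirroring the command used in the original report (the log file path is just an example):
NSPR_LOG_MODULES=datachannel:5,sctp:5,signaling:5,mtransport:5 NSPR_LOG_FILE=/tmp/nspr.log /opt/browsers/firefox/nightly/firefox http://localhost:9966/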
Also, if you can capture the transfer with Wireshark, that would help too.
Component: WebRTC → WebRTC: Networking
Comment 3•11 years ago
Damon, thanks for reporting this issue! Is this something you may be able to help us test, and attach more information for?
Flags: needinfo?(damon.oehlman)
Comment 4•11 years ago
nspr.log with datachannel:9,sctp:9,signaling:9,mtransport:9,timestamp
Comment 5•11 years ago
Pcap from the test run
Comment 6•11 years ago
stderr output with nICEr logging from the test run
Comment 7•11 years ago
All files were taken from a test run against m-c from today.
From looking at the log file, it looks like the RTCPeerConnection object gets garbage collected, but I could not really figure out why that is the case.
Damon, can you please double-check your code to confirm that the RTCPeerConnection objects stay referenced throughout the whole test?
If there is no longer any reference to them, the garbage collector will close the connections and remove the objects, which also terminates the data channels.
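A minimal sketch of the kind of thing to check for, with hypothetical names (this is not the actual test code):

// Hold the peer connections in a long-lived structure for the duration of the test;
// otherwise the GC may collect them and tear down the association mid-run.
var livePeers = [];

function createPeer(config) {
  var pc = new RTCPeerConnection(config);
  livePeers.push(pc); // referenced until the test explicitly finishes
  return pc;
}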
Comment 8•11 years ago
Also, please make sure that both sides keep references to the DataChannels. Assigning an onmessage handler to a DataChannel received via ondatachannel is currently not enough to keep it alive (there is a bug on this); you also have to keep a reference to the data channel somewhere (a var, for example) to avoid it being garbage collected.
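A minimal sketch of that pattern, with hypothetical names (assuming pc is the receiving RTCPeerConnection):

var liveChannels = []; // long-lived holder so the channel itself stays reachable

pc.ondatachannel = function (event) {
  var channel = event.channel;
  liveChannels.push(channel); // keep the channel itself, not just its handler
  channel.onmessage = function (msg) {
    // handle incoming data here
  };
};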
Thanks
Reporter
Comment 9•11 years ago
(In reply to Liz Henry :lizzard from comment #3)
> damon, thanks for reporting this issue! Is this something you may be able to
> help us test and attach more information?
Sure thing - I'm offline for a few days though, as I'm currently moving house. As soon as I'm back coding again I'll be happy to do whatever I can to help :)
Flags: needinfo?(damon.oehlman)
Reporter
Updated•11 years ago
Flags: needinfo?(damon.oehlman)
Reporter
Comment 10•10 years ago
Apologies for being so incredibly slack on this. I have just retested this against Firefox 30.0 and it is no longer an issue.
Status: UNCONFIRMED → RESOLVED
Closed: 10 years ago
Flags: needinfo?(damon.oehlman)
Resolution: --- → WORKSFORME