Gecko revision: http://hg.mozilla.org/mozilla-central/rev/1ad9af3a2ab8
Gaia is: https://github.com/mozilla-b2g/gaia/commit/8952898bbc98dd31e25b647203791cf129867ff1

13:23:14 test_cleanup_scard (test_cleanup_sdcard.TestCleanupSDCard) ... FAIL
13:23:14
13:23:14 ======================================================================
13:23:14 FAIL: None
13:23:14 ----------------------------------------------------------------------
13:23:14 Traceback (most recent call last):
13:23:14   File "/var/jenkins/workspace/b2g.hamachi.mozilla-central.unittests/.env/local/lib/python2.7/site-packages/marionette_client-0.7.0-py2.7.egg/marionette/marionette_test.py", line 143, in run
13:23:14     testMethod()
13:23:14   File "/var/jenkins/workspace/b2g.hamachi.mozilla-central.unittests/tests/python/gaia-ui-tests/gaiatest/tests/unit/test_cleanup_sdcard.py", line 20, in test_cleanup_scard
13:23:14     self.assertEqual(len(self.data_layer.media_files), 0)
13:23:14 TEST-UNEXPECTED-FAIL | test_cleanup_sdcard.py test_cleanup_sdcard.TestCleanupSDCard.test_cleanup_scard | AssertionError: 1 != 0
13:23:14 ----------------------------------------------------------------------
13:23:14 Ran 1 test in 43.661s

I'm re-running with a dedicated, single test so I can get a cleaner logcat, which I'll attach.
(In reply to Stephen Donner [:stephend] from comment #1)
> Created attachment 8346899 [details]
> logcat.txt

I tried creating a single-test run, but got our dreaded "socket timeout" error, so this is the full logcat from an earlier unittest-suite run: http://qa-selenium.mv.mozilla.com:8080/view/B2G%20Hamachi/job/b2g.hamachi.mozilla-central.unittests/16/
Is this happening often? It would be useful to know what files are being found. Perhaps we could add some debug to the test to output the list of files on failure.
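To illustrate that debug suggestion: unittest's assertEqual accepts a msg argument, so the test could include the leftover file list in the failure message. A minimal sketch, using a plain function in place of the test method (media_files stands in for self.data_layer.media_files):

```python
def assert_sdcard_empty(media_files):
    # Fail with a message that names the leftover files, instead of the
    # bare "AssertionError: 1 != 0" we currently get from assertEqual.
    assert len(media_files) == 0, (
        "Expected sdcard to be empty, but found %d file(s): %s"
        % (len(media_files), ", ".join(media_files)))
```

With this, the Jenkins log would show which files survived the cleanup rather than just the count mismatch.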
(In reply to Dave Hunt (:davehunt) from comment #3)
> Is this happening often? It would be useful to know what files are being
> found. Perhaps we could add some debug to the test to output the list of
> files on failure.

Yeah, it happens 100% of the time. Bebe, can you have someone take a look, please?
I tried running this unit test on my Hamachi with the latest build and it passes 100% of the time for me. Is this happening on one specific phone?
Oh, I was able to get a failure after 6 runs. Here's what ls says is on the card:

root@android:/sdcard # ls -l
d---rwx--- system   sdcard_rw   2013-12-17 17:41 tests

Does that help?
It looks like there's a 'tests' directory on the sdcard. Is there anything inside it? What permissions does it have? Are you able to delete it?
There is nothing in the directory. I assumed that the output of that ls -l command showed the permissions - is there a way to see more detailed permissions? I *am* able to remove the directory with rmdir.
(In reply to Bob Silverberg [:bsilverberg] from comment #8)
> There is nothing in the directory. I assumed that the output of that ls -l
> command showed the permissions - is there a way to see more detailed
> permissions? I *am* able to remove the directory with rmdir.

Sorry, you're right about the permissions; they're included in comment 6. Can you tell me how you're reproducing this? Are you just running test_cleanup_sdcard over and over? Any idea where this 'tests' folder is coming from? I couldn't find anything obvious in the UI tests. I'm thinking we may want to clear the sdcard during our flash script, and also make cleanup_sdcard more robust, perhaps by checking whether each item has been deleted and repeating the rm until it's gone.
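A sketch of that "repeat the rm until it's gone" idea. The remove and exists callables stand in for the device-manager calls (the real mozdevice method names may differ); this is just the retry shape, not the actual cleanup_sdcard patch:

```python
import time

def remove_until_gone(remove, exists, path, timeout=10, interval=0.5):
    """Repeatedly call remove(path) until exists(path) reports it gone.

    remove/exists are hypothetical stand-ins for device-manager calls
    such as removeDir/dirExists. Returns True if the path disappeared
    within the timeout, False otherwise.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        remove(path)
        if not exists(path):
            return True
        time.sleep(interval)
    return False
```

The timeout keeps a genuinely undeletable entry (e.g. a permissions problem) from hanging the cleanup forever.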
Yes, I reproduced this by just running that one test in a loop. I have no idea where that 'tests' folder is coming from, but I can try to investigate. I will also take a look at making `cleanup_sdcard` more robust, as you suggest.
Assignee: nobody → bob.silverberg
Status: NEW → ASSIGNED
I believe what is happening is that the call to `device.manager.removeDir` (https://github.com/mozilla-b2g/gaia/blob/master/tests/python/gaia-ui-tests/gaiatest/gaia_test.py#L837) runs asynchronously, so sometimes the deletion isn't complete before the assertion at https://github.com/mozilla-b2g/gaia/blob/master/tests/python/gaia-ui-tests/gaiatest/tests/unit/test_cleanup_sdcard.py#L20 runs. I observed this by adding logging to `cleanup_sdcard` and saw that sometimes the deletion doesn't happen immediately. I think the best solution is to change the test to wait for `data_layer.media_files` to become empty, instead of asserting on it immediately. I will do that in a PR and attach it to this bug.
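A minimal sketch of that wait-for-empty approach, written as a plain polling loop rather than whatever wait helper the harness actually provides; get_items stands in for reading data_layer.media_files:

```python
import time

def wait_for_empty(get_items, timeout=10, interval=0.5):
    """Poll get_items() until it returns an empty list, or fail.

    get_items is a hypothetical stand-in for a property read such as
    data_layer.media_files; it is re-evaluated on every iteration so
    an asynchronous deletion has time to complete.
    """
    deadline = time.time() + timeout
    while True:
        items = get_items()
        if len(items) == 0:
            return
        if time.time() > deadline:
            raise AssertionError("media files never emptied: %r" % items)
        time.sleep(interval)
```

In the test this would replace the immediate assertEqual, turning the race into (at worst) a bounded wait.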
Looks like Bob is on the case. Thanks for taking care of this!
Created attachment 8349379 [details] [review] Link to Github pull-request: https://github.com/mozilla-b2g/gaia/pull/14809
Comment on attachment 8349379 [details] [review]
Link to Github pull-request: https://github.com/mozilla-b2g/gaia/pull/14809

I agree with Dave on this one: we're just fixing the test here, not the underlying issue.
I wasn't sure whether the existing mozdevice behaviour was expected, so rather than monkey with that I figured I'd just make the test cope with it. Now that I understand the behaviour is not intended, I have submitted a patch to mozbase via bug 951773. Once that lands and is brought into the project I will test this again. My local tests with a patched mozbase do show that the problem is gone.
Depends on: 951773
Attachment #8349379 - Attachment is obsolete: true
Awesome, thanks Bob.
Created attachment 8349552 [details] [review] Link to Github pull-request: https://github.com/mozilla-b2g/gaia/pull/14816
Attachment #8349552 - Flags: review?(dave.hunt)
Comment on attachment 8349552 [details] [review]
Link to Github pull-request: https://github.com/mozilla-b2g/gaia/pull/14816

Looks good! The Travis-CI failure appears to be unrelated.
Attachment #8349552 - Flags: review?(dave.hunt) → review+
Landed on master in https://github.com/mozilla-b2g/gaia/commit/cf0ab53b442d78529bb160c2a27115ceff92fe0f
Status: ASSIGNED → RESOLVED
Last Resolved: 5 years ago
Resolution: --- → FIXED
I had to back this out because it was breaking Gaia UI tests, e.g. https://tbpl.mozilla.org/php/getParsedLog.php?id=32216951&tree=B2g-Inbound
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
jgriffin: How do we get TBPL to pick up the latest version of mozdevice?
(In reply to Dave Hunt (:davehunt) from comment #21)
> jgriffin: How do we get TBPL to pick up the latest version of mozdevice?

This is already in the works as part of bug 949600, which moves mozbase from github to the tree. However, if you need this sooner, we could mirror only mozdevice (and any dependencies) separately.
No rush, this is for fixing an issue we have only seen on device, and that's already fixed because we were previously not pinning the mozdevice version anyway.
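For context, "pinning" here just means requiring an exact mozdevice release in the harness's Python requirements rather than an open-ended one; an illustrative requirements fragment (the 0.33 version number is the release mentioned later in this bug, not necessarily what the file actually says):

```
# pinned: pip always installs exactly this release
mozdevice==0.33

# unpinned: pip installs whatever the latest release happens to be
mozdevice
```

Pinning trades automatic fixes for reproducibility: a pinned harness won't silently pick up regressions, but also needs an explicit bump to get fixes like the one in bug 951773.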
Depends on: 949600
Can this bug be closed? Do we need to wait for something to happen so we can pin mozdevice, or should we just keep it unpinned? What do you think, Dave?
We shouldn't close this until it's fixed. See comment 22 and the dependent bugs.
Now that bug 949600 is resolved and TBPL is on mozdevice 0.33, I have relanded this change: https://github.com/mozilla-b2g/gaia/commit/c6d6c106494745b54d562434936689419b4c5996
Status: REOPENED → RESOLVED
Last Resolved: 5 years ago → 4 years ago
Resolution: --- → FIXED
Verified FIXED; sorry for the delay!
Status: RESOLVED → VERIFIED