Closed Bug 961335 Opened 6 years ago Closed 3 years ago

Write unit tests for the ICODecoder

Categories

(Firefox for Android :: Favicon Handling, defect)

All
Android
defect
Not set

Tracking


RESOLVED FIXED
Firefox 51
Tracking Status
firefox51 --- fixed

People

(Reporter: ckitching, Assigned: ronoueb, Mentored)

References

Details

(Whiteboard: [lang=java][good first bug][exterminationweek])

Attachments

(2 files)

Followup to Bug 748100 - write some shiny unit tests for it and verify nothing fruity occurs when it's faced with favicons even more bizarrely broken than the readily-available ones on the internet.

Some favicons that I know have amusing properties off the top of my head:
NVIDIA.com has an ICO containing three different sizes, each at three different colour depths - a most nifty property for unit testing. These are bitmap payloads.
Microsoft.com has an ICO containing six sizes, going up as far as 128x128 - probably handy for verifying that the "bin sizes we don't care about" logic works. It also (IIRC) claims that every one of these has a colour depth of zero... (Also BMP payloads.)
Golem.de has an ICO containing a really odd selection of 5 sizes - four small ones and one gigantic 256x256 one (the largest possible size). These are, I think, PNG payloads.
Those ones will be handy for a start. Tactfully maim a few of them to be corrupt or otherwise malformed and, well, you know the drill. Please make the random-data test deterministic (i.e. seed any randomness so test inputs are reproducible). I'll stop stating the obvious now.

Enjoy.
Hardware: ARM → All
Whiteboard: [mentor=rnewman][lang=java][good first bug]
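For anyone picking this up: the examples in comment 0 all exercise the ICO directory header, so here is a rough sketch of how those directory entries are laid out - including the two quirks mentioned above: a width/height byte of 0 means 256 (the Golem.de case), and the bit-count field can legitimately be 0 (the Microsoft case). This is illustrative only; the class and method names are made up and this is not Mozilla's decoder.

```java
import java.util.ArrayList;
import java.util.List;

public class IcoDirectorySketch {
    // ICO files are little-endian; read an unsigned 16-bit value.
    static int u16(byte[] d, int off) {
        return (d[off] & 0xFF) | ((d[off + 1] & 0xFF) << 8);
    }

    /** One 16-byte ICONDIRENTRY; a width/height byte of 0 means 256. */
    public static String describeEntry(byte[] d, int off) {
        int width  = (d[off] & 0xFF) == 0 ? 256 : (d[off] & 0xFF);
        int height = (d[off + 1] & 0xFF) == 0 ? 256 : (d[off + 1] & 0xFF);
        int bpp    = u16(d, off + 6); // may legitimately be 0, as in the Microsoft favicon
        return width + "x" + height + "@" + bpp + "bpp";
    }

    /** Parse the 6-byte ICONDIR header, then each 16-byte entry. */
    public static List<String> describe(byte[] ico) {
        if (u16(ico, 0) != 0 || u16(ico, 2) != 1) {
            throw new IllegalArgumentException("Not an ICO file");
        }
        int count = u16(ico, 4);
        List<String> out = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            out.add(describeEntry(ico, 6 + 16 * i));
        }
        return out;
    }

    public static void main(String[] args) {
        // Hand-built header: one 256x256 entry at 32 bpp.
        byte[] ico = new byte[6 + 16];
        ico[2] = 1;      // type = icon
        ico[4] = 1;      // one directory entry
        ico[6] = 0;      // width byte 0 => 256
        ico[7] = 0;      // height byte 0 => 256
        ico[12] = 32;    // bit count
        System.out.println(describe(ico)); // [256x256@32bpp]
    }
}
```

Hand-building headers like this is also a convenient way to construct the corrupt inputs mentioned above without shipping binary fixtures.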
Hi,
I would like to work on this.
I'm new to this, though - can someone help me with it?
(In reply to Dulanja Wijethunga [:dwij] from comment #1)
> Hi,
> I would like to work on this.
> I'm new to this, though - can someone help me with it?

Sure!

First you need to get set up for development: Android developer tools, the Mozilla codebase, etc.

Start by reading these:

https://wiki.mozilla.org/Mobile/Get_Involved
https://developer.mozilla.org/docs/Introduction

Those should be enough to get you a checkout and get a working build of Firefox for Android that you can run on your own Android device.

Come find us in IRC if you get stuck: #mobile at irc://irc.mozilla.org/mobile .

Let me know when you're up and running!
Whiteboard: [mentor=rnewman][lang=java][good first bug] → [mentor=rnewman][lang=java][good first bug][exterminationweek]
Hi, Dulanja - Are you still interested in working on this?
(In reply to Mike Hoye [:mhoye] from comment #3)
> Hi, Dulanja - Are you still interested in working on this?
These days I'm busy with my academic work, so feel free to work on this :)
Mentor: rnewman
Whiteboard: [mentor=rnewman][lang=java][good first bug][exterminationweek] → [lang=java][good first bug][exterminationweek]
Hi, I'm interested in working on this, but before I request it to be assigned to me, I want to know if I understood the problem correctly and to see if I am capable of fixing this.

For this process, I'd need to create something like a TestICODecoder.java in: https://mxr.mozilla.org/mozilla-central/source/mobile/android/tests/browser/junit3/src/tests/

From examining the other tests in that folder, the one I need to create should also extend BrowserTestCase, right?

Then I'd need to create a public void testICODecoder() method which then uses this: https://mxr.mozilla.org/mozilla-central/source/mobile/android/base/favicons/decoders/ICODecoder.java in executing the tasks you described in the bug description.

If I am in the right direction, kindly assign me this bug.
(In reply to Franz Sarmiento from comment #5)

> From examining the other tests in that folder, the one I need to create
> should also extend BrowserTestCase, right?

That'll do it, yes.


> Then I'd need to create a public void testICODecoder() method which then
> uses this:
> https://mxr.mozilla.org/mozilla-central/source/mobile/android/base/favicons/
> decoders/ICODecoder.java in executing the tasks you described in the bug
> description.

Yup. Note that you can't fetch stuff from the internet during test runs, so you'll need to prepare local test data to exercise various cases.
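Since test runs can't touch the network, preparing deterministic "damaged" variants of a local favicon is one way to cover the corruption cases; a fixed seed keeps the test reproducible, per comment 0. A minimal sketch (class and method names are hypothetical):

```java
import java.util.Random;

public class CorruptFaviconSketch {
    /** XOR bytes at pseudo-random offsets, reproducibly, thanks to the fixed seed. */
    public static byte[] corrupt(byte[] original, long seed, int flips) {
        final byte[] copy = original.clone();
        final Random rng = new Random(seed); // fixed seed => identical "random" damage every run
        for (int i = 0; i < flips; i++) {
            final int offset = rng.nextInt(copy.length);
            copy[offset] ^= (byte) (1 + rng.nextInt(255)); // nonzero XOR always changes the byte
        }
        return copy;
    }

    public static void main(String[] args) {
        byte[] fakeIco = new byte[64]; // stand-in for a real favicon loaded from test resources
        byte[] a = corrupt(fakeIco, 961335L, 8);
        byte[] b = corrupt(fakeIco, 961335L, 8);
        System.out.println(java.util.Arrays.equals(a, b)); // true: same seed, same corruption
    }
}
```

The decoder can then be fed `corrupt(...)` output and asserted not to crash or return nonsense, with failures always reproducible from the seed.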


> If I am in the right direction, kindly assign me this bug.

Done! Ping me or ckitching in IRC if you need help.
Assignee: nobody → franzks
Status: NEW → ASSIGNED
Component: General → Favicon Handling
As explained in IRC, I still can't generate an Eclipse backend using this command:
./mach build-backend -b AndroidEclipse

I've also tried the command from nalexander's blog:
./mach build-backend --backend=AndroidEclipse

A full build and package were done before running both commands:
./mach build && ./mach package
Both were successful.

The "android_eclipse" folder contains seemingly empty project files - only an empty "gen" folder, a "lint.xml" file, and a "project.properties" file.

I'm trying to do this on a freshly pulled master branch, so all files are untouched. This was also tested on 2 different machines, an Ubuntu box and a Mac, and both give the same output.

The docs and wiki plus a bit of Googling have led me nowhere so far in trying to run these tests. Any ideas what I'm doing wrong?
(In reply to Franz Sarmiento from comment #7)

> The docs and wiki + a bit of Googling has lead me nowhere so far in trying
> to run these tests. Any ideas what I'm doing wrong?

Over to Nick, I'm afraid. I'm seeing the same thing.
(In reply to Franz Sarmiento from comment #5)
> For this process, I'd need to create something like a TestICODecoder.java
> in:
> https://mxr.mozilla.org/mozilla-central/source/mobile/android/tests/browser/
> junit3/src/tests/

This will actually be a junit 4 test now:
  https://dxr.mozilla.org/mozilla-central/source/mobile/android/tests/background/junit4/src/org/mozilla/gecko

See https://wiki.mozilla.org/Mobile/Fennec/Android/Testing#Quick_reference for an overview of tests, and click through for more information on how to run individual tests (which I recommend).
 
> From examining the other tests in that folder, the one I need to create
> should also extend BrowserTestCase, right?

JUnit 4 doesn't use inheritance to define test cases.

> Then I'd need to create a public void testICODecoder() method which then
> uses this:
> https://mxr.mozilla.org/mozilla-central/source/mobile/android/base/favicons/
> decoders/ICODecoder.java in executing the tasks you described in the bug
> description.

Yep.

(In reply to Franz Sarmiento from comment #7)
> As explained in IRC, I still can't generate an Eclipse backend using this
> command:
> ./mach build-backend -b AndroidEclipse

We've moved to a new build system, so this should be more straightforward.
Assignee: franzks → nobody
Mentor: michael.l.comella, ahunt
Status: ASSIGNED → NEW
First time Fennec contributor here :) are these unit tests still needed?
That's a good question. Rnewman?
Flags: needinfo?(rnewman)
Presumably; the code still exists. Over to Sebastian.
Flags: needinfo?(rnewman) → needinfo?(s.kaspari)
Mentor: rnewman
Flags: needinfo?(s.kaspari) → needinfo?(ahunt)
(In reply to bd339 from comment #10)
> First time Fennec contributor here :) are these unit tests still needed?

Yup, these tests would definitely still be useful! (I don't think the code has actually changed significantly in the past year or so, and we probably won't get rid of it anytime soon.)
Flags: needinfo?(ahunt)
Nice to know they will still be useful. However, I went somewhere today and won't return to my "workstation" until the 26th of this month. I thought I could do it over this past weekend, but obviously that's too late now. Unless it's been fixed in the meantime, expect that I will look into getting it done some time shortly after my return on the 26th :)
Just wanted to know how this looks as a unit test for the NVIDIA favicon mentioned by Chris in comment 0.
Attachment #8776089 - Flags: feedback?(ahunt)
Assignee: nobody → bd339
Status: NEW → ASSIGNED
I have run into some difficulties testing that the best fit bitmap for a given target size is selected correctly, for which Chris suggested the Microsoft favicon.
This is what that test looks like so far: https://gist.github.com/bd339/f894c49050658618a91f7798f7465079
But unfortunately `getWidth` returns 100 and so does `getHeight` - even though such a bitmap is not even part of the ICO!
It might be because Robolectric doesn't fully implement Bitmap. Therefore Sebastian suggested over IRC that I write a Robocop test that runs on the device, using a similar method to obtain the ICO file as this test uses to obtain resources: https://dxr.mozilla.org/mozilla-central/source/mobile/android/tests/browser/robocop/src/org/mozilla/gecko/tests/testReadingListToBookmarksMigration.java

Is this something we want to do instead of the JUnit 4 test suggested by Michael? And if so, should it just be this particular test (with the rest as JUnit 4), or the entire test suite?
Flags: needinfo?(michael.l.comella)
(In reply to bd339 from comment #16)
> Is this something we want to do instead of the junit 4 test suggested by
> Michael?

Unit tests are generally preferable [1] but if it's not possible (which it sounds like it's not), then we can do an integration test.

> And if so, should it just be this particular test (the rest junit
> 4) or this entire unit test?

Assuming it won't take too much time (use your judgement!), I would write this particular test as an integration test & the rest as JUnit.

Unit testing is preferred so we should test as much as possible as unit tests.

[1]: Unit tests test code in isolation, which means when they fail, they're generally failing because the code you're testing is broken and you know what to fix – this is ideal.
     On the other hand, integration tests require us to start fennec and perform some action, which can fail when unrelated code fails (e.g. gecko start up code). Additionally, they take longer to run & they run on device, making them more flaky than desktop tests.
Flags: needinfo?(michael.l.comella)
I've finished the "Microsoft favicon test" as an integration test, while trying to keep the others as JUnit 4. Unfortunately, it seems this will not be possible. I went back to improve the "NVIDIA favicon test" and discovered that not even decoding actually works properly as a JUnit 4 test.

When decoding, we discard all but the smallest bitmap larger than `Favicons.largestFaviconSize`[1] so it is obviously important that this value is set realistically. Problem is, in a JUnit 4 test it is merely set to zero as it depends on screen density[2].

Assuming I go back and make the "NVIDIA favicon test" an integration test too, I will be faced with a problem. I really need access to `iconDirectory`[3] to verify that the correct entries are kept after decoding. With a JUnit 4 test it was easy to make sure the test was in the `org.mozilla.gecko.favicons.decoders` package, but that doesn't seem easy with a test in the Robocop suite. How do I deal with this Michael?

[1]: https://dxr.mozilla.org/mozilla-central/source/mobile/android/base/java/org/mozilla/gecko/favicons/decoders/ICODecoder.java#166
[2]: https://dxr.mozilla.org/mozilla-central/source/mobile/android/base/java/org/mozilla/gecko/favicons/Favicons.java#493
[3]: https://dxr.mozilla.org/mozilla-central/source/mobile/android/base/java/org/mozilla/gecko/favicons/decoders/ICODecoder.java#87
Flags: needinfo?(michael.l.comella)
(In reply to bd339 from comment #18)
> I really need access to
> `iconDirectory`[3] to verify that the correct entries are kept after
> decoding. With a JUnit 4 test it was easy to make sure the test was in the
> `org.mozilla.gecko.favicons.decoders` package, but that doesn't seem easy
> with a test in the Robocop suite. How do I deal with this Michael?

It's hard to access the code directly: Robocop tests are specifically designed to test UI, which means they're intended to be interacted with and verified directly through the UI, while black-boxing the code.

That being said, we've broken that convention before (e.g. testBrowserProvider & friends) and I suppose we can do it again. :)

Without seeing your code (and being unfamiliar with the favicon code), it's hard to say what is ideal (feel free to share your unfinished code). But with that, some ideas:
  * In the test, you should be able to make a call like `final BrowserApp browserApp = (BrowserApp) getActivity();`. You can try to access the ICODecoder from there
  * You can store a static reference to the last used ICODecoder somewhere, preferably only for test code.
  * If you're calling out to the load favicon methods directly, you could store the ICODecoders in the results.

Just be careful to reduce the impact on non-test code, e.g. don't keep a long-lived reference to a large Object. Also, be sure to explicitly comment any code added to the main code base purely for testing.

Let me know if you need more help.
Flags: needinfo?(michael.l.comella)
Thanks Michael. Here is the code for the "NVIDIA favicon test" - the one that will probably have to be made into a Robocop test unfortunately: https://gist.github.com/bd339/9ff05abec57ee1fb44d4e4e52b96f01e
I hope the source code will make clearer specifically what I'm trying to verify and what I need access to.

Basically it tests whether we correctly discard the entries we don't consider the best when multiple bitmaps have the same width. The NVIDIA ICO file in question has nine bitmaps: 16x16, 32x32 and 48x48, with each of those sizes coming in 4 bpp, 8 bpp and 32 bpp versions. The test can verify that we discard the poorest options because it is able to access `iconDirectory` on the `ICODecoder` instance created in the test. Since that member has package visibility, we have to make sure the test is in the same package as the `ICODecoder` class - something that's quite easy in a JUnit 4 test. So my question is: how do I write an equivalent Robocop test? For reference, here is the "Microsoft favicon test" I have successfully written as a Robocop test: https://gist.github.com/bd339/21a3b4e6bbb909f7cf7f64ba3e0e6fd5

Time for a bonus question :)
When we notice we have already encountered a bitmap of the same width in an ico file, we check whether we should replace the old entry with the new entry as our preferred bitmap for that size, by comparing them according to [1]. You'll notice that only if both bitmaps exceed the maximum colour depth will we prefer the one that has lower bpp. If just one exceeds the maximum colour depth we will instead prefer the one with higher bpp. I wonder if this is correct behaviour? For example in the "NVIDIA favicon test" we will prefer the 32 bpp bitmaps to their same-size-alternatives with lower bpp, even though the max bpp will always be set to either 16 or 24 at [2] & [3].

[1]: https://dxr.mozilla.org/mozilla-central/source/mobile/android/base/java/org/mozilla/gecko/favicons/decoders/IconDirectoryEntry.java#125
[2]: https://dxr.mozilla.org/mozilla-central/source/mobile/android/base/java/org/mozilla/gecko/BrowserApp.java#754
[3]: https://dxr.mozilla.org/mozilla-central/source/mobile/android/base/java/org/mozilla/gecko/GeckoAppShell.java#1147
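To make the question above concrete, the comparison rule as described - prefer the lower bpp only when both entries exceed the maximum colour depth, otherwise prefer the higher bpp - can be sketched as follows. This is my reading of the comment, not the actual IconDirectoryEntry code, and the names are made up:

```java
public class BppPreferenceSketch {
    /**
     * Should candidateBpp replace currentBpp for an icon of the same size?
     * Per the rule described above: prefer lower bpp only when BOTH exceed
     * maxBpp; otherwise prefer the higher bpp (letting Android downsample).
     */
    public static boolean shouldReplace(int currentBpp, int candidateBpp, int maxBpp) {
        if (currentBpp > maxBpp && candidateBpp > maxBpp) {
            return candidateBpp < currentBpp; // both too deep: take the cheaper one
        }
        return candidateBpp > currentBpp;     // otherwise: deeper is better
    }

    public static void main(String[] args) {
        // NVIDIA-style 48x48 entries at 4, 8 and 32 bpp with maxBpp = 24:
        int best = 4;
        for (int bpp : new int[]{8, 32}) {
            if (shouldReplace(best, bpp, 24)) best = bpp;
        }
        System.out.println(best); // 32: only one side exceeds maxBpp, so higher bpp wins
    }
}
```

Running the NVIDIA case through this rule does end up at 32 bpp even with a maxBpp of 16 or 24, which is exactly the behaviour being questioned.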
Flags: needinfo?(michael.l.comella)
Sorry for the delay.

Honestly, I think the easiest thing to do is add a public getter with the `@VisibleForTesting` annotation. Since this method will be unused in the main code base, it'd probably be removed by Proguard, so also annotate it with `@RobocopTarget` (which makes sure the code is not removed).
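A sketch of what such an annotated test-only getter could look like. The annotations here are local placeholders standing in for the real Android `@VisibleForTesting` and Mozilla `@RobocopTarget` annotations, and the class itself is hypothetical, not the real ICODecoder:

```java
import java.util.Collections;
import java.util.List;

// Placeholder annotations; the real ones come from the Android support
// library and Mozilla's annotation processor respectively.
@interface VisibleForTesting {}
@interface RobocopTarget {}

public class DecoderSketch {
    private final List<String> iconDirectory; // stand-in for the package-visible member

    public DecoderSketch(List<String> iconDirectory) {
        this.iconDirectory = iconDirectory;
    }

    /**
     * Test-only accessor: @VisibleForTesting documents the intent, and
     * @RobocopTarget would keep Proguard from stripping this otherwise-unused
     * method from release builds.
     */
    @VisibleForTesting
    @RobocopTarget
    public List<String> getIconDirectory() {
        return Collections.unmodifiableList(iconDirectory);
    }

    public static void main(String[] args) {
        DecoderSketch d = new DecoderSketch(java.util.Arrays.asList("16x16", "32x32"));
        System.out.println(d.getIconDirectory().size()); // 2
    }
}
```

Returning an unmodifiable view keeps the test from accidentally mutating decoder state, which limits the getter's impact on non-test code.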


(In reply to bd339 from comment #20)
> When we notice we have already encountered a bitmap of the same width in an
> ico file, we check whether we should replace the old entry with the new
> entry as our preferred bitmap for that size, by comparing them according to
> [1]. You'll notice that only if both bitmaps exceed the maximum colour depth
> will we prefer the one that has lower bpp. If just one exceeds the maximum
> colour depth we will instead prefer the one with higher bpp. I wonder if
> this is correct behaviour?

To be honest, I don't know much about favicon formats but that does sound fishy to me. Let me see if I can find someone with more knowledge.

A few notes from looking at the Microsoft favicon test:
  * UITest is preferred over BaseTest for the super class (BaseTest was the system before UITest, and it seems they're poorly named :\ ). Can you try changing it over? That being said, if the test breaks from changing it over, I don't think it's worth the effort to fix.
  * For assertions, you should use the `fAssert*` methods in the org.mozilla.gecko.tests.helpers.AssertionHelper class. The `assert*` methods are the built-in methods which don't work with our test harness.
(In reply to bd339 from comment #20)
> Time for a bonus question :)
> When we notice we have already encountered a bitmap of the same width in an
> ico file, we check whether we should replace the old entry with the new
> entry as our preferred bitmap for that size, by comparing them according to
> [1]. You'll notice that only if both bitmaps exceed the maximum colour depth
> will we prefer the one that has lower bpp. If just one exceeds the maximum
> colour depth we will instead prefer the one with higher bpp. I wonder if
> this is correct behaviour? For example in the "NVIDIA favicon test" we will
> prefer the 32 bpp bitmaps to their same-size-alternatives with lower bpp,
> even though the max bpp will always be set to either 16 or 24 at [2] & [3].

Having read through Bug 748100, I think the code is right:

To me "maxBPP" seems to be a misleading name, "desiredBPP" would be more appropriate since we're trying to find the icon that best matches the screen depth (as opposed to finding an icon that is <= desired depth). There's no definitive scientific answer, but selecting a larger BPP (and allowing Android to downsample) seems to have delivered better results than showing a lower depth image, hence that's what was implemented:

"This current approach for selecting best BPP appears to work well in all cases I can find. (Including the crazy NVidia.com favicon). Yay!"
(end of https://bugzilla.mozilla.org/show_bug.cgi?id=748100#c31 )

It's probably worth adding comments to explain this while we're adding the test?
(In reply to Andrzej Hunt :ahunt from comment #22)
> It's probably worth adding comments to explain this while we're adding the
> test?

Sounds good to me. :) Flag :ahunt on review for the comments please.
Flags: needinfo?(michael.l.comella)
Sure thing, I'll get around to it soon and will ask :ahunt for review when it's ready, or for further questions. I'll make an exception though :)
Out of curiosity, can you tell me why you're not working on fennec at the moment :mcomella?
Comment on attachment 8776089 [details] [diff] [review]
Unit test for NVIDIA favicon

This no longer needs feedback; this test code is obsolete now.
Attachment #8776089 - Flags: feedback?(ahunt)
I have been writing a test case for the Golem.de favicon. The way I understand comment 0, the test is basically supposed to verify that the Golem.de favicon has PNG payloads. Assuming all payloads were PNG made the test fail, but I think it is correct to have the test expect only the 256x256 bitmap to be a PNG payload. I think so because, according to Chris' explanation in https://bugzilla.mozilla.org/show_bug.cgi?id=748100#c10, PNG payloads are literal PNG images contained in the ICO file. As such you would expect the full PNG header to be present (and the current ICODecoder code checks for this), but when I look at the downloaded favicon I only see one PNG header (starting at the offset where the 256x256 bitmap is supposed to be).
So long story short: I think only the 256x256 mentioned in comment 0 is a PNG, which is good because then the code seems to work.
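Checking for a literal PNG payload amounts to looking for the eight-byte PNG signature at the entry's image offset. A self-contained sketch of that check (illustrative names, not the actual ICODecoder code):

```java
public class PngPayloadSketch {
    // The eight-byte PNG file signature: \x89 'P' 'N' 'G' \r \n \x1a \n
    private static final byte[] PNG_SIGNATURE = {
            (byte) 0x89, 'P', 'N', 'G', 0x0D, 0x0A, 0x1A, 0x0A
    };

    /** True if the payload starting at `offset` begins with a literal PNG signature. */
    public static boolean isPngPayload(byte[] data, int offset) {
        if (offset < 0 || offset + PNG_SIGNATURE.length > data.length) {
            return false;
        }
        for (int i = 0; i < PNG_SIGNATURE.length; i++) {
            if (data[offset + i] != PNG_SIGNATURE[i]) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // Fake ICO-like buffer with a PNG signature planted at offset 8.
        byte[] fake = new byte[16];
        System.arraycopy(PNG_SIGNATURE, 0, fake, 8, PNG_SIGNATURE.length);
        System.out.println(isPngPayload(fake, 8) + " " + isPngPayload(fake, 0)); // true false
    }
}
```

Applied to the downloaded Golem.de favicon, a check like this should come back true only at the 256x256 entry's offset, matching the observation above.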

Only the test cases where we intentionally break the files are left to do.
Comment on attachment 8787965 [details]
Bug 961335 - ICODecoder Robocop tests;

https://reviewboard.mozilla.org/r/76520/#review75598

Looks good to me! I've pushed this to try (our CI system), and can land the patch for you if/when that gives the green light!

Thank you for adding these tests : )

::: mobile/android/tests/browser/robocop/src/org/mozilla/gecko/tests/testICODecoder.java:40
(Diff revision 1)
> +    /**
> +     * Decode and verify a Microsoft favicon with six different sizes:
> +     * 128x128, 72x72, 48x48, 32x32, 24x24, 16x16
> +     * Each of the six BMPs supposedly has zero colour depth.
> +     */
> +    private void testMicrosoftFavicon() throws Exception {

Minor nit: it's good to be as specific as possible with Exception declarations, i.e. IOException instead of Exception!
Attachment #8787965 - Flags: review?(ahunt) → review+
Hmm... something is definitely not right on the Try run. I'm not sure about the rc2 failures - they don't look related to this. The rc4 failures on the other hand, are obviously this test failing. We're seeing "Exception caught during test! - java.lang.NoSuchMethodError: org.mozilla.gecko.icons.decoders.ICODecoder.<init>" which seems to me like Proguard's doing on the constructor (is Proguard actually enabled on Try?). In addition to being marked as a Robocop target, should the constructor also be marked as "visible for testing"? The tests definitely ran fine on my device, so perhaps Proguard is behaving differently on Try because it is a different build config.
ahunt, can you confirm whether Proguard is used on Try? If so, do you think annotating the constructor with VisibleForTesting would help? Also, what do you reckon about those rc2 failures?
Flags: needinfo?(ahunt)
Forget what I said in comment 30. It seems that when I rebased onto changes to the ICODecoder class, including the constructor, I forgot to add the RobocopTarget annotation back on the constructor.
Flags: needinfo?(ahunt)
(In reply to B. Dahse from comment #31)
> Forget what I said in comment 30. It seems that when I rebased onto changes
> to the ICODecoder class, including the constructor, I forgot to add the
> RobocopTarget annotation back on the constructor.

I've added it back and did a second try run which succeeded (for some reason that didn't post here automatically):
https://treeherder.mozilla.org/#/jobs?repo=try&revision=5825426d6595

I ran the original patch locally without issues, so it seems Proguard is being more aggressive on try?
Indeed. After noticing the Try failure I tried running the original patch locally and, as I suspected, it didn't throw exceptions. So something is definitely different in Proguard between the builds developers make locally and the builds the Try server produces.
https://hg.mozilla.org/mozilla-central/rev/c46176e993d5
Status: ASSIGNED → RESOLVED
Closed: 3 years ago
Resolution: --- → FIXED
Target Milestone: --- → Firefox 51