Closed Bug 981378 Opened 10 years ago Closed 10 years ago

AMO Favorites’ JSON dump does not always load fully.

Categories

(addons.mozilla.org Graveyard :: API, defect)

Type: defect
Priority: Not set
Severity: normal

Tracking

(Not tracked)

RESOLVED WONTFIX

People

(Reporter: shlomif, Unassigned)

Details

User Agent: Mozilla/5.0 (X11; Linux x86_64; rv:27.0) Gecko/20100101 Firefox/27.0 (Beta/Release)
Build ID: 20140214065137

Steps to reproduce:

I am user "shlomif" on addons.mozilla.org, and I noticed that the Firefox Personas Rotator extension often stops working, despite the fact that I'm logged in. After debugging the JavaScript, I realised that loading this page - https://addons.mozilla.org/en-US/firefox/collections/mine/favorites/format%3Ajson?src=personas-plus-1.7.3 - often results in a very partial download that then times out, which confuses the Personas Rotator (it never gets notified of the failure). My ISP is Bezeq International - https://duckduckgo.com/?q=bezeq%20international - which kinda sucks, but there isn't a lot of competition in the ISP market in Israel.
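
For reference, here is a rough way to check whether the dump arrives intact. This is only a sketch using the fetch API, pasted into the devtools console on addons.mozilla.org while logged in (so the session cookie is sent); the extension itself presumably uses older XMLHttpRequest-style code.

// Sketch: fetch the favorites dump and check whether the body is complete.
// Run from the devtools console on addons.mozilla.org while logged in.
const url =
  "https://addons.mozilla.org/en-US/firefox/collections/mine/favorites/format%3Ajson?src=personas-plus-1.7.3";

async function checkDump() {
  const resp = await fetch(url, { credentials: "include" });
  const body = await resp.text();
  // Content-Length may reflect the compressed size, so the parse check
  // below is the more reliable signal of truncation.
  console.log(`received ${body.length} characters,`,
              `Content-Length: ${resp.headers.get("Content-Length")}`);
  try {
    JSON.parse(body); // a truncated download fails to parse
    console.log("JSON parsed cleanly");
  } catch (e) {
    console.log("JSON is truncated or malformed:", e);
  }
}

checkDump();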


Actual results:

The Personas Rotator extension stops working.


Expected results:

The JSON should download fully each time, and the Personas Rotator extension should keep working.
Hi,  I'm worried there isn't going to be anything we can do for you here.  The next step would be figuring out why it's a partial download - the developer tools might be able to help here (I'm not sure) but it sounds like it's simply timing out or dropping packets. :-/
Hi,

(In reply to Wil Clouser [:clouserw] from comment #1)
> Hi,  I'm worried there isn't going to be anything we can do for you here. 

I see.

> The next step would be figuring out why it's a partial download - the
> developer tools might be able to help here (I'm not sure) but it sounds like
> it's simply timing out or dropping packets. :-/

Yes, maybe - I did not see any failure in the developer tools. Is there a way to detect that the HTTPS stream stalled due to dropped packets or timeouts? Furthermore, that JSON file contains a lot of extraneous information, such as the description, the title, etc. It would be much smaller if it only contained the URLs or the IDs. Another option would be to split it into several smaller chunks and query those separately. Just some ideas.
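
One way to surface a stall from the client side, as a rough sketch using the fetch/streams API (the extension presumably uses XMLHttpRequest, which has similar timeout and progress events), would be to read the body in chunks and treat a long gap between chunks as a failure:

// Sketch: read the response as a stream and abort if no data arrives for
// `stallMs` milliseconds, so a stalled download is reported (the read
// rejects with an AbortError) instead of silently producing a truncated body.
async function fetchWithStallDetection(url: string, stallMs = 15000): Promise<string> {
  const controller = new AbortController();
  const resp = await fetch(url, {
    credentials: "include",
    signal: controller.signal,
  });
  const reader = resp.body!.getReader();
  const decoder = new TextDecoder();
  let text = "";
  let timer = setTimeout(() => controller.abort(), stallMs);
  while (true) {
    const { value, done } = await reader.read(); // rejects if aborted
    clearTimeout(timer);
    if (done) {
      break;
    }
    text += decoder.decode(value, { stream: true });
    timer = setTimeout(() => controller.abort(), stallMs); // reset on progress
  }
  return text;
}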

Regards,

-- Shlomi Fish
Hi,

I've been thinking about how to solve this bug, and I have several ideas. I thought it may be a good idea to allow passing a start index and a count (or a start index and an end index) as GET parameters, so one can retrieve a subset of the results. Together with a query for how many results there are, and assuming the results are sorted in a consistent order, one can retrieve them incrementally.
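
As an illustration of how a client could consume such an interface, here is a sketch; the `start`, `count`, `total` and `addons` names are made up for the example and do not exist in the current API:

// Hypothetical incremental retrieval, assuming the endpoint accepted
// `start` and `count` GET parameters and reported a `total` field.
interface Page {
  total: number;
  addons: { id: number }[];
}

async function fetchAllFavorites(baseUrl: string, count = 50): Promise<{ id: number }[]> {
  const addons: { id: number }[] = [];
  let start = 0;
  let total = Infinity;
  while (start < total) {
    const resp = await fetch(`${baseUrl}?start=${start}&count=${count}`, {
      credentials: "include",
    });
    const page: Page = await resp.json();
    total = page.total;               // consistent ordering is assumed
    if (page.addons.length === 0) {
      break;                          // guard against an empty page
    }
    addons.push(...page.addons);      // each small page is cheap to retry
    start += page.addons.length;
  }
  return addons;
}

With pages this small, a dropped connection only costs one page rather than the whole dump.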

Optionally, one may also wish to support a "streamable" format, where the results can be processed and stored incrementally, like the one I implemented in a different context here: https://metacpan.org/pod/File::Dir::Dumper::Stream::JSON::Writer ; https://metacpan.org/pod/File::Dir::Dumper::Stream::JSON::Reader . This way, the extension can store the first results and ask again only for the ones that did not arrive yet.
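
A sketch of the client side of such a streamable format, assuming one JSON record per line (the exact framing is an assumption, in the spirit of the Stream::JSON writer/reader above):

// Sketch: process a newline-delimited JSON response incrementally,
// handing each complete record to the caller as soon as it arrives.
// Returns how many records were seen, so a retry can ask only for the rest.
async function readJsonStream(
  resp: Response,
  onRecord: (record: unknown) => void,
): Promise<number> {
  const reader = resp.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  let seen = 0;
  while (true) {
    const { value, done } = await reader.read();
    if (done) {
      break;
    }
    buffer += decoder.decode(value, { stream: true });
    let newline: number;
    while ((newline = buffer.indexOf("\n")) >= 0) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (line) {
        onRecord(JSON.parse(line)); // store/process the record right away
        seen += 1;
      }
    }
  }
  return seen;
}

Combined with the count returned here, a retry could ask only for records after the ones already stored.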

Regards,

-- Shlomi Fish
Hi all,

(In reply to Shlomi Fish from comment #3)
> Hi,
> 
> I've been thinking about how to solve this bug, and I have several ideas.
> I thought it may be a good idea to allow passing a start index and a count
> (or a start index and an end index) as GET parameters, so one can retrieve
> a subset of the results. Together with a query for how many results there
> are, and assuming the results are sorted in a consistent order, one can
> retrieve them incrementally.
> 
> Optionally, one may also wish to support a "streamable" format, where the
> results can be processed and stored incrementally, like the one I
> implemented in a different context here:
> https://metacpan.org/pod/File::Dir::Dumper::Stream::JSON::Writer ;
> https://metacpan.org/pod/File::Dir::Dumper::Stream::JSON::Reader . This
> way, the extension can store the first results and ask again only for the
> ones that did not arrive yet.
> 

I may be willing to work on that. Is there agreement that such changes would be accepted? And if so, are there any constraints?

Regards,

-- Shlomi Fish
Thanks for filing this. Due to resource constraints, we are closing bugs that we won't realistically be able to fix. If you have a patch that applies to this bug, please reopen.

For more info see http://micropipes.com/blog/2014/09/24/the-great-add-on-bug-triage/
Status: UNCONFIRMED → RESOLVED
Closed: 10 years ago
Resolution: --- → WONTFIX
Product: addons.mozilla.org → addons.mozilla.org Graveyard