Closed Bug 625417 Opened 14 years ago Closed 13 years ago

Create a REST webservice for build lookup

Categories

(Release Engineering :: General, enhancement, P3)

Tracking

(Not tracked)

RESOLVED DUPLICATE of bug 487036

People

(Reporter: christian, Assigned: laura)


It'd be really, really nice to have a webservice for build/nightly lookup and cross-reference. This will reduce the scraping and script maintenance others have to do and should shield the organization from potentially disruptive layout changes such as bug 449607 and bug 487036.

I'd like something like:

* GET on [whatever]/builds/nightlies/[date] returns JSON containing builds from that date for all repos, with ids, links to download on FTP, etc.

* GET on [whatever]/builds/nightlies/[repo name]/[date] returns JSON containing builds from that date for a single repo, with ids, links to download on FTP, etc.

* GET on [whatever]/build/[build_id] returns JSON with the hg changeset, build date, platform, repo, etc. (rough sketch of a response below)
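To make that concrete, a GET on [whatever]/build/20110113030000 might return something like the following. Every field name and value here is a strawman, not a proposed schema:

{
  "build_id": "20110113030000",
  "repo": "mozilla-central",
  "changeset": "http://hg.mozilla.org/mozilla-central/rev/0123456789ab",
  "date": "2011-01-13",
  "platform": "linux-i686",
  "ftp": "http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/2011/01/2011-01-13-03-mozilla-central/"
}

For regression hunting, consumers really only need the changeset and the FTP link; everything else is convenience.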

This service can be backed by the existing data on FTP and can store a cache in a database. A future step would have the build system report directly into it.
It'd also be nice to have something like /latest/ that would return the latest build in a given category, etc.
yoink
Assignee: nobody → catlee
Priority: -- → P3
How can we help with this?
I haven't put much thought into this other than what to use as the data source... Our options are scraping FTP and mining our build status database, both of which have problems.

Dumps of our status database are available at http://build.mozilla.org/builds/ if you want to poke around there.
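If anyone wants to poke at those dumps programmatically, something like this should be all it takes. Note the file name and the gzipped-JSON layout are guesses on my part; check the directory listing for what's actually published:

import gzip
import json
import urllib.request

# Hypothetical dump name; browse http://build.mozilla.org/builds/ for
# the real files and adjust.
URL = "http://build.mozilla.org/builds/builds.js.gz"

raw = urllib.request.urlopen(URL).read()
builds = json.loads(gzip.decompress(raw))  # assumes a gzipped JSON list
for build in builds[:5]:
    print(build)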
Assignee: catlee → laura
The repo I linked above has code for both scraping FTP and parsing the buildbot JSON. Neither is perfect, but they're both usable sources of data. If we get something like this online, ideally it'd use pulse for future updating, and we can backfill historical data through whatever means necessary. (I'd like to backfill to the beginning of hg history, at least, even if I have to resort to automated downloading of builds from FTP.)
Talked to catlee: we should just be able to hook up to the build database directly, or to a read-only slave of it.  That avoids scraping FTP or buildbot JSON.  Do we need data that's not in that database?

Ted: thanks for the code link.  Not sure how pulse ties in - can you explain?
Christian could expound, but we currently have the buildbot master reporting the status of builds via pulse, so you could be watching that to find out about new nightly builds instead of scraping anything.
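Concretely, pulse is AMQP under the hood, so a consumer is only a few lines. Everything below that names a host, exchange, or routing key is a placeholder, not the real pulse configuration (the actual values and credentials are in the pulse docs, and the mozillapulse library wraps all of this if you'd rather not touch raw AMQP):

import json
import pika

# Placeholder connection details; substitute real pulse.mozilla.org
# credentials and the documented build exchange.
params = pika.ConnectionParameters(host="pulse.mozilla.org")
conn = pika.BlockingConnection(params)
ch = conn.channel()

result = ch.queue_declare(queue="", exclusive=True)
queue = result.method.queue
ch.queue_bind(exchange="org.mozilla.exchange.build",  # hypothetical name
              queue=queue,
              routing_key="build.#")              # hypothetical topic

def on_message(channel, method, properties, body):
    msg = json.loads(body)
    # React to finished builds here instead of polling FTP.
    print(msg.get("payload", msg))

ch.basic_consume(queue=queue, on_message_callback=on_message, auto_ack=True)
ch.start_consuming()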

If you're tied into the build db, I guess that makes data source a non-issue, although I would like to have historical data in there that probably doesn't fit into that model.
We should be able to backfill historical data.  What's the right way to get that, scraping hg?
Probably scraping FTP in some way. (The hg repo doesn't know anything about builds.) I don't recall how far back the buildbot data goes, but I guess we'll find out!
(In reply to comment #11)
> Probably scraping FTP in some way. (the hg repo doesn't know anything about
> builds.) I don't recall how far back the buildbot data goes, but I guess we'll
> find out!

I think Oct 2009 or so... But the buildbot data only knows about the tinderbox-builds URLs, so we'll have to do some translation to the date-based URLs for nightlies.
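The translation is mostly date math: a tinderbox-builds directory is named by the unix timestamp of the build, while nightlies live under a date-based path. A first-pass mapping could look like this, where the nightly path template is an assumption about the current FTP layout (and real nightly dirs pin to the scheduled hour, so some fuzzy matching on the hour will be needed):

from datetime import datetime

def nightly_dir_for(epoch, repo="mozilla-central"):
    """First-pass guess at the date-based nightly dir for a
    tinderbox-builds timestamp, e.g. .../mozilla-central-linux/1285590436/"""
    dt = datetime.utcfromtimestamp(epoch)
    # Assumed layout: nightly/YYYY/MM/YYYY-MM-DD-HH-<repo>/
    return ("/pub/mozilla.org/firefox/nightly/%04d/%02d/"
            "%04d-%02d-%02d-%02d-%s/"
            % (dt.year, dt.month, dt.year, dt.month, dt.day, dt.hour, repo))

print(nightly_dir_for(1285590436))  # -> .../2010/09/2010-09-27-...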
From IRC with legneato:

1) This bug and bug#487036 are both about matching up the buildid with the changeset while doing regression hunting. The lookup can go either way: "I have a changeset, give me the buildid", or "I have a buildid / regression_date_range, please give me the changeset". Then, for those two changesets, link to hg.m.o to show the exact list of what changes landed, comments, owner, etc.

2) Legneato agrees this is a DUP of bug#487036. If there's something we missed that is not covered by bug#487036, please file a new bug linked to bug#487036 with specific details of what is still needed after bug#487036 is fixed.
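For the record, the lookup both bugs want boils down to two facts: a buildid is a timestamp of the form YYYYMMDDHHMMSS, and each date-based nightly directory on FTP ships a small per-build .txt file whose last line is the hg changeset URL. A rough sketch of "buildid -> changeset", treating the directory layout and file naming as assumptions (that fragility is exactly what the service should hide):

import re
import urllib.request

def changeset_for(buildid, repo="mozilla-central"):
    # Slice the YYYYMMDDHHMMSS buildid into the pieces that name the
    # nightly dir. (Assumes the dir hour matches the buildid hour and
    # that the layout below is current.)
    y, m, d, h = buildid[:4], buildid[4:6], buildid[6:8], buildid[8:10]
    base = ("http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/"
            "%s/%s/%s-%s-%s-%s-%s/" % (y, m, y, m, d, h, repo))
    index = urllib.request.urlopen(base).read().decode("utf-8", "replace")
    # Grab the first per-build *.txt file from the HTML index; its last
    # line is the hg rev URL.
    match = re.search(r'href="([^"]*\.txt)"', index)
    txt_url = base + match.group(1).rsplit("/", 1)[-1]
    txt = urllib.request.urlopen(txt_url).read().decode("utf-8", "replace")
    return txt.splitlines()[-1]  # e.g. http://hg.mozilla.org/mozilla-central/rev/...

print(changeset_for("20110113030000"))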
Status: NEW → RESOLVED
Closed: 13 years ago
Resolution: --- → DUPLICATE
Product: mozilla.org → Release Engineering