Closed Bug 630534 Opened 15 years ago Closed 13 years ago

Host tree status outside of tinderbox

Categories

(Tree Management Graveyard :: TBPL, defect)

Priority: Not set
Severity: normal

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: coop, Unassigned)

References

Details

As part of the work to stop using the tinderbox server for any part of Firefox development, we should move the hosting of the tree status (e.g. "The tree is APPROVAL REQUIRED") outside of tinderbox. In email, mstange indicated that it wouldn't be terribly hard to host this on the TBPL server, but Metrics had also offered to host this data. I vote for hosting it in TBPL, but whatever is easiest.
Blocks: 630538
I don't think I object to where it is hosted, but please can we make it so that it is *far* easier to work with. Adjusting the tree status should be a matter of changing an option element, and entering some text (possibly just selecting recent/pre-determined text), with multiple trees accessible from the same page (think infrastructure downtime). We shouldn't be editing HTML in multiple places.
If we're going to build a tool to control tree status, that's easy enough, and for sanity's sake should just be standalone (both TBPL and hg hooks are known dependencies).

Obvious requirements:

* per-tree URL that can be used for both TBPL and hg hooks in a standard way.

GET http://treestatus.mozilla.org/tree/mozilla-central
{open: 0, closedby: mconnor@mozilla.com, reason: "I blame armen"}

* hook this into LDAP so we get sane logging of who made changes (and we have a shorter infrasec cycle if it's behind LDAP)

* record all closures/reopenings so we can get real data on tree closures.
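
A minimal sketch of how a consumer (an hg hook or a TBPL backend script) could use such a per-tree URL, assuming Python 2 / urllib2 and the hostname and keys proposed above (nothing is deployed at that URL yet; the field names are only this comment's suggestion):

import json
import urllib2

def fetch_tree_status(tree):
    # Hostname and JSON keys are the proposal above, not an existing service.
    url = 'http://treestatus.mozilla.org/tree/%s' % tree
    return json.load(urllib2.urlopen(url, timeout=10))

status = fetch_tree_status('mozilla-central')
if not status['open']:
    print 'Tree closed by %s: %s' % (status['closedby'], status['reason'])
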
(In reply to comment #2)
> If we're going to build a tool to control tree status, that's easy enough,
> and for sanity's sake should just be standalone (both TBPL and hg hooks are
> known dependencies).
>
> Obvious requirements:
>
> * per-tree URL that can be used for both TBPL and hg hooks in a standard way.
>
> GET http://treestatus.mozilla.org/tree/mozilla-central
> {open: 0, closedby: mconnor@mozilla.com, reason: "I blame armen"}

I would rather go with a string, e.g. status: "open", than open: 0. That way status can be "open", "restricted", "approval", "closed" or whatever. JSON is of course preferred ;-)

> * hook this into LDAP so we get sane logging of who made changes (and we
> have a shorter infrasec cycle if it's behind LDAP)
>
> * record all closures/reopenings so we can get real data on tree closures.

Good idea!
(In reply to comment #3)
> (In reply to comment #2)
> > * hook this into LDAP so we get sane logging of who made changes (and we
> > have a shorter infrasec cycle if it's behind LDAP)

My only question on this is: would all LDAP users have access to close the tree? If not, what level(s) would have access, or do we want/plan to set up a unique LDAP perm bit to toggle for this?

Currently, of course, we simply pass around the tinderbox password to trusted users, independent of LDAP access [though of course, anyone privileged to CLOSE m-c also has level 3 access, but the reverse doesn't hold true -- for now].

When thinking about that, keep in mind the desire to let people from other projects use this for their trees (comm-central), and the ability for someone like myself to close m-c if it's a holiday and I take it upon myself to sheriff for a sudden influx of "can I land?" :-) type questions.
We could just piggyback on scm_level_2, or create a new LDAP group. If we're logging actions, I'm not super worried about it being misused.
(In reply to comment #3)
> > GET http://treestatus.mozilla.org/tree/mozilla-central
> > {open: 0, closedby: mconnor@mozilla.com, reason: "I blame armen"}
>
> I would rather go with a string, e.g. status: "open", than open: 0. That way
> status can be "open", "restricted", "approval", "closed" or whatever.
> JSON is of course preferred ;-)

Meh, strings. If we're going with JSON, we don't need to overload a single key.

Okay, more requirements. States we need to reflect in a simple way:

Open
Open to approved patches
Closed for maintenance
Closed for bustage

Added metadata we should track/return:

Closed-Maintenance: who, why, ETA
Closed-Bustage: who, why
Open: tree rules URL
Open to approved patches: approval process URL, tree rules URL
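
To make that concrete, per-state payloads might look something like this (field names, addresses and URLs are illustrative only, not an agreed schema):

{"tree": "mozilla-central", "status": "open",
 "rules_url": "http://example.com/tree-rules"}

{"tree": "mozilla-central", "status": "approval-required",
 "approval_url": "http://example.com/approval-process",
 "rules_url": "http://example.com/tree-rules"}

{"tree": "mozilla-central", "status": "closed-maintenance",
 "who": "someone@example.com", "reason": "hg server upgrade",
 "eta": "2011-08-01T18:00Z"}

{"tree": "mozilla-central", "status": "closed-bustage",
 "who": "someone@example.com", "reason": "mochitest-1 permaorange"}
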
Today the tree status is saying this:

> OPEN. Please ignore mochitest-browser-chrome constantly reporting T-FAIL;
> if it's green it's ok. See Bug 643607.

Any idea how to reflect that in the JSON? Just add a free-form message metadata field that can be set on all states?
I wrote a simple web.py app that may be useful: https://github.com/catlee/treestatus
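
For anyone who doesn't want to read the repo, a heavily simplified sketch of what a web.py endpoint along these lines can look like (illustrative only, not the actual treestatus code; the in-memory dict stands in for real storage):

import json
import web

urls = ('/(.*)', 'TreeStatus')

# Stand-in data store; the real app would keep this in a database.
TREES = {'mozilla-central': {'status': 'open', 'reason': None}}

class TreeStatus:
    def GET(self, tree):
        if tree not in TREES:
            raise web.notfound()
        web.header('Content-Type', 'application/json')
        return json.dumps(dict(tree=tree, **TREES[tree]))

if __name__ == '__main__':
    web.application(urls, globals()).run()
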
(In reply to comment #7)
> Today the tree status is saying this:
>
> > OPEN. Please ignore mochitest-browser-chrome constantly reporting T-FAIL;
> > if it's green it's ok. See Bug 643607.
>
> Any idea how to reflect that in the JSON? Just add a free-form message
> metadata field that can be set on all states?

Yeah, we'll need some sort of "message" field as well.
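
So the situation from comment #7 could serialize as something like (field names still illustrative):

{"tree": "mozilla-central", "status": "open",
 "message": "Please ignore mochitest-browser-chrome constantly reporting T-FAIL; if it's green it's ok. See bug 643607."}
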
Can we accelerate this? We need something like this to start gathering better data around closures, and to make our tree closure hooks a lot less fragile. Doesn't seem like it should be a huge project, at this point.
I have it up and running here:

http://cruncher.build.mozilla.org:5000

I'm not sure what VPN access is required to hit that, but I don't have another place to host it ATM.
(In reply to comment #11)
> I have it up and running here:
>
> http://cruncher.build.mozilla.org:5000
>
> I'm not sure what VPN access is required to hit that, but I don't have
> another place to host it ATM.

username/password is test/test.
I think the VPN for this is the MPT VPN, but it's not resolving for me across the VPN, so we'll need the IP address. Do you have it? If you want, we could look into hosting this publicly on brasstacks.mozilla.com. (CC'ing mcote & jgriffin, who are our brasstacks.m.c admins.)
We could totally host it. We have a number of web.py apps on brasstacks.
(In reply to comment #14)
> We could totally host it. We have a number of web.py apps on brasstacks.

That would be great! It's pretty simple to deploy; I just added some instructions in the README.
Okay, due to some Python weirdness (long story), we couldn't host it on brasstacks without some major surgery. However, we are in the process of setting up a second, more modern server for our web tools, so I have deployed treestatus on it. We're just waiting on IT to open public access to the web server. When that's done, it will be accessible as http://flyingtanks.mozilla.com/treestatus/. At the moment you can get to it if you're on the MPT network at http://10.8.73.23/treestatus/.
(In reply to Mark Côté ( :mcote ) from comment #16)
> Okay, due to some Python weirdness (long story), we couldn't host it on
> brasstacks without some major surgery. However, we are in the process of
> setting up a second, more modern server for our web tools, so I have
> deployed treestatus on it. We're just waiting on IT to open public access
> to the web server. When that's done, it will be accessible as
> http://flyingtanks.mozilla.com/treestatus/. At the moment you can get to it
> if you're on the MPT network at http://10.8.73.23/treestatus/.

Any update on the public side of things?
Was just about to comment. :) So IT wants to do a security review of treestatus... see bug 674711. Since this is your app, I figure you're the one who should file this request, in case they have questions for you. Once it's approved they'll open up the ports to flyingtanks.
Although I'm on the MPT network, I get unauthorized if I try to do anything (or get a message about flyingtanks). Is that expected at this stage? (I'm curious and want to offer comments if possible.)
I just realized that the app expects to be at the root level, so I set up a virtual host called treestatus and "moved" it there. There's no DNS entry for this yet, but if you add it to your hosts file you should be able to load it as http://treestatus/. However, I am also getting the 401 errors, and I dunno what they mean.
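
For anyone trying that, and assuming the flyingtanks address from comment #16 still applies, the hosts entry would be something like:

10.8.73.23    treestatus
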
Note that since this will be what the hg hook will be querying, it's a tier 1 line-of-business SPOF (unless it fails open instead of failing closed, in which case it's pretty broken), so we might want to go to the trouble of seriously hosting it, silly as that seems.
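
To make the trade-off concrete, here is a rough sketch of where that decision would live in a pretxnchangegroup-style hook (Python 2; the URL and policy flag are hypothetical, and a true return value from the hook rejects the push):

import json
import urllib2

FAIL_OPEN = False  # policy if treestatus is unreachable: True = allow pushes, False = reject

def treestatus_hook(ui, repo, **kwargs):
    try:
        data = json.load(urllib2.urlopen(
            'http://treestatus.mozilla.org/mozilla-central?format=json', timeout=10))
        closed = data.get('status') != 'open'
    except Exception:
        closed = not FAIL_OPEN
    if closed:
        ui.warn('push rejected: tree is closed or treestatus is unreachable\n')
    return closed  # truthy return makes a pretxnchangegroup hook fail
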
I just want a place to host this that's not my laptop so other folks can poke at it and see if it does what we want. After that we can look at proper hosting.
I have this up at http://treestatus.atlee.ca/ for now. Login is test/test.
(In reply to Chris AtLee [:catlee] from comment #22)
> I just want a place to host this that's not my laptop so other folks can
> poke at it and see if it does what we want. After that we can look at proper
> hosting.

I think it is a good starter, nice and simple, but it obviously needs expanding to include the requirements mentioned in comments 1 to 6.
JSON status is available:

curl -H 'Accept: application/json' http://treestatus.atlee.ca/mozilla-central
{"status": "closed", "repo": null, "reason": "they're broke!", "tree": "mozilla-central"}

or

curl http://treestatus.atlee.ca/mozilla-central?format=json
{"status": "closed", "repo": null, "reason": "they're broke!", "tree": "mozilla-central"}

History is available via http://treestatus.atlee.ca/mozilla-central/logs
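
The Accept-header form works the same way from Python, e.g. (minimal sketch):

import json
import urllib2

req = urllib2.Request('http://treestatus.atlee.ca/mozilla-central',
                      headers={'Accept': 'application/json'})
data = json.load(urllib2.urlopen(req))
print data['tree'], data['status'], data['reason']
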
Blocks: 683418
Reviving this bug; it's been quiet for a month.

(In reply to Mark Côté ( :mcote ) from comment #18)
> Was just about to comment. :) So IT wants to do a security review of
> treestatus... see bug 674711. Since this is your app, I figure you're the
> one who should file this request, in case they have questions for you. Once
> it's approved they'll open up the ports to flyingtanks.

mcote: since bug#674711 is closed, what's the next step before infrasec review?

mconnor, standard8, catlee: anything left to do from your point of view?
sec review is bug 678516
Per IRC with catlee, this does not block bug#630538. Moving to "blocking bug#625979" so we don't lose track of this.
Blocks: 625979
No longer blocks: 630538
I hadn't realised the app had been updated, but it certainly looks much better now. A small nit is that the reason box should be wider - not many of the reasons we use will fit into a box that size.

I also think we should potentially host the sheriff details here (Firefox is via Google Calendar, which could be cached; Thunderbird & others currently have a static value) - although we could probably do that in a separate bug (there is a related one around somewhere).
To be clear, I'd be happy for this to go out now with the slightly wider box. The rest can be done later.
Just to clarify: is one of our servers (flyingtanks/brasstacks) still the intended destination for treestatus once the secreview is finished?
(In reply to Mark Côté ( :mcote ) from comment #31)
> Just to clarify: is one of our servers (flyingtanks/brasstacks) still the
> intended destination for treestatus once the secreview is finished?

I think it would make more sense for the treestatus app to be hosted as part of TBPL (and on the tbpl.mozilla.org domain) rather than on our server. However, if you want it on our server for some reason, we're happy to oblige.
Depends on: 701397
Paul was working on something for this the other day, I think.
I didn't know there was progress being made, so props to all involved. My "tool" was just a form to make the current process slightly less error-prone. It's just a quick hack.

http://playground.zpao.com/mozilla/treestatusbuilder/

From a quick look at what's been done by others, obviously logging & getting rid of the current process is key. I look forward to the final product.
My current implementation is hosted at http://treestatus.atlee.ca and source is at https://github.com/catlee/treestatus if you want to take a look!
Status: NEW → RESOLVED
Closed: 13 years ago
Resolution: --- → FIXED
Product: Webtools → Tree Management
Product: Tree Management → Tree Management Graveyard