Power off and mothball xulrunner 1.9.0 machines

Status: RESOLVED FIXED

Product: mozilla.org Graveyard
Component: Server Operations

People

(Reporter: joduinn, Assigned: phong)

Attachments (1 attachment)

On 27jan2010, mfinkle posted to m.dev.planning about de-supporting the xulrunner 1.9.0 builds. 

This de-support means that continuous builds will stop, and RelEng will no longer provide builds as part of each Firefox 3.0 release. The source is obviously still available, so people can still land approved fixes, and do their own builds. 

This bug is to track all the mechanical steps involved with the 3 xulrunner machines: bm-xserve08, xr-linux-tbox, xr-win32-tbox. 
- remove machines from nagios
- power off
- backup just in case we *might* need to recreate
- delete VMs/recycle hardware
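The decommission checklist above can be sketched as a small shell script. The hostnames are the ones named in this bug; the exact nagios and VMware commands for the 2010-era infrastructure are not recorded here, so this sketch only prints the plan per host rather than pretending to run real commands.

```shell
#!/bin/sh
# Hedged sketch of the decommission checklist from comment 0.
# Hostnames come from this bug; the steps only print the plan, since
# the real nagios/VMware invocations for this infra are assumptions.
decommission_plan() {
    for host in "$@"; do
        echo "$host: remove from nagios"
        echo "$host: power off"
        echo "$host: back up VM image in case we need to recreate it"
        echo "$host: delete VM / recycle hardware"
    done
}

decommission_plan xr-linux-tbox xr-win32-tbox
```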
I have not had any feedback from the newsgroup posting or my blog post, so I think we are clear to proceed here.
(In reply to comment #0)
> This bug is to track all the mechanical steps involved with the 3 xulrunner
> machines: bm-xserve08, xr-linux-tbox, xr-win32-tbox. 
> - remove machines from nagios
> - power off
> - backup just in case we *might* need to recreate
> - delete VMs/recycle hardware

The full set of steps applies to xr-{linux,win32}-tbox but not to bm-xserve08. For that Mac we just need to adjust /builds/tinderbox/multi-config.pl and restart tinderbox; otherwise we'll lose the nightly and debug builds for Firefox 3.0.
Currently http://tinderbox.mozilla.org/showbuilds.cgi?tree=XULRunner only shows the xulrunner1.9.0 builds. 

As far as I can see, the xulrunner builds generated for 1.9.1, 1.9.2, mozilla-central and lorentz are not being displayed anywhere on the tinderbox server. We should display the xulrunner builds from all these other branches someplace. Putting them all on the same xulrunner waterfall might be easiest and might make the most sense, but I'd be curious to hear what others think is best.
Actually, they're already reporting to the XULRunner waterfall, but we only build nightlies for hg branches so they fall off the waterfall 12 hours after building. See
 http://tinderbox.mozilla.org/showbuilds.cgi?tree=XULRunner&hours=30
Assignee: nobody → joduinn
Created attachment 427882 [details] [diff] [review]
disable xulrunner jobs on bm-xserve08
Attachment #427882 - Flags: review?(nrthomas)
(In reply to comment #3)
> Currently http://tinderbox.mozilla.org/showbuilds.cgi?tree=XULRunner only shows
> the xulrunner1.9.0 builds. 
> 
> As far as I can see, the xulrunner builds generated for 1.9.1, 1.9.2,
> mozilla-central and lorentz are not being displayed anywhere on tinderbox
> server. We should display the xulrunner builds from all these other branches
> someplace. Putting them all on the same one xulrunner waterfall might be
> easiest, and might make the most sense, but I'd be curious to hear what others
> think is best?

(In reply to comment #4)
> Actually, they're already reporting to the XULRunner waterfall, but we only
> build nightlies for hg branches so they fall off the waterfall 12 hours after
> building. See
>  http://tinderbox.mozilla.org/showbuilds.cgi?tree=XULRunner&hours=30

nthomas, thanks for the info. This works for me, so nothing to do for waterfall.
Status: NEW → ASSIGNED
(In reply to comment #0)
> This bug is to track all the mechanical steps involved with the 3 xulrunner
> machines: bm-xserve08, xr-linux-tbox, xr-win32-tbox. 
> - remove machines from nagios
xr-linux-tbox, xr-win32-tbox disabled in nagios.

> - power off
xr-linux-tbox, xr-win32-tbox: VMs powered off.
bm-xserve08: changed /builds/tinderbox/multi-config.pl - see attachment for diff - and then stopped/started the tinderbox client.

> - backup just in case we *might* need to recreate
> - delete VMs/recycle hardware
Phong: can you backup these two VMs and then delete them please?

Also, can you remove the XULRunner 1.9.0 builds from nagios? I couldn't find where that was set.
Assignee: joduinn → phong
Component: Release Engineering → Server Operations
QA Contact: release → mrz
Comment on attachment 427882 [details] [diff] [review]
disable xulrunner jobs on bm-xserve08

All good. Looks like tinderbox was restarted via ssh though (orange builds from failure to launch the build for AliveTest). Restarted Tinderbox using a VNC session to fix that up.
Attachment #427882 - Flags: review?(nrthomas) → review+
Comment 9 (Assignee)
removed bm-xserve08, xr-linux-tbox & xr-win32-tbox from dns, dhcp, nagios, and inventory.
Status: ASSIGNED → RESOLVED
Resolution: --- → FIXED
Blocks: 548331
(In reply to comment #9)
> removed bm-xserve08, xr-linux-tbox & xr-win32-tbox from dns, dhcp, nagios, and
> inventory.

Please undo the dns, dhcp, nagios, and inventory changes for bm-xserve08. That box is still in use for other builds, per comment #8.
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
Comment 12 (Assignee)
added it back.
Status: REOPENED → RESOLVED
Resolution: --- → FIXED
Sorry, forgot to ask for removal of "surf:Nightly Builds - XULRunner mozilla1.9.0 branch" from nagios too. Could you also remove qm-pvista-try10 since we're not bringing that back from the dead (bug 532967).
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
(In reply to comment #13)
> Sorry, forgot to ask for removal of "surf:Nightly Builds - XULRunner
> mozilla1.9.0 branch" from nagios too. Could you also remove qm-pvista-try10
> since we're not bringing that back from the dead (bug 532967).

ping?
Comment 15 (Assignee)
commented out the check in nagios.

# remove check per bug 544678
# define service{
#       use                             nightly-build-service
#       service_description             Nightly Builds - XULRunner mozilla1.9.0 branch
#       check_command                   check_nightly_builds!XULRunner_mozilla1.9.0
#       }
Status: REOPENED → RESOLVED
Resolution: --- → FIXED
Product: mozilla.org → mozilla.org Graveyard