Closed Bug 773331 Opened 10 years ago Closed 10 years ago

Setup 16 lion machines in SCL3

Categories

(Infrastructure & Operations Graveyard :: CIDuty, task, P3)

x86_64
macOS

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: coop, Unassigned)

References

Details

(Whiteboard: [buildduty][buildslave][capacity])

Attachments

(3 files)

There's a list of 16 lion machines in https://bugzilla.mozilla.org/show_bug.cgi?id=759466#c9 that need to be configured in puppet and slavealloc. 6 are destined for preproduction, 8 for the build pool, and 2 for try. Let's break them down as follows:

Build:
bld-lion-r5-081.build.releng.scl3
bld-lion-r5-082.build.releng.scl3
bld-lion-r5-083.build.releng.scl3
bld-lion-r5-084.build.releng.scl3
bld-lion-r5-085.build.releng.scl3
bld-lion-r5-086.build.releng.scl3
bld-lion-r5-087.build.releng.scl3
bld-lion-r5-088.build.releng.scl3

Preproduction:
bld-lion-r5-089.build.releng.scl3
bld-lion-r5-090.build.releng.scl3
bld-lion-r5-091.build.releng.scl3
bld-lion-r5-092.build.releng.scl3
bld-lion-r5-093.build.releng.scl3
bld-lion-r5-094.build.releng.scl3

Try:
bld-lion-r5-095.try.releng.scl3
bld-lion-r5-096.try.releng.scl3
Depends on: 759466
When you're making buildbot changes, r5-mini-001, r5-mini-002, r5-mini-003, r5-mini-004, r5-mini-005, r5-mini-006 should come out of service (they're being replaced by the above-mentioned machines).
Attached patch: puppet patch (Splinter Review)
Attachment #641544 - Flags: review?(coop)
Attachment #641544 - Flags: review?(coop) → review+
I've added these machines to slavealloc now too.
Tried changing the password on 089 and 090 (got this working on 045 earlier; bug 760093 comment 6).

This left me unable to log in via screen sharing.
Perhaps we should wait until the password is fixed on the image, and then reimage.
Comment on attachment 641612 [details] [diff] [review]
remove r5-mini's per arr

Review of attachment 641612 [details] [diff] [review]:
-----------------------------------------------------------------

::: mozilla/staging_config.py
@@ +1,3 @@
>  import production_config as pc
>  
> +MAC_LION_MINIS = []

r+ if you add the new dev lion minis in their place (already in slavealloc):

MAC_LION_MINIS = ['bld-lion-r5-%03d' % x for x in range(89,95)]
Attachment #641612 - Flags: review?(coop) → review+
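For reference, the comprehension suggested in the review above expands to the six dev/preproduction hostnames (a quick standalone sketch of the expression, not the actual staging_config.py):

```python
# Sketch of the expression from the review comment above.
# range(89, 95) is half-open, so it covers 089 through 094 (the six
# preproduction minis) and excludes 095/096, which are the try slaves.
MAC_LION_MINIS = ['bld-lion-r5-%03d' % x for x in range(89, 95)]
print(MAC_LION_MINIS)
# → ['bld-lion-r5-089', 'bld-lion-r5-090', 'bld-lion-r5-091',
#    'bld-lion-r5-092', 'bld-lion-r5-093', 'bld-lion-r5-094']
```

The `%03d` format zero-pads each number to three digits, matching the hostname scheme used throughout this bug.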
cc-ing Kim because she's currently listed in slavealloc as using r5-mini-006 for puppet testing. I don't think that machine is currently plugged in to regular puppet, so as long as IT doesn't take the machine away immediately, she may not care.
Right, I'm just using it for staging testing on the old puppet.
Ok.

So using Bear's script I was able to fix the password situation, then added the appropriate ssh keys, and puppetized.

Enabling in slavealloc & rebooting brought them back online.

However, scl3-production-puppet is overwriting staging and try ssh keys with production keys.
I don't know how to fix this; until we either decide these are all production boxes or fix this, I'm leaving 89 through 96 disabled in slavealloc.
Attachment #642108 - Flags: review?(coop) → review+
Blocks: 772458
(In reply to Aki Sasaki [:aki] from comment #11)
> However, scl3-production-puppet is overwriting staging and try ssh keys with
> production keys.
> I don't know how to fix this; until we either decide these are all
> production boxes or fix this, I'm leaving 89 through 96 disabled in
> slavealloc.

I checked the keys on the two try slaves (95,96), and they looked right, even after a reboot, so I've enabled these two slaves. 

Looking at the staging slaves now.
I think we're good here now.
Status: NEW → RESOLVED
Closed: 10 years ago
Resolution: --- → FIXED
Duplicate of this bug: 773910
Can we reclaim r5-mini* now?
Blocks: 776888
(In reply to Amy Rich [:arich] [:arr] from comment #17)
> Can we reclaim r5-mini* now?

Almost. Filed bug 776888.
Product: mozilla.org → Release Engineering
Component: Platform Support → Buildduty
Product: Release Engineering → Infrastructure & Operations
Product: Infrastructure & Operations → Infrastructure & Operations Graveyard