We're about to have more than one PuppetAgain server (one now in scl1, and soon another in scl3). Some of these repos are static, others are only updated on demand, but some (the local repos) will have custom RPMs dropped into them fairly frequently. We need some method of ensuring that RPMs are easy to add, and that those RPMs end up on all of the puppet servers without manual intervention. My best thought so far is to make one of the puppet masters the "primary" and use rsync to update the others from there. It would be nice, though, to also apply some deduplication. We will very likely see cases where a single RPM appears in a few repos -- particularly in different snapshots of the same repo -- and storing those copies redundantly can waste a lot of space. I'm open to suggestions.
Additional notes to self:
- the sysadmin puppet manifests that control this are *really* annoying to maintain, so those should be refactored
- https://wiki.mozilla.org/ReleaseEngineering/PuppetAgain/Repositories should be kept up to date
- add comments to that effect
OK, I got a basic config of csync2 installed and synchronizing /data, which has all of the repositories, Python packages, and whatnot in it. I'll need to do some experimentation to see how it behaves. I suspect we'll end up with relatively infrequent cron tasks to synchronize files, as well as a script that can be run as root to do the synchronization on demand.
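For reference, a csync2 setup like this is driven by its csync2.cfg; a minimal sketch for two masters syncing /data might look like the following (the group name and hostnames are placeholders, not our actual config):

```
group puppetmasters {
    host puppet-scl1 puppet-scl3;
    key /etc/csync2.key;
    include /data;
    auto younger;
}
```

The "auto younger" line tells csync2 to resolve conflicts in favor of the more recently modified file. The periodic and on-demand sync would then both just invoke `csync2 -x` (check for dirty files and synchronize them).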
Sweet, I have this set up now, and updated the docs too: https://wiki.mozilla.org/ReleaseEngineering/PuppetAgain/Data The change-management emails are going to me for the moment, so nobody freaks out, but once I'm confident they're not noisy, I'll redirect them to the releng shared mailbox.
Status: NEW → RESOLVED
Last Resolved: 7 years ago
Resolution: --- → FIXED
Component: Server Operations: RelEng → RelOps
Product: mozilla.org → Infrastructure & Operations