Closed
Bug 1175133
Opened 9 years ago
Closed 8 years ago
Set up a redundant nuget repository on puppetmasters for Chocolatey packages
Categories
(Infrastructure & Operations :: RelOps: Puppet, task)
Tracking
(Not tracked)
RESOLVED
WONTFIX
People
(Reporter: grenade, Assigned: grenade)
References
Details
(Whiteboard: [windows])
Attachments
(2 files, 4 obsolete files)
Either create some scripts that generate static package descriptions, or find an existing Apache-based NuGet repository.
Comment 1•9 years ago
Corollaries to this are:
* clients must be able to poke around to find a working nuget repository on one of the puppetmasters
* the repository must automatically synchronize between puppetmasters, preferably with the same rsync invocation we use for all of the other package data
Updated•9 years ago
Whiteboard: [windows]
Assignee
Comment 2•9 years ago
From #mozwww discussion:
- Two repos. One for third party packages, one for packages we create.
- NuGet Repo implementation based on releng-puppet: /data/repos/apt/custom/collectd/update.sh (from one of the custom apt repos)
Assignee: relops → rthijssen
Status: NEW → ASSIGNED
Assignee
Comment 3•9 years ago
Script for generating the Packages manifest from a collection of .nupkg files
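The attached script is not reproduced here, but its manifest-building step can be sketched roughly as follows. This is a hypothetical stand-in only: the real generate-manifest.sh extracts the .nuspec from each .nupkg (a zip) and runs an XSL transform, whereas this simplified version derives id and version from the conventional Id.Major.Minor.Patch.nupkg filename.

```shell
#!/bin/sh
# Hypothetical sketch, not the attached script: build a minimal Packages
# manifest by parsing package filenames instead of unpacking .nuspec files.
generate_manifest() {
  repo_dir="$1"
  echo '<?xml version="1.0" encoding="utf-8"?>'
  echo '<packages>'
  for pkg in "$repo_dir"/*.nupkg; do
    [ -e "$pkg" ] || continue
    base=$(basename "$pkg" .nupkg)
    # assume the version is the trailing dotted numeric run; package ids
    # that themselves end in dotted numbers would be mis-split
    version=$(echo "$base" | grep -o '[0-9][0-9]*\.[0-9][0-9]*\.[0-9.]*$')
    echo "  <package id=\"${base%.$version}\" version=\"$version\"/>"
  done
  echo '</packages>'
}
```

The real script additionally handles multiple versions per package and the per-package metadata carried in the .nuspec.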
Assignee
Comment 4•9 years ago
XSL transform for extracting Packages xml from .nuspec files
Updated•9 years ago
Attachment #8628937 -
Attachment is patch: false
Comment 5•9 years ago
Comment on attachment 8628937 [details]
generate-manifest.sh
It looks like this deletes the .nuspec files. Is this a script I could run multiple times in the same directory?
Comment 6•9 years ago
Comment on attachment 8628939 [details]
transform.xsl
This brings back nightmares.. I think I still have that XSLT book around here somewhere..
Attachment #8628939 -
Attachment is patch: false
Assignee
Comment 7•9 years ago
- Yes, the .nuspec files are extracted from the .nupkg (zip) files on every run and deleted after the transform.
- You can run it over and over in the same directory until you remember you'd rather be watching paint dry with the only ill effect being your loss of the will to live.
- I had nightmares writing it: http://stackoverflow.com/questions/31182965/transform-and-combine-multiple-xml-documents
Assignee
Comment 8•9 years ago
So this is what I came up with for NuGet repo url rewrites. The first rule maps NuGet repo convention for package download URLs to package files in our directory structure when the required version is known (.../package/SomePackage/1.1.1/ -> .../packages/SomePackage.1.1.1.nupkg).
The second rule works, but does not match convention. It points requests for an unspecified (latest) version of a package to a symlink that is (re)created whenever a newer version of a package is detected.
I'm not really happy with it, because it will result in downloaded package files that do not contain the package version in the filename. NuGet clients should handle this, but it's a behaviour not matched in other NuGet repositories. If clients were to rely on version information in package filenames (which could easily happen, since that is the convention), then our repo would serve package files with this missing.
We should choose this option if we want to stay away from using a .htaccess file and we're happy to always specify the package version we require in our NuGet clients (probably mostly puppet manifests, so manageable).
The alternative for the second rule would be to generate and use a .htaccess file, rewritten every time an updated package is detected, containing a specific latest-version rewrite rule for each package, e.g.:
RewriteRule ^package/SomePackage/?$ packages/SomePackage.1.1.1.nupkg
RewriteRule ^package/SomeOtherPackage/?$ packages/SomeOtherPackage.1.2.3.nupkg
...
This would give our repo convention parity with standard NuGet repos. Perhaps this isn't important, since we also don't support `push` to the repo, so we have a de facto no-parity implementation anyway.
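The .htaccess alternative can be sketched as a small generator run whenever the package set changes. Everything below is illustrative (the repo layout and the reliance on GNU sort's -V version ordering are my assumptions), not the deployed script:

```shell
#!/bin/sh
# Illustrative sketch of the .htaccess alternative: emit one latest-version
# RewriteRule per package id. Relies on GNU sort -V for version ordering;
# the directory layout is assumed, not taken from the deployed config.
emit_htaccess() {
  repo_dir="$1"
  for pkg in "$repo_dir"/*.nupkg; do
    [ -e "$pkg" ] || continue
    base=$(basename "$pkg" .nupkg)
    version=$(echo "$base" | grep -o '[0-9][0-9]*\.[0-9][0-9]*\.[0-9.]*$')
    printf '%s %s\n' "${base%.$version}" "$version"
  done |
  sort -k1,1 -k2,2V |
  awk '{ latest[$1] = $2 }
       END { for (id in latest)
               printf "RewriteRule ^package/%s/?$ packages/%s.%s.nupkg\n",
                      id, id, latest[id] }'
}
```

Regenerating the whole file on each update keeps it consistent with the package set, at the cost of AllowOverride and a per-request .htaccess read.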
Attachment #8629966 -
Flags: feedback?(dustin)
Comment 9•9 years ago
Comment on attachment 8629966 [details] [diff] [review]
Apache config for NuGet repo package rewrite rules
I think we should figure out exactly how a "normal" nuget server responds to the version-less package first. How does it supply the filename to the HTTP client?
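For reference while investigating that: servers typically supply the filename in the Content-Disposition response header, which a client parses along these lines. The header value used in the sketch is an assumed example, not one captured from a real NuGet server.

```shell
# Pull the filename out of a Content-Disposition header value, the way an
# HTTP client would when the download URL itself carries no version.
# Handles both quoted and unquoted filename= parameters.
filename_from_disposition() {
  echo "$1" | sed -n 's/.*filename="\{0,1\}\([^";]*\).*/\1/p'
}
```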
Attachment #8629966 -
Flags: feedback?(dustin) → feedback+
Assignee
Comment 10•9 years ago
wip: https://gist.github.com/grenade/b4f11aa5080f9f5209f4 - just posting this here, so I have an easy link when I switch workstations
Assignee
Comment 11•9 years ago
So there were some flaws in my earlier implementation.
- redirects, as already mentioned
- also the manifest should contain all versions of all packages.
This script addresses these issues and also introduces html transformations for describing packages in a human friendly format.
Full implementation at: https://github.com/grenade/apache-nuget-repo
Attachment #8628937 -
Attachment is obsolete: true
Attachment #8628939 -
Attachment is obsolete: true
Attachment #8630988 -
Flags: feedback?(dustin)
Assignee
Comment 12•9 years ago
redirects to support NuGet repo url conventions
Attachment #8629966 -
Attachment is obsolete: true
Attachment #8631015 -
Flags: review?(dustin)
Comment 13•9 years ago
Comment on attachment 8631015 [details] [diff] [review]
For PuppetAgain modules/puppetmaster/templates/data.conf.erb
Review of attachment 8631015 [details] [diff] [review]:
-----------------------------------------------------------------
::: modules/puppetmaster/templates/data.conf.erb
@@ +59,5 @@
> </Directory>
> +
> +# NuGet repo redirection rules
> +<Directory /data/repos/nuget/>
> + AllowOverride All
Probably worth a comment here regarding the .htaccess file, since otherwise it's easy to assume that these RedirectMatch directives replace that functionality.
Attachment #8631015 -
Flags: review?(dustin) → review+
Updated•9 years ago
Attachment #8630988 -
Attachment is patch: false
Updated•9 years ago
Attachment #8630988 -
Flags: feedback?(dustin) → feedback+
Assignee
Comment 14•9 years ago
Now with comments!!!
Attachment #8631015 -
Attachment is obsolete: true
Attachment #8631630 -
Flags: review?(dustin)
Comment 15•9 years ago
Comment on attachment 8631630 [details] [diff] [review]
For PuppetAgain modules/puppetmaster/templates/data.conf.erb
Review of attachment 8631630 [details] [diff] [review]:
-----------------------------------------------------------------
Thanks!
Attachment #8631630 -
Flags: review?(dustin) → review+
Assignee
Comment 16•9 years ago
:dustin, i'm hoping to get your opinion on a change i'd like to make to the implementation. i want to remove the need to manually run the manifest generation script when a package is added. to do this, i would add a dependency on incron (file system monitor), lock it down so that only the puppetsync user can run incron jobs and create an incron job owned by puppetsync which looks like this:
# 0 regenerate nuget manifest when package folder content changes
/data/repos/nuget/nupkg IN_CREATE,IN_DELETE /data/repos/nuget/generate-manifest.sh
in my testing, i had to make some modifications to generate-manifest.sh: add locking, so that only one instance of the script runs at a time, and add a cd to change the working dir to the script dir (cd $( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ))
do you think installing incron on the puppetmasters (via puppet, of course) is a show-stopper for this change? i.e. would we have concerns about running a file-system monitor on these servers?
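A hedged sketch of the locking and working-directory modifications described above, using flock(1) from util-linux; the lock-file name and the wrapper shape are illustrative, not the actual patch:

```shell
#!/bin/sh
# Sketch only: serialize runs with an exclusive non-blocking flock so
# overlapping incron events can't start two instances, and pin the working
# directory first. The lock-file location is an assumption.
run_locked() {
  workdir="$1"; shift
  (
    cd "$workdir" || exit 1
    # if another instance already holds the lock, skip this run quietly
    flock -n 9 || exit 0
    "$@"
  ) 9> "$workdir/.generate-manifest.lock"
}
```

Usage would be something like `run_locked /data/repos/nuget ./generate-manifest.sh`, so an IN_CREATE and an IN_DELETE firing together trigger only one regeneration.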
Assignee
Comment 17•9 years ago
Shelving the incron idea after this conversation on #releng (going to think about the repo mgmt thing from the perspective of all-repos instead):
13:12 <dustin> grenade: I don't have an issue with incron specifically
13:12 <dustin> but all of our RPM and DEB repos are ./update.sh's manually, so I'm disinclined to get fancier for nuget
13:13 <dustin> it provides a nice "now it'
13:13 <dustin> "now it's deployed" moment in the process
13:13 <dustin> since the package is basically invisible until that point
13:13 <grenade> ok, that's fine. didn't want to spend too much time on it if its too wild herring
13:14 <dustin> I'm open to being convinced that the benefits outweigh the costs in complexity
13:15 <dustin> I just don't see it yet
13:20 <grenade> sure. i like it because the now it moment is when you put package x on the box and there's no, ssh in and run some script. opens up the possibility for say, relengapi to handle the upload moment and the repo just auto-updates when the package arrives. i'd probably be in favour of rpm and deb repos being done the same way. reducing two-step complexity to one
13:20 <grenade> for all of 'em. of course i haven't even convinced myself that an incron script is the right implementation. kinda stinks of voodoo magic happening invisibly. i'd prefer a proper repo mgmt app/interface.
13:22 <grenade> but i prefer incron to ssh/script manual intervention voodoo
13:23 <grenade> not so in love with the idea that i'll be sad if nobody else likes it though
13:51 <dustin> hm
13:51 <dustin> if this were implemented for everything, I'd be more comfortable with it
13:51 <dustin> although -- what happens if there's an error?
13:52 <dustin> worst would be that the repo is left in a broken state
13:52 <dustin> at least with ./update.sh you know right away
13:52 <dustin> btw this is probably related to https://bugzilla.mozilla.org/show_bug.cgi?id=1050915
Assignee
Comment 18•9 years ago
https://bugzilla.mozilla.org/show_bug.cgi?id=1199290#c4 -- think about moving to a public moz repo...
Assignee
Updated•8 years ago
Status: ASSIGNED → RESOLVED
Closed: 8 years ago
Resolution: --- → WONTFIX