Closed Bug 1317800 Opened 8 years ago Closed 7 years ago

enable chain of trust verification in balrogworker

Categories: Release Engineering :: Release Automation: Other (defect, normal)
RESOLVED FIXED

People

(Reporter: mozilla, Assigned: mtabara)

Attachments: 6 files (4 obsolete)

We'll have to make these changes:

a. scopes

For signing, we have three levels of permission, guarded by scopes:

* project:releng:signing:cert:release-signing [1], which we only allow on release-capable branches
* project:releng:signing:cert:nightly-signing [2], which we only allow on nightly-capable branches
* project:releng:signing:cert:dep-signing, which can be used anywhere

Scriptworker uses the above-linked data structures to verify that a privileged scope is only used on an appropriate branch.  Signingscript determines which level of access to grant based on those scopes [3].

We need to follow this model for the other *scripts.  For balrog, I think this would be what accounts/balrog servers/products we're allowed to push to.  That would allow chain of trust verification to make sure we're not pushing to a privileged account/product from a non-privileged branch.

If we can't separate creds at first, let's file a followup bug to do so, and verify our upload bucket location matches release, nightly, or staging based on these scopes.
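As a sketch of that model, balrogscript could map scopes to server types the same way signingscript's task.py maps signing scopes to cert levels. The scope names and server types below are invented for illustration; the real ones would live in balrogscript's config:

```python
# Hypothetical sketch of scope-based access control for balrogscript,
# mirroring signingscript's model. Scope names are made up.
BALROG_SERVER_SCOPES = {
    "project:releng:balrog:nightly": "nightly",
    "project:releng:balrog:release": "release",
    "project:releng:balrog:dep": "dep",
}


def get_balrog_server_type(task):
    """Return the single server type the task's scopes allow.

    Raise if the task requests zero or multiple balrog scopes, so a task
    can never mix privileged and unprivileged targets.
    """
    matches = [BALROG_SERVER_SCOPES[s] for s in task["scopes"]
               if s in BALROG_SERVER_SCOPES]
    if len(matches) != 1:
        raise ValueError("expected exactly one balrog scope, got %r" % matches)
    return matches[0]
```

Chain of trust verification would then cross-check that server type against the branch the task came from, as it already does for signing.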

[1] https://github.com/mozilla-releng/scriptworker/blob/121c474f5b21084a4a3742f21c3f30c018e5c766/scriptworker/constants.py#L219
[2] https://github.com/mozilla-releng/scriptworker/blob/121c474f5b21084a4a3742f21c3f30c018e5c766/scriptworker/constants.py#L232
[3] https://github.com/mozilla-releng/signingscript/blob/master/signingscript/task.py#L19

b. downloads / upstreamArtifacts

For signing, we used to have task.payload.unsignedArtifacts, which was a list of URLs.  Now we have task.payload.upstreamArtifacts [4], which is a list of dictionaries that look like

    "upstreamArtifacts": [{
        "paths": [
            "public/build/target.tar.bz2",
            "public/build/target.checksums"
        ],
        "formats": ["gpg"],
        "taskId": "GFPKeLbAQN2fytGOXgatIg",
        "taskType": "build"
    }, {
        ...
    }]

The taskId is the taskId of the task we're downloading from.  The paths are the artifact paths we're downloading.  I don't know if you need to use the "formats" key or need to embed any additional information; we can play with this schema.  "taskType" is there for chain of trust verification.  Currently we only support "build", "l10n", "decision", "docker-image", but we can add more.

Scriptworker will pre-download these artifacts into $artifact_dir/public/cot/$task_id/$path , and verify their SHAs before calling the script.  balrog no longer needs to download these artifacts; it can and should use the pre-downloaded artifacts on disk.
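A script consuming those pre-downloaded artifacts could resolve their on-disk locations like this (sketch only; the function name is invented, and the exact layout is whatever the deployed scriptworker version uses):

```python
import os


def resolved_artifact_paths(artifact_dir, upstream_artifacts):
    """Map each upstreamArtifacts entry to where scriptworker has already
    downloaded (and sha-verified) it: $artifact_dir/public/cot/$task_id/$path.
    """
    resolved = []
    for artifact in upstream_artifacts:
        for path in artifact["paths"]:
            resolved.append(os.path.join(
                artifact_dir, "public", "cot", artifact["taskId"], path))
    return resolved
```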

I don't know how many artifacts we download, and how large this is going to get.  If we don't want to upload all of these upstreamArtifacts at the end of the balrog task, we can move them to $work_dir or otherwise remove from $artifact_dir before the end of the task, or change where scriptworker downloads them to.

[4] https://queue.taskcluster.net/v1/task/M81unWcDQje2XEwhtmDXrw

c. scriptworker.cot.verify will need to support balrog type workers

We'll also have to support any other new task types that we depend on.

d. upstream tasks will need to point at the right deps and have chain of trust generation enabled.

To enable chain of trust generation in a non-scriptworker task, set task.payload.features.chainOfTrust to true.
When there are additional tasks we need to mark as chain of trust dependencies in non-scriptworker tasks, we add them to task.extra.chainOfTrust.inputs, which looks like

    "inputs": {
        "docker-image": "taskId",
        ...
    }
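Combining the two settings, a non-scriptworker build task might carry a fragment like this (taskId value invented; shown as a Python dict for readability, with only the chain-of-trust-relevant keys):

```python
# Illustrative fragment of a docker-worker task definition.
task_fragment = {
    "payload": {
        "features": {
            "chainOfTrust": True,  # ask the worker to generate a CoT artifact
        },
    },
    "extra": {
        "chainOfTrust": {
            "inputs": {
                # taskType -> taskId of the extra upstream task
                "docker-image": "aB3dEfGhQx2yZw9TaskIdAA",
            },
        },
    },
}
```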


For upstream scriptworker tasks, we have sign_chain_of_trust [5] and upstreamArtifacts.  We can also follow the same task.extra.chainOfTrust.inputs model if that's easiest.

This may prevent us from fully enabling chain of trust verification on balrog if we depend directly on other non-signing scriptworker tasks that don't yet have chain of trust enabled, but we have some prefs [6] we can use until it's all enabled end-to-end.

[5] https://github.com/mozilla-releng/scriptworker/blob/121c474f5b21084a4a3742f21c3f30c018e5c766/scriptworker/constants.py#L58
[6] https://github.com/mozilla-releng/scriptworker/blob/121c474f5b21084a4a3742f21c3f30c018e5c766/scriptworker/constants.py#L57-L60

e. puppet

With bug 1316702, we now have a shared scriptworker puppet module.  Let's use that.

* There are updated dependencies, all pushed to the python3.5 pypi location.
* We now use a scriptworker.yaml which is much larger than our previous config.json.  This is populated in the scriptworker module, using variables you pass [7].  I still have the supervisord settings in the signing scriptworker area, because the watch file list can be different per instance type.
* gpg keys - we'll need to create new gpg keys per scriptworker instance, and make sure they're signed by an appropriate key.  The trusted keys are in scriptworker/trusted and the worker keys go into scriptworker/valid in the cot-gpg-keys repo [8].

[7] https://hg.mozilla.org/build/puppet/file/tip/modules/signing_scriptworker/manifests/init.pp#l54
[8] https://github.com/mozilla-releng/cot-gpg-keys
Depends on: 1318033, 1316071
Summary: enable chain of trust verification in balrog → enable chain of trust verification in balrogworker
Assignee: nobody → aki
https://github.com/escapewindow/balrogscript/commits/module makes balrogscript a python package ; I'll be basing my balrogscript cot work on this branch.
Attached patch [wip] [broken] upstream.diff (obsolete) — Splinter Review
This is the start of a patch that adds scopes and upstreamArtifacts to balrog and beetmover tasks.  We can start consolidating beetmover l10n tasks into chunks like the repacks, if we decide fewer tasks is better.  We do seem to be going in the moar-granular-tasks-is-better direction, but with chain of trust verification chunks may be the right size.

* This is currently broken; the `mach taskgraph` side needs fixing.
* We need to update the beetmover and balrog scripts to accept the new task definition (jsonschema update), use scriptworker>=1.0.0b2, and copy the upstreamArtifacts from work_dir/cot/upstream-task-id.  When we're only downloading a manifest, we should also leave a chain of trust artifact for the task-to-beetmove task, so we should verify the shas when we download those artifacts in beetmoverscript if we're not already doing so in scriptworker.
* Once we're using upstreamArtifacts and scopes, we can tear out the obsoleted task definition items, but that's non-urgent cleanup.
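The copy step in the second bullet could look roughly like this sketch (the function name is invented; it assumes the work_dir/cot/<taskId>/<path> layout mentioned above):

```python
import os
import shutil


def copy_upstream_artifacts(work_dir, upstream_artifacts, dest_dir):
    """Copy scriptworker's pre-downloaded, sha-verified CoT artifacts
    from work_dir/cot/<taskId>/<path> into the script's own area,
    preserving the per-task layout."""
    copied = []
    for artifact in upstream_artifacts:
        for path in artifact["paths"]:
            src = os.path.join(work_dir, "cot", artifact["taskId"], path)
            dest = os.path.join(dest_dir, artifact["taskId"], path)
            os.makedirs(os.path.dirname(dest), exist_ok=True)  # py3
            shutil.copyfile(src, dest)
            copied.append(dest)
    return copied
```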
This is what I'm hitting, not quite sure what broke:

Exception: In task u'beetmover-signing-linux64-nightly/opt':
extra keys not allowed @ data[u'upstream-artifacts']
not a valid value for dictionary value @ data[u'worker'][u'implementation']
extra keys not allowed @ data[u'worker'][u'update_manifest']
extra keys not allowed @ data[u'worker'][u'taskid_of_manifest']
extra keys not allowed @ data[u'worker'][u'taskid_to_beetmove']
required key not provided @ data[u'worker'][u'docker-image']
{u'attributes': {u'build_platform': u'linux64-nightly',
                 u'build_type': u'opt',
                 u'nightly': True},
 u'dependencies': {u'build': u'build-linux64-nightly/opt',
                   u'build-signing': u'signing-linux64-nightly/opt'},
 u'description': u'Linux64 Nightly Signing Beetmover',
 u'label': u'beetmover-signing-linux64-nightly/opt',
 u'run-on-projects': [],
 u'scopes': [u'project/releng/beetmover/dep'],
 u'treeherder': {u'kind': u'build',
                 u'platform': u'linux64/opt',
                 u'symbol': u'tc(BM)',
                 u'tier': 2},
 u'upstream-artifacts': [{u'paths': [u'public/build/balrog_props.json'],
                          u'taskId': {u'task-reference': u'<build>'},
                          u'taskType': u'build'},
                         {u'paths': [u''],
                          u'taskId': {u'task-reference': u'<build-signing>'},
                          u'taskType': u'signing'}],
 u'worker': {u'implementation': u'beetmover',
             u'taskid_of_manifest': {u'task-reference': u'<build>'},
             u'taskid_to_beetmove': {u'task-reference': u'<build-signing>'},
             u'update_manifest': True},
 u'worker-type': u'scriptworker-prov-v1/beetmoverworker-v1'}
Comment on attachment 8817066 [details] [diff] [review]
[wip] [broken] upstream.diff

Review of attachment 8817066 [details] [diff] [review]:
-----------------------------------------------------------------

::: taskcluster/taskgraph/transforms/beetmover.py
@@ +126,5 @@
>                  'build_type': dep_job.attributes.get('build_type'),
>              },
>              'run-on-projects': dep_job.attributes.get('run_on_projects'),
>              'treeherder': treeherder,
> +            'upstream-artifacts': upstream_artifacts,

To solve your bustage, you want the upstream artifacts as part of the `worker` definition, not the overall task.
Attached patch [wip] [broken] upstream3.diff (obsolete) — Splinter Review
This fixes the beetmover non-l10n.  The beetmover-l10n is broken; I think this is because we don't have a transform to turn the worker['upstream-artifacts'] into payload['upstreamArtifacts'].

To add this transform, I think we have to add a decorator for every beetmover-LOCALE-DEPJOB.  This made me wonder if it might be best to chunk these the same way as the l10n repack jobs, which we can then call beetmover-DEPJOB, which would be easier to write a decorator for.

In that case, we might have

for locale in dep_job.attributes.get('chunk_locales', []):
    paths.append("public/build/{}/balrog_props.json".format(locale))
# populate upstream_artifacts
# populate job chunk_locales
# yield a single description, rather than one per locale

and then add a transform for each beetmover-DEPJOB.  It's very likely I might be misunderstanding how this works, though.
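Fleshing out the pseudocode above, the per-chunk path population might be (assuming chunk_locales is a list of locale strings, as in the comment sketch):

```python
def chunked_balrog_props_paths(chunk_locales):
    """Build one balrog_props.json path per locale in the chunk, so a
    single beetmover-DEPJOB task can cover a whole l10n chunk."""
    return ["public/build/{}/balrog_props.json".format(locale)
            for locale in chunk_locales]
```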
Attachment #8817066 - Attachment is obsolete: true
I added a jsonschema to https://github.com/escapewindow/balrogscript/tree/cot .

TODO for balrogscript cot:
* actually check the task definition against the jsonschema
* use the new scopes to determine capabilities, and restrict which trees can use nightly/release scopes (i think the latter will be handled by https://github.com/escapewindow/scriptworker/commit/914e5c7b3e8604fd3cf7aacfd9649f4e7638f803 , so we may get this for free when we use the latest scriptworker)
* use the `upstreamArtifacts` from disk
* any additional artifact downloads need their shas verified against the chain of trust artifacts on disk
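For the last bullet, the sha check could be a small helper along these lines (illustrative only; the real verification lives in scriptworker/balrogscript and may differ):

```python
import hashlib


def verify_sha256(path, expected_hexdigest):
    """Hash a downloaded artifact and compare it against the sha256
    recorded in the upstream task's chain of trust artifact."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    if digest.hexdigest() != expected_hexdigest:
        raise ValueError("sha256 mismatch for {}".format(path))
```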
Attached patch [wip] [broken] upstream4.diff (obsolete) — Splinter Review
Same as upstream3 but with colon separated scopes.
Attachment #8817105 - Attachment is obsolete: true
Comment on attachment 8817244 [details] [diff] [review]
[wip] [broken] upstream4.diff

Review of attachment 8817244 [details] [diff] [review]:
-----------------------------------------------------------------

::: taskcluster/taskgraph/transforms/beetmover.py
@@ +94,5 @@
>  
> +        upstream_artifacts = [{
> +            "taskId": {"task-reference": taskid_of_manifest},
> +            "taskType": "build",
> +            "paths": ["public/build/balrog_props.json"],

For here, to solve l10n, you can expand the path with the l10n path if you desire (for current beet design).

basically:

'''
if job.get('label'):
    props_path = 'public/build/{}/balrog_props.json'.format(job['label'])
else:
    props_path = 'public/build/balrog_props.json'

...
'''

::: taskcluster/taskgraph/transforms/beetmover_l10n.py
@@ +31,5 @@
> +                    raise NotImplementedError(
> +                        "can't beetmove a signing task with multiple dependencies")
> +                dep_name = dep_job.dependencies.keys()[0]
> +                taskid_of_manifest = "<" + str(dep_name) + ">"
> +            upstream_artifacts = [{

That is instead of doing all this stuff here...

@@ +56,5 @@
>                  'label': label,
>                  'locale': locale,
>              }
> +            beet_description.setdefault('worker', {})
> +            beet_description['worker']['upstream-artifacts'] = upstream_artifacts

Alternatively you can set a required beet description value with a default of 'public/build/balrog_props.json', using the l10n path if it's coming from here; the beet job itself will NOT use the path unless it's a signing job.
Attached patch [needs testing] upstream5.diff (obsolete) — Splinter Review
This works!  It at least passes
  ./mach taskgraph tasks --json -p ~/Desktop/1205_date_params.yml

This will require us to have updated scriptworker instances that won't barf on the new task definition schema, for beetmover at least.
Attachment #8817244 - Attachment is obsolete: true
Assignee: aki → mtabara
First iteration on the balrogworker puppet refactoring using the shared scriptworker module. I didn't test it yet; a few prerequisites need to happen first:

1. create a production branch in https://github.com/mozilla-releng/balrogscript
2. switch nightlies to use this 'production' branch rather than defaulting - I suppose I need to tweak https://hg.mozilla.org/build/puppet/file/tip/modules/balrog_scriptworker/manifests/init.pp#l76 to add the branch too
3. merge :aki's CoT PR to master https://github.com/mozilla-releng/balrogscript/pull/9/ 
4. create balrogscript 0.0.1 and upload it to puppet as a python package
5. test this patch against my local puppet environment to find any possible caveats

Misc/Questions:
* merging to another branch is not blocking but would be nicer. I could work on top of aki's PR too, but that seems a bit more complicated from my point of view
* I need to play with balrogscript and the tools clone repo to see how they integrate; I'm a bit confused as to how they correlate right now
* since scriptworker will not be able to download the cot.json manifest from beetmoverscript, I suppose the validation will fail. Is there a way I can turn the pre-script validation off? Just to make sure the rest of it works as expected ...
Flags: needinfo?(aki)
Attachment #8822437 - Flags: feedback?(aki)
(In reply to Mihai Tabara [:mtabara]⌚️GMT from comment #11)
> Created attachment 8822437 [details] [diff] [review]
> First draft of puppet balrogworker refactoring using the shared scriptworker
> 
> First iteration on balrogworker puppet refactoring using the shared
> scriptworker module. Didn't test it yet, need few pre-requisites before that
> can happen:
> 
> 1. create production branch in https://github.com/mozilla-releng/balrogscript
> 2. switch nightlies to use this 'production' branch rather than defaulting -
> suppose I need to tweak this
> https://hg.mozilla.org/build/puppet/file/tip/modules/balrog_scriptworker/
> manifests/init.pp#l76 to add branch too
> 3. Merge :aki's COT PR to master
> https://github.com/mozilla-releng/balrogscript/pull/9/ 
> 4. Create balrogscript 0.0.1 and upload it to puppet as a python package
> 5. test this patch against my local puppet environment to find any possible
> caveats

I think that's right.  1-4 should be relatively quick and straightforward.  Let me know if you have questions.

> 
> Misc/Questions:
> * merging to another branch is not blocking but is more or less nicer. I
> could work on top of aki's PR too but seems a bit more complicated from my
> point of view

+1

> * I need to play with balrogscript and tools clone repo to see how they
> integrate; I'm a bit confused as to how they corelate right now

+1. I think it should work, but I haven't tried from puppet.

> * since scriptworker will not be able to download the cot.json manifest from
> beetmoverscript I suppose the validation will fail. Is there a way I can
> turn the pre-script validation off? Just to make sure the rest of it works
> as expected ...

a) yes, https://github.com/mozilla-releng/scriptworker/blob/master/scriptworker/constants.py#L59 . the shared puppet module might not allow for that, but if you're working in your own env you can edit that line
b) once we have upstreamArtifacts in the task definition, you should be able to run `verify_cot` against any balrogscript task without having to run the entire task through scriptworker again
c) once you believe you're ready, you can pause the existing balrog scriptworker, have your new balrog scriptworker point at the right workerGroup etc., and retrigger tasks to verify.  Once that works, we're ready to cut over.
Flags: needinfo?(aki)
Comment on attachment 8822437 [details] [diff] [review]
First draft of puppet balrogworker refactoring using the shared scriptworker

Looks good!  A few comments below.  I imagine we may find more things to fix during testing.

>diff --git a/modules/balrog_scriptworker/files/dep.pubkey b/modules/balrog_scriptworker/files/dep.pubkey
>deleted file mode 100644
>--- a/modules/balrog_scriptworker/files/dep.pubkey
>+++ /dev/null

I'm guessing you're deleting the pubkeys because they're in the package?

>diff --git a/modules/balrog_scriptworker/manifests/init.pp b/modules/balrog_scriptworker/manifests/init.pp

>+    scriptworker::instance {
>+        "${balrog_scriptworker::settings::root}":
>+            basedir                  => "${balrog_scriptworker::settings::root}",
>+            task_script_executable   => "${balrog_scriptworker::settings::task_script_executable}",
>+            task_script              => "${balrog_scriptworker::settings::task_script}",
>+            task_script_config       => "${balrog_scriptworker::settings::task_script_config}",
>+            task_max_timeout         => $balrog_scriptworker::settings::task_max_timeout,
>+            username                 => "${users::builder::username}",
>+            group                    => "${users::builder::group}",
>+            worker_group             => "$env_config[worker_group]",
>+            worker_type              => "$env_config[worker_type]",
>+            cot_job_type             => "unknown",

this should be "balrog"

>     mercurial::repo {
>         "tools":
>             hg_repo => "${balrog_scriptworker::settings::tools_repo}",
>             dst_dir => "${balrog_scriptworker::settings::root}/balrogscript/tools",
>             user    => "${users::builder::username}",
>             branch  => "${balrog_scriptworker::settings::tools_branch}",
>             require => [
>                 Class["packages::mozilla::py27_mercurial"],
>                 Python35::Virtualenv["${balrog_scriptworker::settings::root}"],
>-                Git::Repo["balrogscript"],
>             ];
>     }

We may have to either play with PYTHONPATH or sys.path because I think balrogscript.py currently expects tools to live as a sibling to its parent dir... which would mean we'd have to clone tools inside the virtualenv.

>diff --git a/modules/balrog_scriptworker/manifests/settings.pp b/modules/balrog_scriptworker/manifests/settings.pp
>--- a/modules/balrog_scriptworker/manifests/settings.pp
>+++ b/modules/balrog_scriptworker/manifests/settings.pp
>@@ -1,8 +1,23 @@
> class balrog_scriptworker::settings {
>-    include ::config
>+    $root = "/builds/balrogworker"
>+    $task_script_executable = "${root}/bin/python"

Hm, don't you want this to be the py27venv python?  This executable will be used to launch balrogscript.
Attachment #8822437 - Flags: feedback?(aki) → feedback+
(In reply to Aki Sasaki [:aki] from comment #13)
> Comment on attachment 8822437 [details] [diff] [review]
> First draft of puppet balrogworker refactoring using the shared scriptworker
> 
> Looks good!  A few comments below.  I imagine we may find more things to fix
> during testing.

Yep, definitely ;)

 
> >diff --git a/modules/balrog_scriptworker/files/dep.pubkey b/modules/balrog_scriptworker/files/dep.pubkey
> >deleted file mode 100644
> >--- a/modules/balrog_scriptworker/files/dep.pubkey
> >+++ /dev/null
> 
> I'm guessing you're deleting the pubkeys because they're in the package?

Yes. Will grab them the same way we do for the schema - https://hg.mozilla.org/build/puppet/file/tip/modules/beetmover_scriptworker/templates/script_config.json.erb#l35 . I saw in your PR you've included the keys in the balrogscript manifest python package.
 
> >diff --git a/modules/balrog_scriptworker/manifests/init.pp b/modules/balrog_scriptworker/manifests/init.pp
> 
> >+    scriptworker::instance {
> >+        "${balrog_scriptworker::settings::root}":
> >+            basedir                  => "${balrog_scriptworker::settings::root}",
> >+            task_script_executable   => "${balrog_scriptworker::settings::task_script_executable}",
> >+            task_script              => "${balrog_scriptworker::settings::task_script}",
> >+            task_script_config       => "${balrog_scriptworker::settings::task_script_config}",
> >+            task_max_timeout         => $balrog_scriptworker::settings::task_max_timeout,
> >+            username                 => "${users::builder::username}",
> >+            group                    => "${users::builder::group}",
> >+            worker_group             => "$env_config[worker_group]",
> >+            worker_type              => "$env_config[worker_type]",
> >+            cot_job_type             => "unknown",
> 
> this should be "balrog"

Thanks! I superficially looked through the scriptworker constants.py code for this but thought the beetmover/balrog parts were not merged yet. Mea culpa.
 
> 
> >     mercurial::repo {
> >         "tools":
> >             hg_repo => "${balrog_scriptworker::settings::tools_repo}",
> >             dst_dir => "${balrog_scriptworker::settings::root}/balrogscript/tools",
> >             user    => "${users::builder::username}",
> >             branch  => "${balrog_scriptworker::settings::tools_branch}",
> >             require => [
> >                 Class["packages::mozilla::py27_mercurial"],
> >                 Python35::Virtualenv["${balrog_scriptworker::settings::root}"],
> >-                Git::Repo["balrogscript"],
> >             ];
> >     }
> 
> We may have to either play with PYTHONPATH or sys.path because I think
> balrogscript.py currently expects tools to live as a sibling to its parent
> dir... which would mean we'd have to clone tools inside the virtualenv.

Yeah, good point! ;) That's why I said I was a bit confused as to how these two communicate with each other. But moving this to my environment should give me some more trial-and-error gear to play with. 

> >diff --git a/modules/balrog_scriptworker/manifests/settings.pp b/modules/balrog_scriptworker/manifests/settings.pp
> >--- a/modules/balrog_scriptworker/manifests/settings.pp
> >+++ b/modules/balrog_scriptworker/manifests/settings.pp
> >@@ -1,8 +1,23 @@
> > class balrog_scriptworker::settings {
> >-    include ::config
> >+    $root = "/builds/balrogworker"
> >+    $task_script_executable = "${root}/bin/python"
> 
> Hm, don't you want this to be the py27venv python?  This executable will be
> used to launch balrogscript.

Good catch, thanks \o/ 
Even worse, I noticed it at some point but forgot about it when pushing. I'll definitely fix it.
Spent a good chunk of hours with :aki today pair programming to finalize the staging instance on my pinned environment. A couple of observations:
* thanks a lot :aki for the patience and help walking me through the variety of issues we encountered
* balrogscript changes are here [1] - I expect to have more commits there as I need to fix the tests
* the puppet patches that eventually gave us a successful puppet environment are here [2]
* various scriptworker fixes lie in a separate patch in this bug at [3]. Once that gets reviewed and landed, I shall rebase them against the changes from [2]
* we used [4] to set up and add a gpg key-pair for the staging instance machine. I created + signed the gpg keypair on my local machine, after which I transferred them to the puppet server and added them encrypted to hiera following these rules [5]. The encryption in hiera was done using [6], encrypting with the one-liner for files.

We now have a successful deployment in a clean staging environment, using the shared scriptworker module.
Leftovers:
* test the balrogscript part and make sure the changes from [1] work as expected - use a dummy task to make sure tasks are being claimed (e.g. malformed payload or something)
* fix the tests for balrogscript in [1] before submitting this for review
* create production (balrogworker-1) gpg keys, sign the commit, and add them to [7]. Once that lands, we're done with balrogscript/balrogworker CoT enabling, with only graph-enabling patches left (the ones that enable the upstream-artifacts)
* remember to switch the settings/root from `/builds/balrogworker/` to `/builds/scriptworker/` to fully migrate to a shared system
* other misc improvements

Side note: we'll need to do the same for beetmoverscript/beetmoverworker.

[1]: https://github.com/mozilla-releng/balrogscript/pull/12
[2]: https://github.com/mozilla/build-puppet/pull/23
[3]: https://reviewboard.mozilla.org/r/101972/
[4]: http://scriptworker.readthedocs.io/en/latest/chain_of_trust.html#new-scriptworker-gpg-keys
[5]: http://scriptworker.readthedocs.io/en/latest/new_instance.html#puppet
[6]: https://wiki.mozilla.org/ReleaseEngineering/PuppetAgain/Secrets
[7]: https://github.com/mozilla-releng/cot-gpg-keys
Comment on attachment 8823445 [details]
Bug 1317800 - fixes in shared scriptworker module.

https://reviewboard.mozilla.org/r/101972/#review102396

Thank you!
Attachment #8823445 - Flags: review?(aki) → review+
Attachment #8817266 - Attachment is obsolete: true
It looks like task.py currently needs both balrog and beetmover to land at the same time :\ Maybe we can update the current beetmover instance's jsonschema so we can land this without breaking it?
Similar to https://bugzilla.mozilla.org/show_bug.cgi?id=1289822#c26 I've attempted to use a dummy task to see if all goes well down to actually importing the balrog submitter cli tools. Found a bunch of issues along the way in both the puppet patches and the balrogscript repo.

* hacked the tools import part as discussed in IRC by passing its path via script_config
* redid the logging in the script to prevent errors such as 'No handlers could be found for logger "balrogscript.balrogscript"' - logging was previously configured after being used
* misc other changes
Attachment #8824036 - Flags: review?(aki)
:aki forgot to ask you yesterday, sorry about that - may I push the scriptworker puppet fixes patch? Just wanted to double-check it ain't breaking signing-scriptworker when we do default -> production puppet merge.
Flags: needinfo?(aki)
* added the keys to hiera for balrogworker-1
* PR to add the gpg signed pubkey in scriptworker/valid with a signed commit
Attachment #8824069 - Flags: review?(aki)
(In reply to Mihai Tabara [:mtabara]⌚️GMT from comment #21)
> :aki forgot to ask you yesterday, sorry about that - may I push the
> scriptworker puppet fixes patch? Just wanted to double-check it ain't
> breaking signing-scriptworker when we do default -> production puppet merge.

Yes please.  a) I'm around now, b) in theory it won't break anything, and c) even if it does we're not tier 1 yet.
Flags: needinfo?(aki)
Comment on attachment 8824036 [details] [review]
Balrogscript fixes for CoT

This looks good!
Attachment #8824036 - Flags: review?(aki) → review+
Comment on attachment 8824069 [details] [review]
Add balrogworker-1 key in cot-gpg-keys.

Woot!
Attachment #8824069 - Flags: review?(aki) → review+
(In reply to Aki Sasaki [:aki] from comment #25)
> Comment on attachment 8824069 [details] [review]
> Add balrogworker-1 key in cot-gpg-keys.
> 
> Woot!

I may need your help with merging this, I don't think I have the rights to do it.
Merged.
Comment on attachment 8823445 [details]
Bug 1317800 - fixes in shared scriptworker module.

https://hg.mozilla.org/build/puppet/rev/a1638bc744ac
Attachment #8823445 - Flags: checked-in+
Brought up another instance to prepare for the cut-over: balrogworker-2.srv.releng.usw2.mozilla.com
(In reply to Aki Sasaki [:aki] from comment #10)
> Created attachment 8817266 [details] [diff] [review]
> [needs testing] upstream5.diff
> 
> This works!  It at least passes
>   ./mach taskgraph tasks --json -p ~/Desktop/1205_date_params.yml
> 
> This will require us to have updated scriptworker instances that won't barf
> on the new task definition schema, for beetmover at least.

I've been playing with this locally. It still needs work to finish adding all the upstream artifact paths, but here is what was done yesterday:

small additional diff: https://people-mozilla.org/~jlund/taskgraph_beet_balrog.diff

here is the list of artifact names ripped and interpolated from beet templates: https://people-mozilla.org/~jlund/taskgraph_beet_balrog.diff

I plan to finish this by tomorrow morning
* kill unused code
* rename balrogscript -> script for consistency with signing worker and beetmoverworker
* some pep8
Attachment #8825543 - Flags: review?(aki)
Attachment #8825543 - Flags: review?(aki) → review+
Blocks: 1330276
Depends on: 1330476
No longer depends on: 1316071
This is done on date.
Status: NEW → RESOLVED
Closed: 7 years ago
Resolution: --- → FIXED