Closed
Bug 1384241
Opened 7 years ago
Closed 7 years ago
Add daemon that triggers (partial) |mach build faster| builds on filesystem input changes (using pywatchman)
Categories
(Firefox Build System :: General, enhancement)
Firefox Build System
General
Tracking
(firefox56 fixed)
RESOLVED
FIXED
mozilla56
People
(Reporter: nalexander, Assigned: nalexander)
References
(Depends on 1 open bug, Blocks 5 open bugs)
Details
Attachments
(3 files, 8 obsolete files)
This ticket tracks a subset of the functionality of Bug 1176557, namely triggering (partial) |mach build faster| builds when files change in the topsrcdir.
Comment hidden (mozreview-request)
Comment hidden (mozreview-request)
Comment hidden (mozreview-request)
Assignee
Comment 4•7 years ago
To test, I've been using wscat, like:
⋊> ~ wscat -c 'ws://localhost:8080/websocket' 12:37:48
connected (press CTRL+C to quit)
< {"type": "change", "change": {"inputs": ["/Users/nalexander/Mozilla/gecko/browser/themes/shared/aboutSocialError.css"], "unrecognized": [], "outputs": ["browser/chrome/browser/skin/classic/browser/aboutProviderDirectory.css", "browser/chrome/browser/skin/classic/browser/aboutSocialError.css"]}, "sequence": 0}
That's the change message coming from:
⋊> ~/M/gecko touch browser/themes/shared/aboutSocialError.css
Comment 5•7 years ago
mozreview-review
Comment on attachment 8890029 [details]
Bug 1384241 - Part 2: Add |mach watch|: pywatchman based |mach build faster| daemon.
https://reviewboard.mozilla.org/r/161082/#review166466
I'm not done with the review. But please add a new patch that adds watchman detection to moz.configure and check for the presence of watchman before attempting to use it.
Ideally, we'd pass the path to watchman to pywatchman. However, it looks like the binary name 'watchman' is hardcoded in pywatchman. I'm pretty sure I've signed Facebook's CLA, so I'll create a PR for that.
::: python/mozbuild/mozbuild/mach_commands.py:315
(Diff revision 1)
> +
> + @Command('watch', category='build', description='Watch and re-build the tree.')
> + def watch(self):
> + """Watch and re-build the source tree."""
> + self._activate_virtualenv()
> + self.virtualenv_manager.install_pip_package('pywatchman==1.3.0')
I'm fine installing pywatchman from pip for now. However, pywatchman has a C extension. This will likely fail to install on Windows because building Python 2.7 extensions on Windows requires a copy of the Visual Studio 2008 toolchain, which most people don't have installed.
We'll want error checking around this package install. And, we may want to skip it entirely on Windows until we hook up binary distribution of a pywatchman wheel.
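For illustration, a minimal guard along the lines suggested here might look like the sketch below (the helper name and messages are assumptions, not the patch's actual code):
```
# A minimal sketch, assuming the mach command's virtualenv helper shown above.
from __future__ import print_function
import sys

def install_pywatchman(virtualenv_manager):
    if sys.platform.startswith('win'):
        # Building the pywatchman C extension needs the VS 2008 toolchain;
        # skip until a binary wheel can be distributed.
        print('mach watch is not yet supported on Windows', file=sys.stderr)
        return False
    try:
        virtualenv_manager.install_pip_package('pywatchman==1.3.0')
        return True
    except Exception as e:
        print('Could not install pywatchman: %s' % e, file=sys.stderr)
        return False
```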
Comment 6•7 years ago
mozreview-review
Comment on attachment 8890028 [details]
Bug 1384241 - Part 1: Add input to output mappings.
https://reviewboard.mozilla.org/r/161080/#review166556
I approve of the new features! The new `inputs()` method could be beneficial for writing different build backends!
But r- because a v2 of this series will be required.
I would encourage you to split this part to its own bug so it can be landed sooner, as mozpack is surprisingly active this week and you are likely to get bit rotted.
::: python/mozbuild/mozpack/copier.py:183
(Diff revision 1)
> + for output, file in self:
> + output = mozpath.normpath(output)
> + for input in file.inputs():
> + input = mozpath.normpath(input)
> + tree[input].add(output)
> + return dict(tree.iteritems())
Nit: pretty sure you can exclude `.iteritems()` here since `dict(<dict>)` does a shallow copy.
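For illustration, `dict()` accepts any mapping (including a `defaultdict`) and makes a shallow copy, so the explicit `.iteritems()` adds nothing:
```
from collections import defaultdict

tree = defaultdict(set)
tree['browser/themes/shared/a.css'].add('browser/skin/a.css')

# dict(<mapping>) shallow-copies the mapping; no .iteritems() needed.
plain = dict(tree)
assert plain == {'browser/themes/shared/a.css': {'browser/skin/a.css'}}
```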
::: python/mozbuild/mozpack/files.py:487
(Diff revision 1)
> if not dest.exists():
> errors.fatal("Required existing file doesn't exist: %s" %
> dest.path)
>
> + def inputs(self):
> + return iter(())
Meh.
Most consumers likely don't care that this method returns an iterator. I would make the API contract "returns an iterable" and return lists or sets instead.
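For illustration, the base-class default under that contract could be as simple as this sketch:
```
def inputs(self):
    """Return an iterable of paths this file depends on."""
    # Base case: no inputs. A concrete, possibly empty collection keeps the
    # contract simple for consumers.
    return set()
```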
::: python/mozbuild/mozpack/files.py:506
(Diff revision 1)
> + pp = Preprocessor(defines=self.defines, marker=self.marker)
> + pp.setSilenceDirectiveWarnings(self.silence_missing_directive_warnings)
I'm slightly concerned about the duplication of code around invoking the preprocessor here and in `copy()`. It would be very easy for these to get out of sync and for bugs to form.
I think it's fine since the code is minimal and we don't touch it often. But if you wanted to refactor the "invoke the preprocessor" code to a standalone function, I'd approve!
::: python/mozbuild/mozpack/files.py:510
(Diff revision 1)
> + def inputs(self):
> + pp = Preprocessor(defines=self.defines, marker=self.marker)
> + pp.setSilenceDirectiveWarnings(self.silence_missing_directive_warnings)
> +
> + with open(self.path, 'rU') as input:
> + with open(os.devnull, 'w') as output:
I'm honestly not sure if this is better or worse than writing to a throwaway `io.BytesIO` or a no-op object. It probably doesn't matter.
Attachment #8890028 - Flags: review?(gps) → review-
Comment 7•7 years ago
mozreview-review
Comment on attachment 8890029 [details]
Bug 1384241 - Part 2: Add |mach watch|: pywatchman based |mach build faster| daemon.
https://reviewboard.mozilla.org/r/161082/#review166560
First, thank you for writing this patch. This feature is obviously a good idea and great things will eventually come from it.
There are many review comments. I view this feature much like artifact builds were in the beginning: experimental. "Perfect is the enemy of good" definitely applies and we shouldn't get bogged down in minutia. I'm all for landing *something* and then refactoring like crazy as necessary.
I've left a few higher-level/architecture comments inline.
Another idea I'd like you to consider is refactoring this as a build backend. That way, the input/output map can be regenerated as part of running `config.status` rather than the somewhat hacky approach of parsing the fastermake's `.track` files. Or at the very least, it will allow us to share backend-related code easier. As it stands, code in this new file exists outside the build system core and it is essentially bolted on top. It feels fragile and prone to breakage.
If this were integrated as a build backend, I imagine we'd have to use a true daemon. We could start the daemon automatically during the first build and it would be running in the background. Another benefit to integrating it into the actual build system is you'll have better control over locks and concurrent execution. I imagine we'll get into some states where an existing, user-initiated build is triggering topsrcdir changes (e.g. .pyc file updates). We'll probably want to defer subscription updates until other build processes have ended. Watchman has a facility for this. But obviously you need signaling in place to tell it when to pause subscriptions. That's where having a formal build backend could be useful.
TBH, I'm not sure where to draw the line in terms of acceptable quality for an initial landing. Perfect is the enemy of good. But I don't want to incur too much technical debt either. This will be like artifact builds and will introduce a lot of "traffic" when it lands. If we can minimize the number of bugs by choosing a better architecture up front, I think that's worth sacrificing some speed for.
I definitely want to think a bit more about the future "architecture" for a background daemon and file watching. I also want to err on the side of being able to achieve fast builds with as little user involvement as possible. For example, artifact builds are amazing, but since they require a manual opt-in, there are developers not seeing their benefits. I would very much like to devise a filesystem watching/building feature that requires as little user involvement as possible. It's fine if we don't have that initially. But we should have a transition plan.
Again, this will be a killer feature. If you want help hacking on it, I think I can find time. But you know me - I'm always over-committed, so no promises :)
::: commit-message-ef2c4:5
(Diff revision 1)
> +- watching in a running Python instance, rather than invoking a
> + process. I want to avoid Python startup overhead, which dwarfs the
> + build faster incremental updates.
Huh? `mach`'s startup overhead should be under 100ms on a new machine. And there's a lot of room to optimize that if it becomes a problem. Startup overhead of a non-mach process will be dozens of milliseconds shorter.
I think it is worth discussing this architecture in more detail because daemons are hard. Although I see from your next patch that you have a very compelling use of a daemon. And there's other things I've wanted to shoehorn into a generic build/mach daemon over the years. While a generic daemon is scope bloat for this bug, it does skew my (eventual) architecture preference towards a daemon.
::: commit-message-ef2c4:17
(Diff revision 1)
> +- watching subdirectories of topsrcdir. I particularly want to avoid
> + topobjdir-in-topsrcdir.
Assuming Mercurial is being used, it will have a watch of the entire topsrcdir. So I don't think this optimization is worth while: just watch the entire topsrcdir.
If the overhead of watchman watching a topobjdir-in-topsrcdir is significant (it is multiple CPU seconds during a build, but I haven't measured exactly), we can warn/ban topobjdir-in-topsrcdir in configure when watchman is being used.
Finally, another benefit of subscribing to the root directory is that watchman is VCS aware and it will avoid emitting events while a Mercurial or Git transaction is ongoing. i.e. it waits for the working directory operation to "settle" before informing consumers. You definitely want this.
::: python/mozbuild/mozbuild/faster_daemon.py:69
(Diff revision 1)
> + defines = dict((name, self.config_environment.defines[name])
> + for name in self.config_environment.defines
> + if name not in self.config_environment.non_global_defines)
Ewww. Could I get you to add a helper property to configenvironment.py instead? Or ignore if you don't want the scope bloat :)
::: python/mozbuild/mozbuild/faster_daemon.py:72
(Diff revision 1)
> + defines.update({
> + 'AB_CD': 'en-US',
> + 'BUILD_FASTER': 1,
> + })
> + defines.update({
> + 'BOOKMARKS_INCLUDE_DIR': mozpath.join(self.config_environment.topsrcdir,
Please add inline comment why all these are necessary.
::: python/mozbuild/mozbuild/faster_daemon.py:74
(Diff revision 1)
> + defines = dict((name, self.config_environment.defines[name])
> + for name in self.config_environment.defines
> + if name not in self.config_environment.non_global_defines)
> + defines.update({
> + 'AB_CD': 'en-US',
> + 'BUILD_FASTER': 1,
Values should be strings.
::: python/mozbuild/mozbuild/faster_daemon.py:82
(Diff revision 1)
> + @property
> + def file_copier(self):
> + if self._file_copier is None:
You can use `@mozbuild.util.memoized_property` for this cached property pattern.
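For illustration, the cached-property pattern with that helper might look roughly like this (the body is a placeholder, not the patch's code):
```
from mozbuild.util import memoized_property

class Daemon(object):
    @memoized_property
    def file_copier(self):
        # Runs once; the result is cached on the instance on first access,
        # replacing the manual "self._file_copier is None" bookkeeping.
        return self._build_file_copier()  # hypothetical helper
```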
::: python/mozbuild/mozbuild/faster_daemon.py:87
(Diff revision 1)
> + finder = FileFinder(mozpath.join(self.config_environment.topobjdir, 'faster'))
> + for path, f in finder.find('*.track'):
> + manifest = InstallManifest(fileobj=f.open())
This feels like a layering violation. I assume similar code exists somewhere in a build backend? Is there room to consolidate?
::: python/mozbuild/mozbuild/faster_daemon.py:133
(Diff revision 1)
> +
> + return watches
> +
> + def subscribe_to_dir(self, name, dir_to_watch):
> + query = {
> + 'expression': ['true'],
When you change this to watch the entire topsrcdir, be sure to exclude .hg and .git. e.g.
```
['not', ['anyof', ['dirname', '.hg'],
                  ['name', '.hg', 'wholename']]]
```
(That's just for Mercurial, obviously.)
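For illustration, an expression excluding both Mercurial and Git metadata might look like this (a sketch of watchman query terms, not the exact query the patch ends up with):
```
query = {
    'expression': ['not', ['anyof',
        ['dirname', '.hg'],
        ['name', '.hg', 'wholename'],
        ['dirname', '.git'],
        ['name', '.git', 'wholename'],
    ]],
    'fields': ['name'],
}
```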
::: python/mozbuild/mozbuild/faster_daemon.py:145
(Diff revision 1)
> + root = watch['watch']
> + if 'relative_path' in watch:
> + query['relative_root'] = watch['relative_path']
> +
> + # Get the initial clock value so that we only get updates.
> + query['since'] = self.client.query('clock', root)['clock']
I think this may want `sync_timeout` and/or `empty_on_fresh_instance` defined.
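A hedged sketch of what adding those fields could look like (the values are illustrative, not recommendations):
```
query = {
    'expression': ['true'],
    'fields': ['name'],
    # Wait up to 30s for watchman to sync with the filesystem before answering.
    'sync_timeout': 30000,
    # Don't replay the entire tree when the watch is freshly established.
    'empty_on_fresh_instance': True,
}
```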
::: python/mozbuild/mozbuild/faster_daemon.py:166
(Diff revision 1)
> +
> + for dat in data:
> + files |= set([mozpath.join(self.config_environment.topsrcdir, name, f)
> + for f in dat.get('files', [])])
> +
> + files = set([f for f in files if os.path.isfile(f)])
Watchman 3.1+ supports a "type" field in the subscription. If you receive that, you can avoid the stat() here.
Even better, the query expression can limit to certain types so this filtering is done in watchman itself!
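For illustration, restricting a query to regular files might look like this (sketch; `'f'` is watchman's code for a regular file):
```
query = {
    # Only regular files, so consumers can drop the os.path.isfile() check.
    'expression': ['allof', ['type', 'f'], ['true']],
    # Alternatively, ask for the type alongside the name in each result.
    'fields': ['name', 'type'],
}
```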
::: python/mozbuild/mozbuild/faster_daemon.py:172
(Diff revision 1)
> +
> + return files
> +
> + def incremental_copy(self, copier, force=False, verbose=True):
> + # Just like the 'repackage' target in browser/app/Makefile.in.
> + if 'cocoa' == self.config_environment.substs['MOZ_WIDGET_TOOLKIT']:
I hate that the build system does this. But what are you gonna do?
::: python/mozbuild/mozbuild/faster_daemon.py:200
(Diff revision 1)
> + '''
> + Return an iterator of `FasterBuildChange` instances as inputs
> + to the faster build system change.
> + '''
> +
> + self.client = pywatchman.client()
By default, the watchman client uses JSON. JSON is slow.
There's a `useImmutableBser` argument that tells the client to use Python objects that are immutable. This can translate to significant performance wins for large data exchanges. It is probably best to use that.
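For illustration (assuming a pywatchman release new enough to expose the option):
```
import pywatchman

# BSER codec with immutable Python objects instead of JSON decoding.
client = pywatchman.client(useImmutableBser=True)
```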
::: python/mozbuild/mozbuild/faster_daemon.py:200
(Diff revision 1)
> + '''
> + Return an iterator of `FasterBuildChange` instances as inputs
> + to the faster build system change.
> + '''
> +
> + self.client = pywatchman.client()
You'll want a `try..finally` here so you can `self.client.close()` to ensure the watchman connection is cleaned up on function exit.
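A minimal sketch of the suggested cleanup:
```
import pywatchman

client = pywatchman.client()
try:
    # ... issue queries and receive subscription events ...
    pass
finally:
    # Always release the connection to the watchman service, even on error.
    client.close()
```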
::: python/mozbuild/mozbuild/faster_daemon.py:217
(Diff revision 1)
> + _watch_result = self.client.receive()
> +
> + changed = self.changed_files()
It looks like the returned object is always populated. But `self.changed_files()` does a blind iteration of all subs and tests each one. This feels a bit wasteful.
Assuming you switch to the immutable bser codec, test for `hasattr(_watch_result, 'subscription')` and no-op.
::: python/mozbuild/mozbuild/faster_daemon.py:226
(Diff revision 1)
> + continue
> +
> + result = FasterBuildChange()
> +
> + for change in changed:
> + change = mozpath.normpath(change)
I think path normalization should be done in `changed_files()`.
::: python/mozbuild/mozbuild/faster_daemon.py:234
(Diff revision 1)
> + if output not in result.output_to_inputs:
> + result.output_to_inputs[output] = set()
> + result.output_to_inputs[output].add(input)
Use a `collections.defaultdict`.
::: python/mozbuild/mozbuild/mach_commands.py:318
(Diff revision 1)
> + daemon = Daemon(self.config_environment)
> + return daemon.watch()
OK - so this isn't exactly a "daemon" because it never detaches. Instead, the process just sits around, waiting for events or termination.
That's fine. But it does mean that users have to opt in to this and start a foreground process. That's a bit annoying.
I'd be tempted to create `mach daemon start` and `mach daemon stop` to control a background daemon. It can kill itself after N seconds of inactivity so it doesn't "leak." We could also have it self-terminate when the objdir is deleted.
But I suppose this would be a follow-up.
Attachment #8890029 - Flags: review?(gps) → review-
Comment 8•7 years ago
mozreview-review
Comment on attachment 8890030 [details]
Bug 1384241 - Part 2: Push changes to WebSocket listeners.
https://reviewboard.mozilla.org/r/161084/#review166582
I don't know gevent, so this code is over my head. Fortunately, I know a few Mozillians who can help out when the time comes. I'd start with Tarek Ziade and Ryan Kelly.
I think the concept of starting a WebSocket server so Firefox can receive notifications about updated files is pretty rad. Opens up a lot of awesome potential workflows and features for hacking on Firefox within Firefox (which has always been a concept I've wanted to explore).
The websocket code should live outside a mach_commands.py file. We don't want to pollute mach_commands.py files. As we move to a model where we have daemons, I think we'll find that we regret a lot of the code being in mach_commands.py and not consumable as a library by non-mach processes/daemons.
It also feels like we're missing a generic primitive for subscribing Python callbacks to filesystem changes. We have 2 consumers in 2 patches. I imagine there will be more. How do we dispatch events to multiple consumers? (YAGNI may apply here.)
::: python/mozbuild/mozbuild/mach_commands.py:319
(Diff revision 1)
> + def watch(self, verbose=False):
> """Watch and re-build the source tree."""
> self._activate_virtualenv()
> self.virtualenv_manager.install_pip_package('pywatchman==1.3.0')
> + self.virtualenv_manager.install_pip_package('bottle==0.12.13')
> + self.virtualenv_manager.install_pip_package('gevent-websocket==0.10.1')
gevent requires compiled extensions and may have problems installing. However, they publish binary wheels, so we may get lucky.
::: python/mozbuild/mozbuild/mach_commands.py:350
(Diff revision 1)
> + except pywatchman.CommandError as ex:
> + print('watchman:', ex.msg, file=sys.stderr)
> + sys.exit(1)
> +
> + except pywatchman.SocketTimeout as ex:
> + print('watchman:', str(ex), file=sys.stderr)
> + sys.exit(2)
> +
> + except KeyboardInterrupt:
> + # Suppress ugly stack trace when user hits Ctrl-C.
> + sys.exit(3)
It would be really nice if the watchman details were abstracted away.
Attachment #8890030 - Flags: review?(gps) → review-
Assignee
Comment 9•7 years ago
mozreview-review
Comment on attachment 8890028 [details]
Bug 1384241 - Part 1: Add input to output mappings.
https://reviewboard.mozilla.org/r/161080/#review166590
::: python/mozbuild/mozpack/copier.py:183
(Diff revision 1)
> + for output, file in self:
> + output = mozpath.normpath(output)
> + for input in file.inputs():
> + input = mozpath.normpath(input)
> + tree[input].add(output)
> + return dict(tree.iteritems())
This is really converting from a `defaultdict` to a `dict`, but your suggestion seems correct. I'll do this.
::: python/mozbuild/mozpack/files.py:487
(Diff revision 1)
> if not dest.exists():
> errors.fatal("Required existing file doesn't exist: %s" %
> dest.path)
>
> + def inputs(self):
> + return iter(())
I agree, but I always avoid returning modifiable collections to consumers, and I don't want consumers to expect ordering. I'll consider dropping the iterator.
::: python/mozbuild/mozpack/files.py:506
(Diff revision 1)
> + pp = Preprocessor(defines=self.defines, marker=self.marker)
> + pp.setSilenceDirectiveWarnings(self.silence_missing_directive_warnings)
The structure of the code isn't really amenable to extracting a helper -- the dep-writing code is entangled with the consuming `copy()`. I'm going to leave this for now.
::: python/mozbuild/mozpack/files.py:510
(Diff revision 1)
> + def inputs(self):
> + pp = Preprocessor(defines=self.defines, marker=self.marker)
> + pp.setSilenceDirectiveWarnings(self.silence_missing_directive_warnings)
> +
> + with open(self.path, 'rU') as input:
> + with open(os.devnull, 'w') as output:
It wasn't clear to me if there was a no-op object already defined; a preliminary search didn't reveal one. Hence, `os.devnull`... which is basically that, provided by the OS.
Assignee
Comment 10•7 years ago
mozreview-review
Comment on attachment 8890029 [details]
Bug 1384241 - Part 2: Add |mach watch|: pywatchman based |mach build faster| daemon.
https://reviewboard.mozilla.org/r/161082/#review167080
Thanks for the great review! I'm excited to see you're excited!
::: commit-message-ef2c4:17
(Diff revision 1)
> +- watching subdirectories of topsrcdir. I particularly want to avoid
> + topobjdir-in-topsrcdir.
This VCS-aware debouncing is compelling, so I've switched to a single watch of topsrcdir.
::: python/mozbuild/mozbuild/faster_daemon.py:69
(Diff revision 1)
> + defines = dict((name, self.config_environment.defines[name])
> + for name in self.config_environment.defines
> + if name not in self.config_environment.non_global_defines)
Done as a Pre: patch.
::: python/mozbuild/mozbuild/faster_daemon.py:72
(Diff revision 1)
> + defines.update({
> + 'AB_CD': 'en-US',
> + 'BUILD_FASTER': 1,
> + })
> + defines.update({
> + 'BOOKMARKS_INCLUDE_DIR': mozpath.join(self.config_environment.topsrcdir,
I have done so, referencing the ranges I cargo-culted this from. It might be worth pushing these hacks up into the `acdefines()` helper I declared... but it wouldn't get them out of Faster Make's `rules.mk` :(
::: python/mozbuild/mozbuild/faster_daemon.py:74
(Diff revision 1)
> + defines = dict((name, self.config_environment.defines[name])
> + for name in self.config_environment.defines
> + if name not in self.config_environment.non_global_defines)
> + defines.update({
> + 'AB_CD': 'en-US',
> + 'BUILD_FASTER': 1,
Fixed.
::: python/mozbuild/mozbuild/faster_daemon.py:87
(Diff revision 1)
> + finder = FileFinder(mozpath.join(self.config_environment.topobjdir, 'faster'))
> + for path, f in finder.find('*.track'):
> + manifest = InstallManifest(fileobj=f.open())
You can see how I've interpreted this in the patch series I'm about to push.
::: python/mozbuild/mozbuild/faster_daemon.py:133
(Diff revision 1)
> +
> + return watches
> +
> + def subscribe_to_dir(self, name, dir_to_watch):
> + query = {
> + 'expression': ['true'],
Done.
::: python/mozbuild/mozbuild/faster_daemon.py:145
(Diff revision 1)
> + root = watch['watch']
> + if 'relative_path' in watch:
> + query['relative_root'] = watch['relative_path']
> +
> + # Get the initial clock value so that we only get updates.
> + query['since'] = self.client.query('clock', root)['clock']
Reading the documentation for `sync_timeout` at https://facebook.github.io/watchman/docs/cmd/query.html, I don't know what you intend. I don't think you'd want `sync_timeout` to be _larger_ than the default, which is 60s. Do you think `sync_timeout` should be 0, and we should accept inconsistent filesystem states? In both cases I don't think this approach is going to produce a good developer experience, but it might be more useful to at least push _some_ output changes.
I think I agree about `empty_on_fresh_instance`, so I'll add that now.
::: python/mozbuild/mozbuild/faster_daemon.py:166
(Diff revision 1)
> +
> + for dat in data:
> + files |= set([mozpath.join(self.config_environment.topsrcdir, name, f)
> + for f in dat.get('files', [])])
> +
> + files = set([f for f in files if os.path.isfile(f)])
I've used this. Good suggestion!
::: python/mozbuild/mozbuild/faster_daemon.py:172
(Diff revision 1)
> +
> + return files
> +
> + def incremental_copy(self, copier, force=False, verbose=True):
> + # Just like the 'repackage' target in browser/app/Makefile.in.
> + if 'cocoa' == self.config_environment.substs['MOZ_WIDGET_TOOLKIT']:
Well... I think we should either make Mac OS X really use `Nightly.app` during the build, or we should replace the existing functionality with a bit of Python to do it with manifests. But that's totally out of scope :)
::: python/mozbuild/mozbuild/faster_daemon.py:200
(Diff revision 1)
> + '''
> + Return an iterator of `FasterBuildChange` instances as inputs
> + to the faster build system change.
> + '''
> +
> + self.client = pywatchman.client()
The version of pywatchman in pip is 1.3.0, which doesn't support this flag. I think we can push this to follow-up, since it probably depends on solving this binary wheel problem.
::: python/mozbuild/mozbuild/faster_daemon.py:217
(Diff revision 1)
> + _watch_result = self.client.receive()
> +
> + changed = self.changed_files()
I cannot use the bser codec with the version of pywatchman that I can install with pip, so I'm not going to do this. However, the iteration over subs will be just a single sub (corresponding to $topsrcdir).
::: python/mozbuild/mozbuild/faster_daemon.py:226
(Diff revision 1)
> + continue
> +
> + result = FasterBuildChange()
> +
> + for change in changed:
> + change = mozpath.normpath(change)
I concur.
::: python/mozbuild/mozbuild/faster_daemon.py:234
(Diff revision 1)
> + if output not in result.output_to_inputs:
> + result.output_to_inputs[output] = set()
> + result.output_to_inputs[output].add(input)
I don't want to return a `defaultdict` 'cuz queries silently succeed with empty results. (I got bitten by this earlier.) Since I want to create a valid `FasterBuildChange`, I chose to use a regular dictionary and manually manage updates.
I'm going to keep this unless you feel very strongly.
::: python/mozbuild/mozbuild/mach_commands.py:315
(Diff revision 1)
> +
> + @Command('watch', category='build', description='Watch and re-build the tree.')
> + def watch(self):
> + """Watch and re-build the source tree."""
> + self._activate_virtualenv()
> + self.virtualenv_manager.install_pip_package('pywatchman==1.3.0')
You are correct. Worse, VS 2008 is missing `stdint.h`, which needs to be copied in from some github repo. (The instructions are in the Google Doc I circulated to you.)
I have no idea how to set up a pywatchman binary wheel, so I'll add try-block here but not try to address the larger problem. Eventually we can link to MDN docs around pywatchman installation on Windows.
::: python/mozbuild/mozbuild/mach_commands.py:318
(Diff revision 1)
> + daemon = Daemon(self.config_environment)
> + return daemon.watch()
The flow that seems most compelling is |mach run --watch|, which can start the WebSocket server _and_ set an environment variable or provide a command line argument telling the spawned Firefox instance to connect to the server. (Otherwise, you have a port negotiation problem to solve.)
For first landing, I'd really like to stick to a waiting process. It has many advantages:
- it's clear when it's running;
- it's clear where diagnostics end up;
- it's easier to make reliable on Windows.
So let's make detaching follow-up.
Comment hidden (mozreview-request)
Comment hidden (mozreview-request)
Comment hidden (mozreview-request)
Comment hidden (mozreview-request)
Comment hidden (mozreview-request)
Comment hidden (mozreview-request)
Comment hidden (mozreview-request)
Comment hidden (mozreview-request)
Comment hidden (mozreview-request)
Assignee
Updated•7 years ago
Attachment #8890028 - Attachment is obsolete: true
Comment 20•7 years ago
mozreview-review
Comment on attachment 8890668 [details]
Bug 1384241 - Review comment: Make faster make backend write one manifest.
https://reviewboard.mozilla.org/r/161846/#review167186
The changes here seem mostly reasonable.
But can you please squash these fixups into the appropriate commit? I'd like to review the final - not intermediate - state. If it's one big patch, fine - I can handle that.
It's getting late, so I probably won't review tonight. But if you have a squashed version in my inbox tomorrow morning, I'll get to it before EOD. I understand you are on the clock for finishing this :)
Comment hidden (mozreview-request)
Comment hidden (mozreview-request)
Comment hidden (mozreview-request)
Assignee
Updated•7 years ago
Attachment #8890029 - Attachment is obsolete: true
Attachment #8890029 - Flags: review?(gps)
Assignee
Updated•7 years ago
Attachment #8890663 - Attachment is obsolete: true
Attachment #8890663 - Flags: review?(gps)
Assignee
Updated•7 years ago
Attachment #8890664 - Attachment is obsolete: true
Attachment #8890664 - Flags: review?(gps)
Assignee
Updated•7 years ago
Attachment #8890665 - Attachment is obsolete: true
Attachment #8890665 - Flags: review?(gps)
Assignee
Updated•7 years ago
Attachment #8890666 - Attachment is obsolete: true
Attachment #8890666 - Flags: review?(gps)
Assignee
Updated•7 years ago
Attachment #8890668 - Attachment is obsolete: true
Attachment #8890668 - Flags: review?(gps)
Comment 24•7 years ago
Here's a patch to use alongside `./mach watch` for testing. To do so:
Terminal 1: ./mach watch
Terminal 2: ./mach run
Then you should be able to:
1) change a CSS file (like browser.css) and see the style update in the running browser immediately
2) reload an in-content page (e.g. about:preferences) to see markup/JS changes for that page
3) ctrl+alt+r / cmd+opt+r to restart the browser to pick up JSM / JS changes (anything that `./mach build faster` builds)
4) ctrl+alt+shift+r / cmd+opt+shift+r to 'reload' the active browser window. This should pick up browser.js / browser.xul changes. This one is known to be buggy, but it's useful for at least testing out ./mach watch changes.
Comment 25•7 years ago
mozreview-review
Comment on attachment 8890662 [details]
Bug 1384241 - Pre: Extract acdefines() helper function.
https://reviewboard.mozilla.org/r/161834/#review167564
Attachment #8890662 - Flags: review?(gps) → review+
Comment 26•7 years ago
mozreview-review
Comment on attachment 8890667 [details]
Bug 1384241 - Add |mach watch|: pywatchman based incremental |mach build faster| shell.
https://reviewboard.mozilla.org/r/161844/#review167584
This is looking much better.
I'm playing with this locally, and it works surprisingly well! As far as human perception is concerned, file changes are updated in the objdir instantaneously.
I did manage to make it crash by running `hg update` to an older changeset. Update removed a file and it barfed about an unknown file. We'll definitely want some improvements to handle that. But I think it can be deferred as a follow-up.
I suspect there's an opportunity to move more of this code into a build backend and to actually invoke a build process so this code doesn't need to concern itself with backend implementation details. Yes, it will add dozens of ms of overhead to invoke a Python process. But I think it will still be fast enough. That can be deferred to a follow-up though: the code here is pretty minimal. It's just that I'd prefer something closer to the build backend making "do X when Y changes" determinations.
::: python/mozbuild/mozbuild/backend/fastermake.py:158
(Diff revision 2)
>
> mk.add_statement('include $(TOPSRCDIR)/config/faster/rules.mk')
>
> + unified_manifest = InstallManifest()
> for base, install_manifest in self._install_manifests.iteritems():
> + if base.startswith('dist/bin'):
On Linux and Windows artifact builds, *every* manifest is for dist/bin. I wonder if we should limit this to artifact builds for now and assert() instead.
::: python/mozbuild/mozbuild/backend/fastermake.py:159
(Diff revision 2)
> + unified_manifest.add_entries_from(install_manifest,
> + base=mozpath.relpath(base, 'dist/bin'))
We know the string prefix. Since config.status run times can be annoying, it is probably worth optimizing the relpath into a string slice.
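For illustration, the slice-based equivalent (assuming every base starts with the `dist/bin` prefix):
```
prefix = 'dist/bin'
for base in ('dist/bin', 'dist/bin/browser/features'):
    rel = base[len(prefix):].lstrip('/')  # '' and 'browser/features'
```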
::: python/mozbuild/mozbuild/faster_daemon.py:57
(Diff revision 2)
> +
> +
> +class FasterBuildException(Exception):
> + def __init__(self, cause):
> + Exception.__init__(self)
> + self.cause = cause
AFAICT this is never used. I'm getting this exception when I run `mach watch` initially. But no details of the inner exception are printed!
::: python/mozbuild/mozbuild/faster_daemon.py:89
(Diff revision 2)
> + @mozbuild.util.memoized_property
> + def file_copier(self):
Please add a TODO in here to track needing to refresh the file copier when the underlying manifest changes. If e.g. a new file is added and we run `config.status`, the cached manifest will be out of date.
Out of curiosity, how long does it take to call this function? Does it need to be cached?
::: python/mozbuild/mozbuild/faster_daemon.py:101
(Diff revision 2)
> +
> + unified_manifest.populate_registry(file_copier, defines_override=self.defines)
> +
> + return file_copier
> +
> + def subscribe_to_topsrcdir(self, dir_to_watch):
Since `self.config_environment` exists, you don't need to pass the directory to watch. Although I do like functions that accept arguments instead of relying on state. But since `topsrcdir` is in the name, it should be argument-less.
::: python/mozbuild/mozbuild/faster_daemon.py:126
(Diff revision 2)
> + # Get the initial clock value so that we only get updates.
> + query['since'] = self.client.query('clock', root)['clock']
> +
> + return self.client.query('subscribe', root, 'topsrcdir', query)
You should probably log something here, as the first time a watch on the topsrcdir is established, watchman will need to walk the directory tree and this could take several seconds. It would be nice to have feedback that it is doing something.
::: python/mozbuild/mozbuild/faster_daemon.py:126
(Diff revision 2)
> + # Get the initial clock value so that we only get updates.
> + query['since'] = self.client.query('clock', root)['clock']
> +
> + return self.client.query('subscribe', root, 'topsrcdir', query)
I /think/ we can set an initial `since` value of `c:0:0` to get all changes since the watch was registered.
I'm not sure at which point we could hit a long pause querying watchman though. I'm guessing during `subscribe`. But you'll have to test it.
::: python/mozbuild/mozbuild/faster_daemon.py:179
(Diff revision 2)
> +
> + self.client = pywatchman.client()
> +
> + try:
> + # TODO: restrict these capabilities to the minimal set.
> + self.client.capabilityCheck(required=['term-dirname', 'cmd-watch-project', 'wildmatch'])
I think we also need `clock-sync-timeout` to use the `clock` command.
::: python/mozbuild/mozbuild/faster_daemon.py:190
(Diff revision 2)
> + if not outputs:
> + raise Exception("Refusing to watch input ({}) with no outputs".format(input))
> +
> + while True:
> + try:
> + _watch_result = self.client.receive()
I /think/ there may be a bug in pywatchman where it doesn't handle interrupted system calls. So we may get some wonky stacks with ^C. But I'd have to check. Not much we can do.
::: python/mozbuild/mozbuild/faster_daemon.py:220
(Diff revision 2)
> + except pywatchman.SocketTimeout as ex:
> + # Abstract away pywatchman errors.
> + raise FasterBuildException(e)
`ex` vs `e`.
::: python/mozbuild/mozbuild/faster_daemon.py:220
(Diff revision 2)
> +
> + except pywatchman.CommandError as e:
> + # Abstract away pywatchman errors.
> + raise FasterBuildException(e)
> +
> + except pywatchman.SocketTimeout as ex:
I'm hitting this exception locally. It occurs calling the `clock` command. You'll want to pass `sync_timeout` to the command. I'm guessing the value should be upwards of 30s so it works on Windows.
::: python/mozbuild/mozbuild/mach_commands.py:313
(Diff revision 2)
>
> @CommandProvider
> +class Watch(MachCommandBase):
> + """Interface to watch and re-build the tree."""
> +
> + @Command('watch', category='build', description='Watch and re-build the tree.')
This could be deferred to a follow-up, but there should be a `condition` for this `@Command` so the command is only available if there is a configured build and watchman is available. Search for `MachCommandConditions` in the repo.
::: python/mozbuild/mozbuild/mach_commands.py:319
(Diff revision 2)
> + def watch(self):
> + """Watch and re-build the source tree."""
> + self._activate_virtualenv()
> + try:
> + self.virtualenv_manager.install_pip_package('pywatchman==1.3.0')
> + except:
bareword `except` is evil because it swallows `KeyboardInterrupt` and `SystemExit`. Almost always use `except Exception` (which is still not great but less evil).
::: python/mozbuild/mozbuild/mach_commands.py:326
(Diff revision 2)
> + 'https://developer.mozilla.org/en-US/docs/TODO_DOCUMENTATION_URL')
> + return 1
> +
> + from mozbuild.faster_daemon import Daemon
> + daemon = Daemon(self.config_environment)
> + return daemon.watch()
Missing method :)
::: python/mozbuild/mozpack/manifests.py:334
(Diff revision 2)
> + def add_entries_from(self, other, base=''):
> + """
> + Copy data from another mozpack.copier.InstallManifest
> + instance, adding an optional base prefix to the destination.
> +
> + This allows merging two manifests into a single manifest, or
> + taking the tagged union of two manifests.
> + """
This is a useful function! However, it may be missing adjustment to `self._source_files`. See the `__ior__` implementation for this class, which is what we currently use to merge manifests.
You'll probably want to copy `__ior__`'s code here, modify to support a prefix/base, then have `__ior__` call into this method.
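A sketch of that refactoring, reusing the private attributes the quoted patch already touches (`_source_files`, `_dests`, `_add_entry`); this is illustrative, not the final code:
```
import mozpack.path as mozpath

def add_entries_from(self, other, base=''):
    # Mirror __ior__, but allow an optional destination prefix.
    self._source_files |= other._source_files
    for dest in sorted(other._dests):
        new_dest = mozpath.join(base, dest) if base else dest
        self._add_entry(new_dest, other._dests[dest])

def __ior__(self, other):
    self.add_entries_from(other)
    return self
```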
Attachment #8890667 - Flags: review?(gps) → review-
Comment 27•7 years ago
mozreview-review
Comment on attachment 8890030 [details]
Bug 1384241 - Part 2: Push changes to WebSocket listeners.
https://reviewboard.mozilla.org/r/161084/#review167604
If we land this, I'd appreciate getting review from someone who knows gevent because I sure don't!
I have heard a number of horror stories about gevent though. Echoing our chat earlier, I'm not opposed to the idea of requiring Python 3.5+ for this feature and changing this code to use `asyncio`. This will need an asyncio watchman client, however. https://github.com/facebook/watchman/pull/316 tracks that. Worst case, that may be a weekend project for me :)
Attachment #8890030 - Flags: review?(gps) → review-
Comment hidden (mozreview-request)
Comment hidden (mozreview-request)
Assignee
Updated•7 years ago
Attachment #8890030 - Attachment is obsolete: true
Comment 30•7 years ago
mozreview-review
Comment on attachment 8890667 [details]
Bug 1384241 - Add |mach watch|: pywatchman based incremental |mach build faster| shell.
https://reviewboard.mozilla.org/r/161844/#review168022
Aside from the issues I opened, this is good enough for a first implementation!
::: python/mozbuild/mozbuild/faster_daemon.py:119
(Diff revision 3)
> + ],
> + ],
> + ],
> + 'fields': ['name'],
> + }
> + watch = self.client.query('watch-project', dir_to_watch)
I randomly get a timeout on this line. Not sure why.
If I pass `timeout=5.0` in the client constructor, I make the timeout go away.
But increasing timeouts only wallpapers over the problem. If this is something that could take many seconds to complete, we need to retry or something.
Please increase the timeout in the client. And add a TODO to investigate this.
::: python/mozbuild/mozpack/manifests.py:344
(Diff revision 3)
> + self._source_files |= other._source_files
> +
> + for dest in sorted(other._dests):
> + if base:
> + new_dest = mozpath.join(base, dest)
> + self._add_entry(new_dest, other._dests[dest])
`new_dest` can be undefined.
Attachment #8890667 - Flags: review?(gps) → review+
Comment hidden (mozreview-request)
Assignee
Comment 32•7 years ago
mozreview-review
Comment on attachment 8890667 [details]
Bug 1384241 - Add |mach watch|: pywatchman based incremental |mach build faster| shell.
https://reviewboard.mozilla.org/r/161844/#review168006
Maybe these comments got lost? MozReview seems to be confused, but I'll post them just in case.
::: python/mozbuild/mozbuild/faster_daemon.py:89
(Diff revision 2)
> + @mozbuild.util.memoized_property
> + def file_copier(self):
It does non-trivial file system access -- the unified manifest is ~600kb on my Mac OS X system.
Really, this was cached since it was doing the unification of all the manifests into a copier, which was _also_ doing the preprocessing/dependency tree inversion. Since that's not the way anymore, this probably doesn't need to be cached.
But let's leave that to the TODO!
::: python/mozbuild/mozbuild/mach_commands.py:326
(Diff revision 2)
> + 'https://developer.mozilla.org/en-US/docs/TODO_DOCUMENTATION_URL')
> + return 1
> +
> + from mozbuild.faster_daemon import Daemon
> + daemon = Daemon(self.config_environment)
> + return daemon.watch()
Rebasing issue with part 2. I've reinstated this, so you can actually run it locally :)
Assignee
Comment 33•7 years ago
mozreview-review-reply
Comment on attachment 8890667 [details]
Bug 1384241 - Add |mach watch|: pywatchman based incremental |mach build faster| shell.
https://reviewboard.mozilla.org/r/161844/#review167584
> On Linux and Windows artifact builds, *every* manifest is for dist/bin. I wonder if we should limit this to artifact builds for now and assert() instead.
Yeah, this is a good approach. I've moved this into an `is_artifact_build` block and asserted.
> We know the string prefix. Since config.status run times can be annoying, it is probably worth optimizing the relpath into a string slice.
Done. I used `relpath` since it handles optional trailing slash (`dist/bin` vs `dist/bin/other`) cleanly.
> AFAICT this is never used. I'm getting this exception when I run `mach watch` initially. But no details of the inner exception are printed!
My intention is to abstract away the pywatchman details, while still allowing a consumer to see what happened underneath. Rather than baking the `cause` into the `repr`, I've added a `message` to the underlying exception. (n.b., I think Python 3 handles exception chaining in this manner natively.)
> Since `self.config_environment` exists, you don't need to pass the directory to watch. Although I do like functions that accept arguments instead of relying on state. But since `topsrcdir` is in the name, it should be argument-less.
Fair enough. I've partially reverted to the generic function previously expressed, and then removed the argument from the specific helper.
> You should probably log something here, as the first time a watch on the topsrcdir is established, watchman will need to walk the directory tree and this could take several seconds. It would be nice to have feedback that it is doing something.
Yeah, I had a TODO about the bad logging; I've replaced it with more verbose logging in the registration process.
> I /think/ we can set an initial `since` value of `c:0:0` to get all changes since the watch was registered.
>
> I'm not sure at which point we could hit a long pause querying watchman though. I'm guessing during `subscribe`. But you'll have to test it.
I have never seen long pauses, but that might be because I have `fsmonitor` enabled.
A quick search doesn't reveal any documentation about an initial `since` value for `query()`. I'm not really sure I _want_ to see all changes since the watch was registered; maybe it makes sense and maybe it doesn't. It's like running `mach build faster` for the user; maybe they want that, and maybe they don't. (I expect that when we have all the kinks worked out, they _will_ want that, and the daemon will really be keeping `topobjdir` in sync with `topsrcdir`.)
> I think we also need `clock-sync-timeout` to use the `clock` command.
Good catch! This was from https://github.com/facebook/watchman/blob/19aebfebb0b5b0b5174b3914a879370ffc5dac37/python/bin/watchman-wait#L137, which doesn't have that capability checked explicitly. Maybe it's implied with some other capability? In any case, I've added that capability to the list.
> I /think/ there may be a bug in pywatchman where it doesn't handle interrupted system calls. So we may get some wonky stacks with ^C. But I'd have to check. Not much we can do.
Yeah, https://github.com/facebook/watchman/blob/19aebfebb0b5b0b5174b3914a879370ffc5dac37/python/bin/watchman-wait#L217 is definitely trying to avoid such stacks. I've removed that, but I'll add it back to `mach_commands.py`.
> I'm hitting this exception locally. It occurs calling the `clock` command. You'll want to pass `sync_timeout` to the command. I'm guessing the value should be upwards of 30s so it works on Windows.
OK, I've added the `sync_timeout`. I don't see this on my Mac, but I'll test on my Windows laptop and hopefully everything will be okay.
> This could be deferred to a follow-up, but there should be a `condition` for this `@Command` so the command is only available if there is a configured build and watchman is available. Search for `MachCommandConditions` in the repo.
I've changed the category to 'post-build', but I don't see a solid pattern for verifying there's a built object directory. (Maybe `get_binary_path`?) I've made the implementation test for watchman and artifacts to give feedback rather than silently hiding it; that'll indirectly ensure there's a configured build at least.
Comment 34•7 years ago
Pushed by nalexander@mozilla.com:
https://hg.mozilla.org/integration/autoland/rev/28a461fb134e
Pre: Extract acdefines() helper function. r=gps
https://hg.mozilla.org/integration/autoland/rev/597ee9bd1097
Add |mach watch|: pywatchman based incremental |mach build faster| shell. r=gps
Comment 35•7 years ago
bugherder
https://hg.mozilla.org/mozilla-central/rev/28a461fb134e
https://hg.mozilla.org/mozilla-central/rev/597ee9bd1097
Status: ASSIGNED → RESOLVED
Closed: 7 years ago
status-firefox56: --- → fixed
Resolution: --- → FIXED
Target Milestone: --- → mozilla56
Updated•7 years ago
Product: Core → Firefox Build System