Bug 1741960 Comment 5 Edit History
> I believe this is not about globally changing shell env but just for mach's internal usage. Opting out autoupdate sounds good, though.

Exactly, this is correct.

* The pieces of MozillaBuild would not be affected by Mach when it fetches and internally uses the tools it needs
* Just like for downstream Linux distros, it'd be possible to manually manage build dependencies

-----

> To that end, be really really aware of how much auto-updating would really buy our devs

~I don't understand this sentence, I think "buy" is a typo - I don't know which word it's replacing, though :)~ Ah, re-reading this, I see now that the interpretation is: "be aware of how much value would be provided to devs by implementing auto-updating".

> Right now I update mozilla-build about yearly, and it barely takes minutes to download and install a new version. Let's be realistic about how much time this would save, if this were more automatic: https://xkcd.com/1205/ :)

Though this saves the time developers spend _downloading and updating_ MozillaBuild, keep in mind that this effort is implicitly allocated elsewhere:

* There's additional lag when the build team works to leverage features/performance/bugfixes in new dependency versions: rather than "just bumping the dep", we have to release a new MozillaBuild, which then takes time to propagate to devs. We _do_ still need to support the "old dependency" versions, but the "general case" of developers doesn't benefit from the new version until far later.
* More resources have to be committed by the build team when supporting a dev to resolve a local problem. Being able to wave away a whole class of potential failures because dependency versions are pinned is a major benefit.
* Considering the lack of resources on the tools team right now, the hours poured into support seriously cut back the amount of improvements we can make.
> Also, in particular mach has had some issues recently, such as handling of git-worktrees with python virtual envs. Let's avoid situations where devs avoid fetching central because of breakage.

I'm assuming you're referring to [this bug](https://bugzilla.mozilla.org/show_bug.cgi?id=1739067). This is tough, and there are only a few ways to avoid the issues here:

1. Don't perform improvements that carry risk of regression: this leaves us in a state where the faults of the build system restrict us from making serious improvements. If we aren't able to address structural flaws, then we're going to have more critical problems in the future.
2. Just don't write bugs 😉: if the build team were in the state it was three years ago, this situation would be less likely, because the amount of experience around potential regressions was so much higher. However, we lost the majority of our "heavy hitters" in 2020, and my foresight isn't (yet?) as strong as theirs was.

> The more things that we can start "assuming" because we can "just auto-download tools" generally risks fragility

I'd flip this around and say that `cargo` is a fantastic example of a tool that "auto-downloads" and *auto-builds* the pieces you need, and it generally "just works". If you had to manually compile each crate you depended on and manually wire it together, it would be far more error-prone, especially at an intermediate level of software experience.

> and slowly locks out workflows not 100% supported by the system. (which is bad for dev productivity)

Note that due to downstream constraints such as Linux distros, we will always need to support non-auto-bootstrapped tools. The goal of this bug is to improve the experience of the "general" developer, but we won't be closing off the "non-auto-bootstrap" use case anytime soon.

> The bootstrapping process already causes over-aggressive downloading, making it harder to use in bandwidth-constrained (tethering!) or offline (planes!) scenarios.
True, there are probably efficiency improvements that can be made. I would say that in recent years there's been a more "download-happy" consensus, especially with the advent of "pinned/locked" dependencies. However, I think this is a great thing: removing a whole class of "dependency mismatch" bugs is a huge boon to stability and support costs. (Of course, this still refers to the "general case" - in the separate use case of "build using the manually-set-up dependencies I have on my machine", the situation is essentially unchanged.)

> I like the msys2 package idea, though!

💪

> But frankly I think MozillaBuild is in an amazing place right now, and I'd definitely prefer no-change to change here.

This might be the origin of where our stances differ. From my perspective, MozillaBuild is in drastic need of improvement:

* `./mach try fuzzy` breaks copy-paste until you restart the terminal
* `CTRL-C` sometimes kills the wrong process and breaks output
* The terminal doesn't always close when you `exit`
* `hg` doesn't work interactively (e.g.: `hg histedit`)
* Colours don't work consistently (`./mach build` warnings, etc)
* `python3 -m venv` doesn't work, so you can't use `pipx`, etc
* Packages inside of MozillaBuild are out-of-date (e.g. [Python](https://bugs.python.org/issue37380)) and can only be updated by shipping a new MozillaBuild release
* <more that I don't know off the top of my head>

To me, the costs of these issues dramatically outweigh the "risk of change" when we endeavour to fix them.
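As a rough illustration of the "pinned dependency versions" point above, here's a minimal Python sketch of how mach-style tooling might decide when to fetch its own copy of a tool. This is purely hypothetical: the function, the pin table, and the version numbers are all invented for this example and are not actual mach code or real MozillaBuild pins.

```python
from typing import Optional

# Hypothetical pin table - these tools and versions are invented for
# illustration, not real mach/MozillaBuild values.
PINNED_VERSIONS = {
    "hg": "6.1.1",
    "python3": "3.10.4",
}

def needs_fetch(tool: str, installed_version: Optional[str]) -> bool:
    """Return True when the pinned copy of `tool` should be downloaded."""
    pinned = PINNED_VERSIONS.get(tool)
    if pinned is None:
        # The tool isn't managed by the pin list; leave the system copy alone.
        return False
    # A missing or mismatched local copy means we fetch the pinned version,
    # so every developer ends up on exactly the same dependency.
    return installed_version != pinned
```

With a scheme like this, "works on my machine" divergence is confined to the explicitly unmanaged tools - which is exactly the class of support cost that pinning is argued to remove.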