Lately a couple of tools were broken due to version conflicts between mozbase packages. I think all of them are externally hosted and depend on the official releases on PyPI (or the internal mirror). Here is the list of the ones I'm at least aware of:

* firefox-ui-tests
* mozdownload
* mozmill (hotfix-2.0 branch)
* talos
* maybe others too

This time the cause of all the trouble was the release of mozlog 3.0 via bug 1014760. It introduced API changes which were not backward compatible. As usual, the mozlog package got a major version bump to indicate the API change, and the mozbase packages using mozlog got version bumps too. But, and here it comes, due to the use of `>=` specifiers this only works for the most recent versions of those packages. If your tool is pinned to an older version, it will break.

Here is an example. Imagine you have released a tool which has a long-term support branch and uses a specific version of mozrunner; it's pinned, e.g., to `mozrunner == 5.35`. At the time you created that release, mozrunner 5.35 depended on several other mozbase packages, e.g. mozprofile 0.21. Creating a new virtual environment and installing the tool via `pip install my-cool-tool` works as expected.

Now imagine something breaks in that long-term support release and you have to fix it without bumping all the packages to their most recent versions. Sometimes you can't even do that without a major rewrite of the code. Creating a new virtual environment and installing the tool will now surprisingly fail. Why? Because mozrunner 5.35 lists its dependency on mozprofile as `>= 0.21`, and meanwhile version 0.23 exists. That new version uses mozlog 3.0, so mozlog 3.0 gets installed. It clearly works for mozprofile 0.23, but not for mozrunner 5.35, which has `mozlog >= 2.11` in its setup.py file. As a result your tool is broken and takes a lot of time to get fixed.
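To make that failure mode concrete, here is a minimal sketch of how pip's `>=` resolution ends up picking an incompatible mozlog. The `parse` and `satisfies` helpers are made up for illustration (real pip implements full PEP 440 semantics); the version numbers and specifiers are the ones from the scenario above.

```python
# Hypothetical sketch of how ">=" resolution breaks a pinned tool.
# Only handles plain "X.Y" versions; real resolvers do much more.

def parse(v):
    """Turn a dotted version string into a comparable tuple of ints."""
    return tuple(int(p) for p in v.split("."))

def satisfies(version, op, bound):
    """Check a single ">=", "<" or "==" constraint."""
    v, b = parse(version), parse(bound)
    return {"<": v < b, ">=": v >= b, "==": v == b}[op]

# mozrunner 5.35's setup.py says: mozprofile >= 0.21.
# pip therefore installs the newest mozprofile, 0.23,
# which in turn pulls in mozlog 3.0.
assert satisfies("0.23", ">=", "0.21")

# mozrunner 5.35 also says: mozlog >= 2.11. mozlog 3.0 matches
# the specifier, but its API is incompatible, so the tool breaks.
assert satisfies("3.0", ">=", "2.11")
```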
A workaround for now is to add all dependencies of sub-modules to your own setup.py file, which prevents the installation of newer versions. It's something I had to do for Mozmill 2.0.10, as you can see here: https://github.com/mozilla/mozmill/blob/hotfix-2.0/mozmill/setup.py#L17 It's far from ideal.

We recently had a discussion about dependency handling on the internal auto-tools list, and I think it's time to bring this topic up here so everyone else can chime in. The last email in that thread was from Gregory, who gave some helpful links which I want to post here:

https://python-packaging-user-guide.readthedocs.org/en/latest/index.html
https://python-packaging-user-guide.readthedocs.org/en/latest/requirements.html

To quote one sentence from that page which reflects the problem described above:

> It is not considered best practice to use install_requires to pin
> dependencies to specific versions, or to specify sub-dependencies
> (i.e. dependencies of your dependencies)

Because of that I think we should talk about how to set up dependencies between mozbase packages so that we also care about outside consumers which are not always on the latest versions of those packages. Below you can find my proposal; feel free to comment whether you agree or disagree. Maybe something better exists.

Given that you should not pin a specific version, you have to specify a version range. For the example tool above and mozrunner it could look like `mozrunner >= 5.0, < 6.0`. That ensures minor version updates will be picked up, but any API change which results in a major version bump will be excluded. The same should be done for every mozbase package, e.g.
mozcrash: http://mxr.mozilla.org/mozilla-central/source/testing/mozbase/mozcrash/setup.py

current dependency setting:
---------------------------
deps = ['mozfile >= 1.0', 'mozlog >= 3.0']

proposed dependency setting:
----------------------------
deps = ['mozfile >= 1.0, < 2.0', 'mozlog >= 3.0, < 4.0']

I'm not sure if this fixes all the problems we have, so please give your feedback.
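As a rough illustration of why the proposed upper bound helps, here is a small sketch. The `in_range` helper is made up (real pip applies full PEP 440 rules); it only shows that a range like `>= 2.11, < 3.0` would have rejected the incompatible mozlog 3.0 in the mozrunner example above.

```python
# Hypothetical sketch: a bounded range excludes incompatible major bumps.
# Only handles plain dotted versions; illustrative, not a real resolver.

def in_range(version, low, high):
    """True if low <= version < high, comparing dotted versions numerically."""
    parse = lambda v: tuple(int(x) for x in v.split("."))
    return parse(low) <= parse(version) < parse(high)

# Had mozrunner 5.35 declared "mozlog >= 2.11, < 3.0", pip would have
# kept a compatible 2.x release and refused the breaking 3.0 release.
assert in_range("2.11", "2.11", "3.0")
assert not in_range("3.0", "2.11", "3.0")
```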
+1, these issues are really a pain. The proposed solution of limiting the major version number seems good to me. :)
Yeah, that's kind of unfortunate, sorry about that. I like the idea of a range, but the challenge is: how do you know ahead of time what the maximum version is going to be? For example, in order for mozrunner 5.35 to have a "mozlog < 3.0" dependency, we would have had to commit to that at the time mozrunner 5.35 was uploaded (i.e. predict, years in advance, that 3.0 was going to be incompatible). We could assume that every future major version bump is going to be incompatible, and that might work well enough, but it probably won't always be true in practice.

The other problem is that there isn't really a standard way to deal with backwards compatibility. I bumped the major version there because it was a major change, but there are often minor version bumps that slightly break backwards compatibility too (e.g. modifying a function signature). In those cases a version range still won't help. I think a range might end up being just as much work (if not more) to maintain than the current situation. In other words, I don't think there is an easy solution to this problem :/.
I'm not opposed to using a range and assuming major bumps will be incompatible; it's probably better than nothing. I'm just pointing out that it won't entirely fix the problem.
It might be easier to use the compatible release specifier: https://www.python.org/dev/peps/pep-0440/#compatible-release and to limit breaking changes to major version numbers, as mentioned previously in the comments.
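For reference, here is a rough sketch of what the `~=` compatible-release operator from PEP 440 means. The `compatible_release` helper below is a hand-rolled approximation for illustration only: it ignores pre-releases, post-releases and epochs, which the real spec also covers.

```python
# Approximate PEP 440 "~=" semantics: "~= X.Y" is shorthand for
# ">= X.Y, < X+1.0"; only the last given component may vary upward.
# Illustrative only; ignores pre-releases, post-releases and epochs.

def compatible_release(version, spec):
    """True if `version` satisfies "~= spec". `spec` needs at least
    two components (PEP 440 requires this for "~=")."""
    v = [int(p) for p in version.split(".")]
    s = [int(p) for p in spec.split(".")]
    prefix = s[:-1]                            # all but the last component
    upper = prefix[:-1] + [prefix[-1] + 1]     # e.g. "1.0" -> upper bound 2.0
    return v >= s and v < upper

# "mozfile ~= 1.0" behaves like ">= 1.0, < 2.0":
assert compatible_release("1.5", "1.0")        # minor update: allowed
assert not compatible_release("2.0", "1.0")    # major bump: rejected
assert not compatible_release("0.9", "1.0")    # below the minimum
```

With that operator, something like `mozrunner ~= 5.0` would express the same intent as the `>= 5.0, < 6.0` range proposed above, in a single specifier.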