We have to update some existing packages and add new ones so that we can install packages via pip.
Actually we don't need anything for Mozmill here, given that we get a full environment with all tools pre-installed. For TPS the following packages are involved:

TPS: mozversion, mozdevice
TPS-CI: fxa-python-client, cryptography, cffi, pycparser, six, PyBrowserID, PyHawk

I see problems. Even when the six package is available, fxa-python-client cannot find this dependency. This might be related to https://github.com/mozilla/fxa-python-client/pull/13. I will have to check that.

To be able to compile cffi, the python-dev and libffi-dev packages need to be installed via apt. As far as I can see, no other module or package adds this dependency so far. Where is the best place for it?

All of the above adds a lot of dependencies on the PyPI side. I wonder if we should instead create a pre-configured environment for TPS as well, so we only have to download it from e.g. http://www.mozqa.com. Dustin, what do you think?
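To narrow down the "six is available but fxa-python-client cannot find it" problem, a small probe can check which of the required packages actually resolve inside the active environment. This is a hypothetical helper, not part of create_venv.py; the import names for some PyPI packages are guesses (PyPI names and import names often differ):

```python
import importlib.util

# Package lists from the comment above. Import names marked as guesses
# may differ from the real modules these PyPI packages install.
TPS_PACKAGES = ["mozversion", "mozdevice"]
TPS_CI_PACKAGES = ["cryptography", "cffi", "pycparser", "six",
                   "browserid",  # guess for PyBrowserID
                   "hawk"]       # guess for PyHawk

def missing_packages(names):
    """Return the subset of import names that cannot be resolved
    in the current environment."""
    return [n for n in names if importlib.util.find_spec(n) is None]
```

Running `missing_packages(TPS_CI_PACKAGES)` inside the virtualenv would show whether six is genuinely invisible to the interpreter or whether fxa-python-client's own dependency metadata is at fault.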
Really, use whichever option is more maintainable for you. I don't really know what a "pre-configured environment" is in this case, but if that makes the most sense for your use cases, and matches what others would be using, then that's a good choice here.

Adding Python packages is no problem. The python::virtualenv class can install individual packages regardless of requirements, so you could install six (and the rest) that way, perhaps avoiding the issue you've seen.

As for python-dev, that may not be necessary if you're using the /tools Python. Adding a package::libffi that installs libffi and libffi-dev is not a problem.
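The "install individual packages regardless of requirements" approach that python::virtualenv takes on the Puppet side can be sketched in plain Python with the stdlib venv module. This is a hypothetical helper, assuming a POSIX layout and a working ensurepip, not the actual Puppet implementation:

```python
import os
import subprocess
import venv

def create_venv_with_packages(path, packages):
    """Create a virtualenv at `path` and pip-install each package
    individually, rather than from a single requirements file."""
    venv.EnvBuilder(with_pip=True).create(path)
    bin_dir = "Scripts" if os.name == "nt" else "bin"
    pip = os.path.join(path, bin_dir, "pip")
    for pkg in packages:
        # Each package is installed on its own, so one bad dependency
        # resolution does not block the others.
        subprocess.check_call([pip, "install", pkg])
    return pip
```

Installing packages one by one also makes it easier to see which specific package (e.g. fxa-python-client) fails to resolve six.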
(In reply to Dustin J. Mitchell [:dustin] from comment #2)
> Really, use whichever option is more maintainable for you. I don't really
> know what a "pre-configured environment" is in this case, but if that makes
> the most sense for your use cases, and matches what others would be using,
> then that's a good choice here.

That means we create the virtual environment on another machine and install all the necessary packages there. Compilation also takes place there, so no user of this environment would have to install any compiler on his/her machine. It would just be a zip archive to download.

> Adding Python packages is no problem. The python::virtualenv class can
> install individual packages regardless of requirements, so you could install
> six (and the rest) that way, perhaps avoiding the issue you've seen.

Right now the installation of packages is done by http://mxr.mozilla.org/mozilla-central/source/testing/tps/create_venv.py. We have to extend the list of packages for TPS-CI with e.g. fxa-python-client: https://github.com/mozilla/coversheet/pull/34/files#diff-0da473e708e67a5fcab047da8e8a8c1bR70

This will most likely not fix the problem I'm currently facing. That one looks different and may be somewhat related to how we activate the virtual environment from within a Python script. That way, ~/.pip/pip.conf doesn't seem to be taken into account, which is strange.

> As for Python-dev, that may not be necessary if you're using the /tools
> Python. Adding a package::libffi that installs libffi and libffi-dev is not
> a problem.

That version is still on 2.7.3, but we need at least 2.7.6, so we would have to upgrade. We would probably have to do this for OS X anyway.
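Activating a virtualenv from within an already-running Python script roughly means adjusting PATH, VIRTUAL_ENV, and sys.path by hand; virtualenvs also ship an activate_this.py that does this more carefully. The sketch below is a hypothetical, simplified emulation of that step (paths assume a POSIX layout), useful for seeing what state the activation actually changes, and what it does not, e.g. it never touches ~/.pip/pip.conf handling:

```python
import os
import sys

def activate_venv(venv_dir):
    """Roughly emulate `bin/activate` from inside a running process:
    point VIRTUAL_ENV and PATH at the venv and prepend its
    site-packages to sys.path. Returns the site-packages path."""
    bin_dir = os.path.join(venv_dir, "bin")
    os.environ["VIRTUAL_ENV"] = venv_dir
    os.environ["PATH"] = bin_dir + os.pathsep + os.environ.get("PATH", "")
    site_packages = os.path.join(
        venv_dir, "lib",
        "python%d.%d" % sys.version_info[:2], "site-packages")
    sys.path.insert(0, site_packages)
    return site_packages
```

Note that nothing here reads pip's configuration; pip only consults ~/.pip/pip.conf when pip itself runs, which may explain why in-process activation behaves differently from activating in a shell before invoking pip.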
The Python upgrade shouldn't be too difficult (just a version number change and recompile). We could make the Python version a config variable, too. It does sound like a pre-configured environment will be the best choice.
Sounds perfect to me, so I'm closing out this bug. I will file an issue on the to-be-created tps repository.