We're already using pylibmc on Heroku (on the recommendation of the MemCachier docs), and searches comparing the two seem to marginally favour pylibmc. pylibmc appears to be more actively maintained, and supports features like compression and better exception reporting: http://sendapatch.se/projects/pylibmc/misc.html#differences-from-python-memcached In addition, in bug 1181587 I found that python-memcached v1.54 had a significant performance regression (roughly four times slower) compared to v1.53, which hadn't even been noticed in the four months since release. python-memcached doesn't even have a benchmark script checked into its repo, unlike pylibmc. As such, I think we should just switch to pylibmc and be done with it.
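Since neither library ships an agreed-upon benchmark, a regression like the v1.53→v1.54 one above can be caught with a tiny timing harness. This is a minimal sketch (the `DictClient` stub and key/payload choices are mine, not from either project); swap the stub for a real `pylibmc.Client` or `memcache.Client` pointed at a local server to compare the two libraries:

```python
import timeit

def bench_cache(client, n=10_000, payload=b"x" * 512):
    """Time n set/get round-trips against any memcached-style client."""
    def run():
        client.set("bench-key", payload)
        client.get("bench-key")
    return timeit.timeit(run, number=n)

class DictClient:
    """In-memory stand-in so the harness runs without a memcached server."""
    def __init__(self):
        self._store = {}
    def set(self, key, value):
        self._store[key] = value
    def get(self, key):
        return self._store.get(key)

if __name__ == "__main__":
    # Replace DictClient() with e.g. pylibmc.Client(["127.0.0.1"]) or
    # memcache.Client(["127.0.0.1:11211"]) to benchmark the real libraries.
    print(f"{bench_cache(DictClient()):.3f}s for 10k set/get round-trips")
```

Running the same harness against both clients, before and after a version bump, would have made the four-times slowdown obvious immediately.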
Moving to Heroku will mean we switch to pylibmc anyway, just due to the way Heroku apps are set up. Let's do so explicitly before we transition, so any side effects are easier to spot, and so there's one fewer stack change as part of the main move to Heroku.
Assignee: nobody → emorley
The last few days I've been digging into django-heroku-memcacheify, django-pylibmc, pylibmc and Django's native pylibmc backend, since:
1) the way django-pylibmc handles environment variables makes it harder than it needs to be for us to support prod, stage, Heroku, Vagrant and Travis simultaneously
2) it's absurd that we have to use the first two packages in that list just to add some basic handling on top of what Django mostly already supports (its native pylibmc backend just doesn't support binary mode or username/password auth)
I've opened a PR against django-pylibmc to remove some cruft: https://github.com/django-pylibmc/django-pylibmc/pull/36
However, in doing so I discovered a bug in the timeout handling when using binary mode, for which I've filed: https://github.com/lericson/pylibmc/issues/202
I've also looked into what it would take to upstream some of django-pylibmc, so we could stop using it entirely: https://github.com/django-pylibmc/django-pylibmc/issues/37
But for now I think we'll have to stick with it, so I'll open a PR soon for (1) above, to make the environment variable handling easier to override.
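The kind of override I have in mind for (1) is to derive the whole CACHES entry from environment variables, so the five environments differ only in their environment, not in settings code. A rough sketch, assuming the django-pylibmc backend; the `MEMCACHED_*` variable names here are hypothetical placeholders, not the names django-pylibmc currently reads:

```python
import os

def memcached_settings():
    """Build a Django CACHES dict from the environment.

    MEMCACHED_LOCATION, MEMCACHED_USERNAME and MEMCACHED_PASSWORD are
    hypothetical variable names used for illustration.
    """
    servers = os.environ.get("MEMCACHED_LOCATION", "127.0.0.1:11211")
    caches = {
        "default": {
            "BACKEND": "django_pylibmc.memcached.PyLibMCCache",
            "LOCATION": servers.split(","),
            "OPTIONS": {"binary": True, "tcp_nodelay": True},
        }
    }
    # Heroku's memcache add-ons require SASL auth; locally these are unset,
    # so Vagrant/Travis get an anonymous connection with the same backend.
    username = os.environ.get("MEMCACHED_USERNAME")
    password = os.environ.get("MEMCACHED_PASSWORD")
    if username and password:
        caches["default"]["USERNAME"] = username
        caches["default"]["PASSWORD"] = password
    return caches
```

With something like this, prod/stage/Heroku/Vagrant/Travis would all share one settings path and only the environment would vary.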
Created attachment 8716368 [details] [review] [treeherder] mozilla:switch-to-pylibmc > mozilla:master
Attachment #8716368 - Flags: review?(cdawson) → review+
Commit pushed to master at https://github.com/mozilla/treeherder
https://github.com/mozilla/treeherder/commit/1dc1f7fee59074a55866d4ad0e2bcde83b494972
Bug 1182043 - Use pylibmc instead of python-memcached

We're already using pylibmc on Heroku, since we need its SASL authentication support. However we were still using python-memcached on Vagrant, Travis, stage & prod. To reduce the number of simultaneous changes when we migrate to Heroku, and to ensure that at that point we're testing what we ship, this switches us to pylibmc pre-emptively for all environments.

Django does have a native pylibmc backend [1], however it doesn't support SASL authentication, so we have to use the custom django-pylibmc backend [2] instead, which we choose to use everywhere (even though only Heroku's memcache instances require auth) for consistency.

Installing pylibmc requires libmemcached-dev, which is installed by default on Travis, has just been installed on stage/prod (bug 1243767), and as of this change is now installed in the Vagrant environment too.

The comment mentioning that pylibmc must be present in the root requirements.txt file no longer applies, since the Python buildpack now uses pip-grep [3] (rather than a regex) to determine whether it is in the requirements files, and so handles included requirements files too. Example: https://emorley.pastebin.mozilla.org/8858007

I'll be checking for any changes in performance when this is deployed, however if anything I expect it to be faster, since it's not pure Python.

[1] https://github.com/django/django/blob/1.8.7/django/core/cache/backends/memcached.py#L171
[2] https://github.com/django-pylibmc/django-pylibmc/blob/master/django_pylibmc/memcached.py
[3] https://github.com/heroku/heroku-buildpack-python/blob/v75/bin/steps/pylibmc#L22
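The difference between the old regex check and pip-grep is that only the latter follows `-r` includes, which is why pylibmc no longer needs to sit in the root requirements.txt. A minimal sketch of that idea (my own illustration, not the buildpack's actual implementation):

```python
from pathlib import Path

def requirement_present(req_file, package):
    """Check whether `package` is named in a requirements file or in any
    file it pulls in via `-r` includes, the way pip-grep does (a naive
    regex over the root file alone would miss included files)."""
    req_file = Path(req_file)
    for raw in req_file.read_text().splitlines():
        line = raw.strip()
        if line.startswith("-r "):
            # Recurse into the included requirements file.
            if requirement_present(req_file.parent / line[3:].strip(), package):
                return True
        elif line and not line.startswith("#"):
            # Compare the distribution name, ignoring any version pin.
            name = line.split("==")[0].split(">=")[0].strip().lower()
            if name == package.lower():
                return True
    return False
```

So a root requirements.txt containing only `-r common.txt` is now enough, as long as pylibmc appears somewhere down the include chain.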
Status: NEW → RESOLVED
Last Resolved: 2 years ago
Resolution: --- → FIXED