Mako wheels have computed install_requires

Issue #249 open
Robert Robert
created an issue

Terrible title, sorry.

So Mako computes a value for install_requires in setup.py. This results in the wheel that's built being Python-version specific - concretely, if you build a wheel using Python 3.2 it won't depend on MarkupSafe, but on 3.3 or 2.6/2.7 it will.

This shows up when wheels are reused across Python versions (one way this happens is via the pip wheel cache): if Python 3.2 is used to install Mako first, then when an install is done on 3.3, MarkupSafe isn't installed. Conversely, if Python 3.3 is used to do the first install, then on 3.2 MarkupSafe will be installed even though it's (presumably) incompatible.

A similar issue applies to the argparse conditional: if the wheel is built on 2.6, it will drag in argparse when installed on 2.7/3.x.

Your setup.cfg has universal=1 so it looks like you intend to build wheels suitable for multiple Python versions, but to do so you cannot use computed requirements - you must instead use conditional requirements.

Attached is a patch to do this.
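
For readers following along: the patch swaps Python-level conditionals for setuptools "environment marker" extras. A minimal sketch of the idea (not the attached patch itself; the marker strings mirror the ones used later in this thread):

```python
# Sketch: dependencies expressed as environment markers instead of values
# computed at build time. The ":marker" keys in extras_require are evaluated
# at *install* time, so one wheel stays correct on every Python version.
setup_kwargs = dict(
    name='Mako',
    install_requires=[],  # nothing unconditional or computed
    extras_require={
        ":python_version!='3.2'": ['MarkupSafe>=0.9.2'],
        ":python_version=='2.6'": ['argparse'],
    },
)
# A real setup.py would then call: setup(**setup_kwargs)
```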

Unfortunately, sufficiently old versions of pip and setuptools don't support conditional requirements. I don't know the minimum versions that do, but pip 1.5.6 (the default on my Trusty install) does - the following is from a wheel built with the attached patch:

$ pip install ./Mako-1.0.1-py2.py3-none-any.whl 
Unpacking ./Mako-1.0.1-py2.py3-none-any.whl
Downloading/unpacking MarkupSafe>=0.9.2 (from Mako==1.0.1)
  Downloading MarkupSafe-0.23.tar.gz
  Running setup.py (path:/home/robertc/.virtualenvs/t/build/MarkupSafe/setup.py) egg_info for package MarkupSafe

Installing collected packages: Mako, MarkupSafe
  Running install for MarkupSafe

    building 'markupsafe._speedups' extension
    x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fno-strict-aliasing -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/include/python2.7 -c markupsafe/_speedups.c -o build/temp.linux-x86_64-2.7/markupsafe/_speedups.o
    x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wl,-Bsymbolic-functions -Wl,-z,relro -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security build/temp.linux-x86_64-2.7/markupsafe/_speedups.o -o build/lib.linux-x86_64-2.7/markupsafe/_speedups.so
Successfully installed Mako MarkupSafe
Cleaning up...

Comments (41)

  1. Robert Robert reporter

    Removing universal won't fix the issue because wheel defaults to building a wheel that can be used on all minor versions of the same python. E.g. removing universal fixes py 2.7 after 3.2, but not py 3.3 after 3.2.

    Using a strong tag like --python-tag cp26 when building the wheel would isolate everything - but I don't know how to do that automatically from within setup.py yet.

  2. Michael Merickel

    I think the correct solution is for you to duplicate the install_requires into the [metadata] section of setup.cfg so that the wheel dist-info is built with conditional dependencies for py32.

    [wheel]
    universal = 1

    [metadata]
    requires-dist =
        argparse; python_version == '2.6'
        MarkupSafe; python_version != '3.2'
  3. Michael Bayer repo owner

    Removing universal won't fix the issue because wheel defaults to building a wheel that can be used on all minor versions of the same python. E.g. removing universal fixes py 2.7 after 3.2, but not py 3.3 after 3.2.

    This makes no sense. Removing the [wheel] section turns Mako into just a plain old setup.py-based package like any of ten thousand other packages on pypi, many of which contain conditionals in their setup.py. What is the solution for all of these packages? How is psycopg2 going to work? psycopg2's setup.py has conditionals in it, it builds differently on every platform, there's nothing magic in its setup.cfg, no weird [metadata] section or anything like that. Is there a bug in psycopg2, and if not, why not? I have no idea what --python-tag is, and just today I tweeted how incredibly burdensome the constant stream of changes in Python packaging tools like pbr and pip - and related packages none of us understand, like cffi - is making all of us miserable.

  4. Donald Stufft

    Wheel correctly handles compiled software because it can detect that it's compiled and thus makes a platform-specific wheel, so psycopg2 works fine (though I recognize that psycopg2 was just an example). The real problem comes not from compiled projects, but pure Python ones that use conditional logic in their setup.py, which is opaque to pip/setuptools/wheel. Mako is an example of this: it dynamically adjusts its install_requires via Python code in the setup.py based on the Python version. Since pip/setuptools/wheel have no way to know that Mako has done this, they fall back to their defaults for pure Python wheels, which is saying "this wheel is good for all of Python 2.x" or "this wheel is good for all of Python 3.x" depending on whether you're running under Python 2 or Python 3. The universal flag simply modifies that slightly to combine them, so that it says "this wheel is good for Python 2.x AND Python 3.x".

    Prior to pip 7.0, if a project's setup.py produced a wheel whose filename indicated it supported a broader range of environments than it actually did, it would be mostly OK as long as they didn't publish a wheel to PyPI. The only time a wheel would exist for them is if someone created one for their own uses (such as a manually managed wheel cache), and most people probably would not report issues to the project for failures to build a correct wheel in that case. However, since pip 7.0, if we cannot locate a wheel when we search for files to install, instead of directly installing the sdist we create a wheel from that project and then (unless the user has disabled it) cache that wheel locally and reuse it in the future anytime the file-location routine finds that same sdist. This respects the compatibility tags in the wheel filename, so as long as projects generate correct wheels, everything works fine. However, if a project creates a wheel whose filename indicates a broader range of environments than it actually supports, then pip will reuse that wheel in scenarios where it's not appropriate - in the case of pure Python wheels, defaulting to either all of Python 2 or all of Python 3 based on the current version of Python.

    So, the question then becomes how do we handle this?

    1. One way is for projects to ensure that their setup.py doesn't generate an overly broad wheel file, typically by using environment markers instead of manually coding conditionals into the setup.py. The obvious downside to this way is that it requires authors to do work to catch up; however, it is a better overall solution once they've actually done that.
    2. Another way is for projects to simply break building wheels from their projects and instead raise an error saying they do not support installation by wheels. Ideally this would be a short-term patch that they would replace with the first option once they've had more time, since disabling wheels means their users will get slower installs and possibly, in the future, will become uninstallable altogether.
    3. The final way is for pip to forcibly override the compatibility tags in the wheel that the project's setup.py produces and force them to be narrower than they may actually need to be. This has the downside that the wheel cache becomes less efficient than it could be; however, it will use an incorrect wheel in far fewer situations.

    For Mako, I would suggest the first option, which can be implemented either by duplicating your install_requires into setup.cfg with the conditionals indicated by environment markers (unlike what @Robert Robert said, this does not require distutils2; the wheel project will use these values instead of install_requires if they exist), OR by embedding the environment markers directly into the setup.py (the downside here is that this requires users installing Mako to have a minimum version of pip and setuptools installed; the upside is less duplication). This would solve the problem with Mako with the implicit wheel cache and with people generating their own wheels, and make it possible for you to publish wheels on PyPI easily for this project. The patch that @Robert Robert wrote implements the second of these two methods, and if you applied it you would be able to add back the universal flag and create a single wheel that worked for both Python 2 and Python 3, but it will require new-ish setuptools and pip. If that isn't acceptable to you, but some duplication in listing your dependencies would be, I can send a patch for the first option as well.

    There will be discussion over on pip about the third option, but I think that we should probably force wheels to be more specific than they actually might be, since we do not know whether a project is emitting correct wheels or not (and given how many legacy projects there are, there may be more than a few that don't). This would solve Mako's problem in the specific case of the implicit pip wheel cache, but not for any other cases where people might be generating a wheel.

    I fully understand that the changes in the ecosystem have made some people's lives harder, and I wish there were a better way forward than to try and progress us towards a better eventual system with some short-term pain. There has been a lot of technical debt added to the entire ecosystem to deal with the shortcomings of the system, and sooner or later all technical debt has to be paid down; that's what we're doing now. The alternative is to never progress and to simply deal with a subpar system forever. I hope that as things progress, the pain will end up being worth it as we start to crawl out of the technical debt hole and things start working more consistently and sanely for all involved. I do, however, recognize this can be painful, and I'm sorry for that.

  5. Michael Bayer repo owner

    OK, thanks for that explanation. As far as Mako's conditionals go, there are only two. One is the thing where Armin grumpily didn't want to support MarkupSafe on Python 3.2 and lower. At this point I am fine putting out a new Mako that just doesn't support Python 3.2; we're almost at 3.5 now and nobody uses 3.2.

    As far as the argparse thing, I have that same conditional in Alembic also. This is because I like to use argparse, which comes included in Python 3 but not in Python 2. In this case, if I just put "argparse" in the requirements, last time I looked into this it downloads and installs argparse even on Py3k where it's already there. But this seems like something that can be conditional based on Py2k/Py3k, which from what I'm reading suggests that at that level there's no issue, because non-universal wheels will build separately for py2k/py3k.

    So can we propose the most cross-compatible patch possible, given that we can drop 3.0-3.2 and start at Python 3.3? And whatever it is I'm supposed to do in order to have argparse available without the double-install on Py3k, we can do that. But obviously Mako is a "long tail" library that's installed in all kinds of crufty environments, so I would hope that it can still install fine with an older pip or setuptools.

    I'd also imagine that even though psycopg2 has the magic "we're compiled!" flag that changes everything, there's still a ton of projects with conditionals in their setup.py. That putting a conditional in a setup.py is now illegal is a big deal; it's fine for Mako to be an early test case for this, but I think the repercussions of this need to be carefully worked out. I know about the pip wheel cache and I think it is great, as I watch build environments do a better and better job of not double-downloading/installing things these days, but I sort of thought that without "universal", it would at least build wheels for specific Python versions (e.g. 2.6, 2.7, etc).

  6. Michael Merickel

    argparse is in the stdlib from 2.7+ afaik, so if you don't want a double install there you still need to do the [metadata] solution I described above. Alternatively, if the issue were as clear-cut as 2 vs 3, you could leave out that section and just stop tagging Mako as a universal wheel.

  7. Robert Robert reporter

    [aside] I did not realise that wheel consulted static metadata; that makes the plan for static dependencies more awkward; c'est la vie.

    Anyhow - yes, the churn is hard, but this specific problem has existed since the wheel project was created in late 2012: it's not a new thing now. I'm reporting it now because we just noticed.

    The most compatible version should be the metadata-based one - it duplicates your requirements, so it will only affect folk using wheels, and that implies either opting in or recent versions of tools, either of which should be fine.

  8. Robert Robert reporter

    Yes, you'll see this: UserWarning: setup.cfg requirements overwrite values from setup.py

    from bdist_wheel. You probably want to put a comment in both setup.cfg and setup.py explaining that they should be kept in sync.

  9. Donald Stufft

    @Michael Bayer Yes.

    The important thing to remember is that the [metadata] section will completely replace install_requires when the wheel is built. So you need to make sure that you list all your dependencies in both locations. It appears Mako is a single-source 2 and 3 project (so no 2to3 or other thing that would make anything other than dependencies differ on 2.x vs 3.x), so once you add that, you can add the universal flag back and get a single wheel for 2 and 3 that will conditionally install dependencies based on what is in setup.cfg. Just to be clear though, if one of your other projects conditionally changes what is installed based on the Python version (like if you have a py2 and a py3 version of a file and your setup.py changes which one gets installed), then you can still use the setup.cfg dependencies, but you won't want to use the universal flag.

    If you have any questions about how to make your projects wheel safe (or really, any packaging question), feel free to @ mention me or ping me on twitter or whatever.

  10. Donald Stufft

    You don't have to maintain both; setuptools and pip both support specifying the conditionals directly inside of setup.py (this is what the original patch @Robert Robert wrote does). The caveat is that support for that feature requires a modernish pip and setuptools, so individual projects have to decide whether they would rather support older setuptools/pip and accept duplication, or use the newer feature and remove duplication but leave older pip/setuptools unsupported.

  11. Donald Stufft

    By the way, by modernish I mean support for sdist-based markers was added in setuptools 0.7, and I think they were broken in pip until pip 6.0 (but I may be wrong; it may work in older ones).

  12. Michael Bayer repo owner

    alrighty, taking some notes here. Running an old Python 2.6.6 on a VM, making a virtualenv and doing "./bin/pip install Mako-1.0.2.dev0.tar.gz", which is an sdist whose setup.py only includes the newer extras_require format and a blank install_requires. In all cases we want to see that both argparse and MarkupSafe are recognized as dependencies.

    virtualenv 1.11.3 (setuptools 2.2, pip 1.5.3) does not honor the directives in extras_require; it seems to just ignore them.

    same for venv 1.11.4 (pip 1.5.4)

    same for venv 1.11.5 (pip 1.5.5, setuptools 3.4.4)

    same for venv 1.11.6 (pip 1.5.6, setuptools 3.6) - so @Robert Robert that contradicts what you were seeing...

    then in venv 12.0, which has setuptools 7.0 and pip 6.0, now it recognizes the requirements in extras_require.

    So it looks like setuptools 7.0 is where this feature was added.
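
    For what it's worth, the marker evaluation itself can be exercised directly via pkg_resources, which is the machinery setuptools uses when deciding whether a ":marker" extra applies (the API shown here is the present-day one, which may differ from what setuptools 7.0 shipped):

```python
# Evaluate PEP 508-style environment markers against the running
# interpreter, the same check applied to ":marker" extras_require keys.
import pkg_resources

# Mirrors the argparse conditional: False on any modern Python.
needs_argparse = pkg_resources.evaluate_marker("python_version == '2.6'")

# Mirrors the MarkupSafe conditional: True everywhere except 3.2.
wants_markupsafe = pkg_resources.evaluate_marker("python_version != '3.2'")

print(needs_argparse, wants_markupsafe)
```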

    now of course setuptools has a full changelog; however, I have read every single word of it between versions 3.6 and 7.0 and there is not a single thing that even slightly resembles a mention of this new feature. This is kind of a problem, on top of all the other ones. E.g. Mako is of course going to get this right, because three packaging people are helping me directly. Not sure what the answer is for everyone else...

  13. Michael Bayer repo owner

    OK, now running it with "python setup.py install" to use only setuptools, and now I'm seeing something different: setuptools 3.6 is honoring those directives. Will keep trying.

  14. Michael Bayer repo owner

    So, just this patch:

    diff --git a/setup.cfg b/setup.cfg
    index 5bad26e..870572d 100644
    --- a/setup.cfg
    +++ b/setup.cfg
    @@ -1,7 +1,8 @@
     tag_build = dev
     addopts= --tb native -v -r fxX
    diff --git a/setup.py b/setup.py
    index 56754aa..b05322b 100644
    --- a/setup.py
    +++ b/setup.py
    @@ -3,28 +3,24 @@ import os
     import re
     import sys
    -v = open(os.path.join(os.path.dirname(__file__), 'mako', '__init__.py'))
    -VERSION = re.compile(r".*__version__ = '(.*?)'", re.S).match(v.read()).group(1)
    +py26 = sys.version_info >= (2, 6)
    -readme = open(os.path.join(os.path.dirname(__file__), 'README.rst')).read()
    -if sys.version_info < (2, 6):
    +if not py26:
         raise Exception("Mako requires Python 2.6 or higher.")
    -markupsafe_installs = (
    -            sys.version_info >= (2, 6) and sys.version_info < (3, 0)
    -        ) or sys.version_info >= (3, 3)
    +extras_require = {
    +    ":python_version!='3.2'": ['MarkupSafe>=0.9.2'],
    +    ":python_version=='2.6'": ['argparse'],
    +}
     install_requires = []
    -if markupsafe_installs:
    -    install_requires.append('MarkupSafe>=0.9.2')
    -try:
    -    import argparse
    -except ImportError:
    -    install_requires.append('argparse')
    +v = open(os.path.join(os.path.dirname(__file__), 'mako', '__init__.py'))
    +VERSION = re.compile(r".*__version__ = '(.*?)'", re.S).match(v.read()).group(1)
    +readme = open(os.path.join(os.path.dirname(__file__), 'README.rst')).read()
    @@ -51,7 +47,7 @@ setup(name='Mako',
    -      extras_require={},
    +      extras_require=extras_require,
           mako = mako.ext.turbogears:TGPlugin

    so far, very old setuptools seems to be fine with it in the "setup.py install" case, and with pip you kind of need the 6.0 level of things, but it's not clear if that's the case yet.
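
    One way to see which behavior actually took effect is to inspect the built wheel's metadata directly; a sketch (the dist-info layout is standard, the wheel filename below is illustrative):

```python
# Inspect a built wheel's METADATA for its Requires-Dist entries, to check
# whether conditional dependencies were recorded as environment markers
# rather than baked in at build time.
import zipfile

def requires_dist(wheel_path):
    """Return the Requires-Dist lines from a wheel's METADATA file."""
    with zipfile.ZipFile(wheel_path) as whl:
        meta_name = next(n for n in whl.namelist() if n.endswith('/METADATA'))
        metadata = whl.read(meta_name).decode('utf-8')
    return [line for line in metadata.splitlines()
            if line.startswith('Requires-Dist:')]

# e.g. requires_dist('dist/Mako-1.0.2.dev0-py2.py3-none-any.whl')
# A correctly built universal wheel should list marker-qualified lines like:
#   Requires-Dist: MarkupSafe (>=0.9.2); python_version != '3.2'
```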

  15. Michael Bayer repo owner

    also, I can find no mention whatsoever of the ":python_version" syntax in conjunction with extras anywhere in the setuptools documentation. We've basically got this:

    shouldn't this whole world of essential features and practices at least be... documented... before posting bugs on downstream projects? I'm starting to think I'm crazy here. When I make the slightest change in SQLAlchemy's API, you get an enormous paragraph with examples explaining the whole thing. I wouldn't think of expecting my users to know things that aren't even documented.

  16. Michael Bayer repo owner

    yeah, it seems like really old setuptools recognize the dependencies, but I'm actually not sure if that's because they just take everything in extras_require or they actually know what ":python_version" means.

  17. Donald Stufft

    There are some PEPs that include them, but ultimately you're right. We've historically been pretty bad about documentation, and though I think we're better than before, we have a long way to go.

  18. Michael Bayer repo owner

    so this is a tricky-ish case on my end: I could put out the new-syntax-only version, and what actually happens is that Mako gracefully degrades if MarkupSafe isn't present anyway. So if this syntax were failing for users on old pips/setuptools, I'd probably never hear about it - argparse is already going to be there 99.99% of the time anyway. They'd just be using a less "secure" and slower string-escaping tool.

  19. Michael Bayer repo owner

    how big a deal is it that if one says "pypy setup.py bdist_wheel" on SQLAlchemy, which has C extensions that are optional, you get a "universal" py2k wheel w/o the C extensions? Is there no chance that wheels can somehow include the "I'm a 2.7-only wheel" tag without actual C extensions being present?

  20. Michael Bayer repo owner

    every problem I've had with wheels seems to come down to this urge it has to build "universal" packages. If there were only a way to say "no-universal=1", so you always get a Python version as well as a platform on your wheel (e.g. pypy-specific, 2.7-specific, 3.2-specific, etc.), it would be great. Wheels work terrifically for caching things you've already downloaded and built. I don't see at all why it's so important that these wheels are cross-Python-version, however, unless you're distributing a pure-Python package as a wheel on pypi - which still seems fairly pointless, since if you just put up a .tar.gz your pip will have your wheel locally anyway. If I run a different Python version or interpreter, I'd expect all the wheels to be entirely different.

  21. Donald Stufft

    Ah, optional C extensions which don't get installed on PyPy. So the basic answer is: if you do nothing and someone installs SQLAlchemy with CPython first, they'll get a cached wheel that has the compiled bits and is only good for just that Python; if they install it with PyPy it will create a "universal" py2 wheel, however pip will prefer a wheel with C extensions over a wheel without them, so in that scenario things will just work. HOWEVER, if the operation is inverted and PyPy is used to install before anything else, then pip will see that there is already an acceptable wheel available and will use that instead of building the C extensions on CPython. There is a small hack you can use to avoid this, though; CFFI does it so that it creates a "compiled" wheel on PyPy that doesn't actually contain any compiled code. You can just do this:
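
    The hack in question is conventionally written by subclassing Distribution so the build machinery believes there are extension modules; a sketch of that approach (a reconstruction of the technique described, not necessarily the verbatim snippet):

```python
# "Binary distribution" hack: make setuptools/wheel believe the project
# contains extension modules even when none get compiled (e.g. on PyPy),
# so bdist_wheel emits a platform/interpreter-specific tag instead of a
# broad pure-Python one.
from setuptools.dist import Distribution

class BinaryDistribution(Distribution):
    def has_ext_modules(self):
        return True

# Used by passing it to setup():
#   setup(..., distclass=BinaryDistribution)
```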
