Eco helps you maintain a local Python ecosystem (your required modules, scripts, etc.).

It takes a simple approach: you tell it what your requirements are, and Eco fetches and installs those packages in a single directory, independent of your global Python installation. It is designed for projects that should just work when you type PYTHONPATH=./eco/lib python. A script that does this for you is also created at ./eco/bin/python (see the scripts section for more info).
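
For example, once an ecosystem has been built you could run a project script against it like this (myscript.py is a hypothetical script that imports modules from ./eco/lib):

$ PYTHONPATH=./eco/lib python myscript.py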

Eco is a wrapper around pip, but it offers fewer features than pip itself. Long live pip!

Download / Install

Eco requires both pip and setuptools. After installation, Eco can be run with the eco script, or it can be run straight from source as /path/to/eco-source/eco.py. Currently, Eco is only available from source, so check out the code and do:

$ cd eco-source
$ sudo python setup.py install

You should now be able to run:

$ eco --help

If you have neither pip nor setuptools, run these commands first:

$ wget http://peak.telecommunity.com/dist/ez_setup.py
$ sudo python ez_setup.py
$ easy_install pip

Basic usage

Let's say you have a project that depends on nose version 0.10.4. cd into the root of your project and type:

$ eco 'nose==0.10.4'
Downloading/unpacking nose==0.10.4
  Downloading nose-0.10.4.tar.gz (284Kb): 284Kb downloaded
  Running setup.py egg_info for package nose
Installing collected packages: nose
  Running setup.py install for nose
    Installing nosetests script to /path/to/project/eco/bin
Successfully installed nose==0.10.4

Your ecosystem is ready

  scripts : ./eco/bin
  modules : ./eco/lib
  data    : ./eco/data

As you can see, you now have three local directories (their locations are customizable). Within ./eco/lib are the nose package and its egg-info directory:

$ ls -1 ./eco/lib

Within ./eco/bin are the nosetests script and a script for invoking the Python interpreter:

$ ls ./eco/bin

And within ./eco/data are nose's man pages:

$ ls ./eco/data

To start work on your project, simply type:

$ ./eco/bin/python

Or read on for how to create custom scripts.

Defining requirements in a text file

Instead of typing all requirement specs on the command line, you can store them in a file. For example, create a file named requirements.txt with these lines:

SQLAlchemy==0.4.8

# for docs:
Sphinx

# for tests:
nose

And run eco like this:

$ eco -r requirements.txt

to build an ecosystem with SQLAlchemy at version 0.4.8, the latest Sphinx, and the latest nose.

Defining requirements in setup.py

Since you typically want to work with your module the same way a user would, Eco will automatically read module requirements from a setup.py file. (Note that specifying requirements in setup() is only supported by setuptools.) If a setuptools-enabled setup.py exists in the directory you run Eco from, Eco will inspect its keyword arguments to determine your project's requirements. Here is an example:

from setuptools import setup, find_packages
    # etc...
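
Expanded, a minimal setuptools-based setup.py consistent with the requirements above might look like this (the project name and the split between install_requires and tests_require are illustrative assumptions):

from setuptools import setup, find_packages

setup(
    name='myproject',
    packages=find_packages(),
    # runtime and documentation requirements
    install_requires=['SQLAlchemy==0.4.8', 'Sphinx'],
    # test-only requirements
    tests_require=['nose'],
)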

In this case, Eco will automatically install SQLAlchemy, Sphinx, and nose to your local ecosystem. There are some options to customize this behavior.

Eco will also run python setup.py develop so that your project is available to all scripts via an egg-link file.

Note that you can combine setup.py requirements with file-based requirements; for example, to install additional modules while testing:

$ eco -r test-requirements.txt

Using config files to set defaults

Eco looks for two config files in your working directory: setup.cfg and eco.cfg. If either exists, the name/value pairs in its [eco] section are used as default values for command-line options. If both files exist, both are parsed, but options declared in eco.cfg take precedence.

Here is an example of setting custom values for --index-url, --eco-lib, and --eco-bin:

[eco]
index-url = http://my-pypi/
eco-lib = ./site-packages/
eco-bin = ./bin
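
If both files are present, eco.cfg wins for any overlapping options. For instance, given this hypothetical pair of files, Eco would use http://my-pypi/ as the index URL:

# setup.cfg
[eco]
index-url = http://pypi.python.org/simple/

# eco.cfg
[eco]
index-url = http://my-pypi/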

Local modules vs. sys.path modules

Eco assumes you only want to build modules that do not already exist on your sys.path. That way you can install hard-to-build modules once for all projects and use a separate ecosystem in each project for custom versions of modules. Alternatively, if you want to ignore already-installed modules when resolving requirements, specify --ignore-sys.
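
For example, to rebuild every requirement inside the local ecosystem even if a copy is already importable from your global site-packages (combining --ignore-sys with the -r option shown earlier):

$ eco --ignore-sys -r requirements.txt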

Working with scripts

Special Python script

To work with modules inside your ecosystem, run the special Python script that Eco creates at:

./eco/bin/python

This will invoke the Python interactive shell with all of your ecosystem modules available for import.
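
For example, with the nose ecosystem built earlier, you would expect something like:

$ ./eco/bin/python
>>> import nose
>>> nose.__version__
'0.10.4'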

Shell scripts

To create a custom command-line script that uses modules inside your ecosystem, define an [eco.scripts] section in either eco.cfg or setup.cfg, like this:

[eco.scripts]
run-tests = my_module.test_runner:run_tests

This would create a script named ./eco/bin/run-tests that executes more or less the following code:

import site
import sys

# add eco/lib to sys.path (Eco uses site.addsitedir() for this)
site.addsitedir('/path/to/project/eco/lib')

from my_module.test_runner import run_tests
if __name__ == '__main__':
    sys.exit(run_tests() or 0)
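
The target on the right-hand side of the entry is just an importable callable in your project. A hypothetical my_module/test_runner.py could be as simple as:

# my_module/test_runner.py (hypothetical example)
import nose

def run_tests():
    # run the test suite with the nose installed in ./eco/lib and
    # return a shell-style exit code for the generated script
    return 0 if nose.run() else 1

You could then run your tests from the project root with ./eco/bin/run-tests.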

WSGI scripts

To create a WSGI-compatible script that uses modules inside your ecosystem, define an [eco.wsgi_scripts] section in either eco.cfg or setup.cfg, like this:

[eco.wsgi_scripts]
blog.wsgi = my_blog:application

This would create a script named ./eco/bin/blog.wsgi that effectively executes from my_blog import application, which can then be handled by mod_wsgi or any other WSGI server container. If the object after the colon is not named application, it will be aliased as such for WSGI compatibility.
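
For reference, the target named in the entry is an ordinary WSGI callable. A hypothetical my_blog package might expose it like this:

# my_blog/__init__.py (hypothetical example)
def application(environ, start_response):
    # a minimal WSGI callable; ./eco/bin/blog.wsgi imports this
    # (under the name application) for mod_wsgi to serve
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return ['Hello from my_blog\n']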

Setuptools scripts

For projects that are already using setuptools entry points for creating scripts, Eco will make a valiant attempt to patch the auto-generated scripts so that they work as expected in your local ecosystem. Obviously, Eco does not patch any global scripts you installed with pip or easy_install.
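
For example, a project whose setup.py declares a console script like this (the names here are illustrative) would end up with a patched ./eco/bin/run-blog script that works against your local ecosystem:

from setuptools import setup, find_packages

setup(
    name='my_blog',
    packages=find_packages(),
    entry_points={
        # setuptools generates a run-blog script from this entry point;
        # Eco patches that script so it imports from ./eco/lib
        'console_scripts': ['run-blog = my_blog.cli:main'],
    },
)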

How is this different from VirtualEnv?

Eco is a bit like virtualenv, but with fewer features. The key differences are that Eco puts all modules in one directory and there are no symlinks to system modules. This makes it possible to manage eco directories in version control or to zip them up for Google App Engine, etc. Eco also makes project setup as simple as running the eco command without any arguments. There are some virtualenv wrappers that also make setup simple; paver is one worth considering for your own needs.
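
For example, a collaborator who has just checked out a project containing a setup.py or requirements file as described above can bootstrap it with nothing more than:

$ eco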

TODO

  • Create multi-version scripts automatically (e.g. nosetests-2.5, nosetests-2.6)
  • Add support for "installing" local project scripts with .pth files (currently they are only installed in develop mode, which works by way of egg-link files)
  • Add options --ignore-tests-req and --ignore-setup-req to disable installation of requirements in tests_require= and setup_requires=

Known Issues

  • You can get caught in a near-infinite loop if you try to build an ecosystem that depends on PasteDeploy (e.g. for a Pylons app). This is due to Paste's entry_points. There is a patch to pip at http://bitbucket.org/kumar303/pip/ that fixes the problem, but it is a little intrusive since it uses addsitedir(); it has not been accepted into pip yet and will probably need to become less intrusive before that happens. Until then, you will need to install that fork of pip.
  • Eco forces you to keep a clean global site-packages. In other words, if you require version 1.2 of SomeModule in your ecosystem but you have 1.1 installed in your global site-packages, then Eco will not build an ecosystem until you have removed the 1.1 egg. This is a side effect of using addsitedir() to make setuptools work within an ecosystem. Until someone convinces me otherwise, I consider it a feature ;)
  • Using development eggs in Eco is hard. You have to copy the TheModule.egg-info directory from the source checkout into eco/lib (you do not have to copy the module itself), as shown below. Maybe this will be improved.
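
For example (the paths here are hypothetical), copying a development egg's metadata into the ecosystem looks roughly like:

$ cp -r /path/to/TheModule/TheModule.egg-info ./eco/lib/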

Source

The Eco source is available via Mercurial:

$ hg clone http://bitbucket.org/kumar303/eco/

Bugs / Patches

Please submit bugs and patches to the bitbucket tracker. All contributors will be acknowledged. Thanks!

Credits

Eco was created by Kumar McMillan, but it only exists because of work by Ian Bicking on pip and Phillip J. Eby on pkg_resources and setuptools.

Contributors

Augie Fackler

License

Eco is Copyright 2009 Kumar McMillan and is available for use, distribution, or modification under the MIT License. Have fun.