fabdeploy / README
==================

Run::

    fab --list

to get a list of available tasks.

Executing tasks
---------------

You can pass arguments to tasks in the following ways:

- Call ``setup_fabdeploy()`` to set up an empty configuration and the host ``$USER@localhost``. You will be prompted for any missing variable (once per task)::

    from fabdeploy.api import setup_fabdeploy

    setup_fabdeploy()

- Pass a global configuration to ``setup_conf()``::

    from fabric.api import env
    from fabdeploy.api import setup_conf

    def staging():
        env.conf = setup_conf(dict(
            address='user@host',  # example value; the original was truncated here
        ))
        env.hosts = [env.conf.address]

  Then tasks can be run without arguments::

    fab staging postgres.create_db

- Pass arguments directly to the task::

    fab staging postgres.create_db:db_name=mydb,db_user=myuser


There are some conventions for configuring fabdeploy:

- You should use a Python ``OrderedDict``, because order is often important::

    from collections import OrderedDict

    BASE_CONF = OrderedDict([
        ('sudo_user', 'fabdeploy'),
    ])

- Each value can contain Python string formatting::

    BASE_CONF = OrderedDict([
        ('supervisor.log_dir', '%(log_dir)s/supervisor'),
    ])

- Remote dirs should have the postfix ``_dir``. You can and should use the task ``fabd.mkdirs`` to create all remote dirs with one command. It looks like this::

    $ fab fabd.mkdirs
    mkdir --parents /path/to/dir1 /path/to/dir2 /path/to/dir3

- Local dirs have postfix ``_ldir`` (similar to Fabric ``cd`` and ``lcd``).

- Dirs (postfix ``_dir`` and ``_ldir``) and paths (postfix ``_path`` and ``_lpath``) can be lists. Such a list is passed to ``os.path.join()`` or ``posixpath.join()``. The previous example can look like this::

    BASE_CONF = OrderedDict([
        ('supervisor.log_dir', ['%(log_dir)s', 'supervisor']),
    ])

- You can configure each task individually::

    BASE_CONF = OrderedDict([
        ('postgres.db_name', 'postgresql_db'), # module=postgres
        ('mysql.db_name', 'mysql_db'),         # module=mysql
        ('mysql.create_db.db_user', 'root'),   # module=mysql, task=create_db
    ])
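
Putting these conventions together: a list value is joined into a path, and ``%(name)s`` placeholders are filled by ordinary Python ``%``-interpolation against the configuration. A minimal sketch of the idea (the ``resolve`` helper below is illustrative, not fabdeploy's actual code):

```python
import posixpath

def resolve(conf, key):
    # Illustrative only: join list-valued dirs/paths into one path,
    # then substitute %(name)s placeholders from the configuration.
    value = conf[key]
    if isinstance(value, list):
        value = posixpath.join(*value)
    return value % conf

conf = {
    'log_dir': '/var/log/myproject',
    'supervisor.log_dir': ['%(log_dir)s', 'supervisor'],
}
print(resolve(conf, 'supervisor.log_dir'))
# -> /var/log/myproject/supervisor
```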

Configuration is stored in the task instance variable ``self.conf``. Each task has its own copy of the configuration. Configuration variables are looked up in the following places, in order:

- task keyword argument ``var`` (``fab task:foo=bar``);
- task instance method ``var()`` decorated with ``@conf``;
- key ``var`` in the ``env.conf`` dict;
- failing all of these, the user is asked to provide ``var`` via a Fabric prompt.
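
The lookup order above can be sketched in plain Python (the ``lookup`` helper is hypothetical, for illustration only; fabdeploy's internals differ):

```python
def lookup(var, task_kwargs, conf_methods, env_conf, prompt):
    # Resolve a variable in fabdeploy's documented order (illustrative).
    if var in task_kwargs:        # 1. fab task:foo=bar
        return task_kwargs[var]
    if var in conf_methods:       # 2. @conf-decorated method
        return conf_methods[var]()
    if var in env_conf:           # 3. env.conf dict
        return env_conf[var]
    return prompt(var)            # 4. ask the user via a prompt

value = lookup('db_name',
               {'db_name': 'mydb'},             # task kwargs win
               {'db_name': lambda: 'method'},
               {'db_name': 'conf'},
               prompt=lambda var: input('%s = ' % var))
# -> 'mydb'
```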


To upload your project using a tar archive, you can use the ``tar.push`` task with default arguments::

    fab staging tar.push

You can also write a task that uploads only your static dir, using the same task::

    import os
    import posixpath

    from fabric.api import env
    from fabdeploy import tar

    def push_static():
        # src_dir below is illustrative; the original example was
        # truncated at this point.
        tar.push.run(
            src_dir=os.path.join(env.conf.django_ldir, 'static'),
            target_dir=posixpath.join(env.conf.django_dir, 'static'))

Writing your task
-----------------

Your task is a class-based Fabric task, except that fabdeploy manages the configuration for you::

    from fabric.api import puts

    from fabdeploy.api import Task, conf


    class MessagePrinter(Task):
        @conf
        def message(self):
            if 'message' in self.conf:
                return self.conf.message
            return 'Hi!'

        def do(self):
            if self.conf.secret == '123':
                puts(self.conf.message)


    message_printer = MessagePrinter()

Then you can run this task like this::

    $ fab message_printer
    > secret = 123
    Hi!
    $ fab message_printer:message='Hello world!'
    > secret = 123
    Hello world!

Fabfile example
---------------

A typical fabfile may look like this::

    from collections import OrderedDict

    from fabric.api import env, task, settings

    from fabdeploy.api import *


    BASE_CONF = OrderedDict([
        ('django_dir', 'projectname'),
        ('supervisor_programs', [
            (1000, 'group', ['gunicorn']),
        ]),
    ])


    @task
    def prod():
        conf = BASE_CONF.copy()
        conf['address'] = ''  # set to your user@host; value truncated in original
        env.conf = setup_conf(conf)
        env.hosts = [env.conf.address]


    @task
    def install():
        # The task bodies below were truncated in the original README;
        # the calls shown are illustrative fabdeploy tasks.
        ssh.push_key.run(pub_key_file='~/.ssh/')  # key path truncated in original

        with settings(warn_only=True):
            postgres.create_db.run()

        for app in ['supervisor']:
            system.package_install.run(packages=app)  # illustrative


    @task
    def setup():
        fabd.mkdirs.run()


    @task
    def deploy():
        tar.push.run()
        supervisor.restart_program.run(program='gunicorn')  # illustrative