fabdeploy / README

Usage
=====

Type::

    fab --list

to get a list of available tasks.

Executing tasks
===============

You can pass arguments to tasks in the following ways:

- Call ``setup_fabdeploy()`` to set up an empty configuration with host ``$USER@localhost``. You will be prompted for any missing variable (once per task)::

    from fabdeploy.api import setup_fabdeploy
    setup_fabdeploy()

- Pass a global configuration to ``setup_conf()``::

    from fabric.api import env, task
    from fabdeploy.api import setup_conf

    @task
    def staging():
        env.conf = setup_conf(dict(
            address='user@host',
            db_name='mydb',
            db_user='myuser'
        ))
        env.hosts = [env.conf.address]

  Tasks can then be run without arguments::

    fab staging postgres.create_db

- Pass arguments directly to the task::

    fab staging postgres.create_db:db_name=mydb,db_user=myuser

Configuration
=============

There are some conventions for configuring fabdeploy:

- You should use Python's ``OrderedDict``, because order is often important::

    from collections import OrderedDict

    BASE_CONF = OrderedDict([
        ('sudo_user', 'fabdeploy'),
    ])

- Each value can contain Python string formatting; placeholders refer to other configuration variables::

    BASE_CONF = OrderedDict([
        ('supervisor.log_dir', '%(log_dir)s/supervisor'),
    ])

- Remote dirs should have the postfix ``_dir``. You can and should use the task ``fabd.mkdirs`` to create all remote dirs with one command. It will look like this::

    $ fab fabd.mkdirs
    mkdir --parents /path/to/dir1 /path/to/dir2 /path/to/dir3

- Local dirs have the postfix ``_ldir`` (similar to Fabric's ``cd`` and ``lcd``).

- Dirs (postfix ``_dir`` and ``_ldir``) and paths (postfix ``_path`` and ``_lpath``) can be lists. Such a list is passed to ``os.path.join()`` or ``posixpath.join()``. The previous example can also be written like this::

    BASE_CONF = OrderedDict([
        ('supervisor.log_dir', ['%(log_dir)s', 'supervisor']),
    ])

- You can configure each task individually::

    BASE_CONF = OrderedDict([
        ('postgres.db_name', 'postgresql_db'),  # module=postgres
        ('mysql.db_name', 'mysql_db'),          # module=mysql
        ('mysql.create_db.db_user', 'root'),    # module=mysql, task=create_db
    ])
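
Taken together, the conventions above mean that a list value is joined into a single path and ``%(name)s`` placeholders are interpolated against the other configuration variables. A minimal sketch of that behavior (hypothetical helper, not fabdeploy's actual code)::

    import posixpath
    from collections import OrderedDict

    def resolve(conf, key):
        # Hypothetical helper: join list values into one remote path, then
        # repeatedly apply %-style formatting until no placeholders remain.
        value = conf[key]
        if isinstance(value, (list, tuple)):
            value = posixpath.join(*value)
        while isinstance(value, str) and '%(' in value:
            value = value % conf
        return value

    conf = OrderedDict([
        ('log_dir', '/var/log/myproject'),
        ('supervisor.log_dir', ['%(log_dir)s', 'supervisor']),
    ])

    print(resolve(conf, 'supervisor.log_dir'))  # /var/log/myproject/supervisor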

Configuration is stored in the task instance variable ``self.conf``. Each task has its own copy of the configuration. Configuration variables are looked up in the following places, in order:

- the task keyword argument ``var`` (``fab task:foo=bar``);
- a task instance method ``var()`` decorated with ``@conf()``;
- the key ``var`` in the ``env.conf`` dict;
- finally, the user is prompted for ``var`` via the Fabric console.
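
The lookup order can be sketched roughly as follows (hypothetical names; fabdeploy's internals may differ)::

    def lookup(var, task_kwargs, conf_methods, env_conf, prompt):
        # Resolution order: task kwargs, @conf methods, env.conf, then prompt.
        if var in task_kwargs:
            return task_kwargs[var]     # fab task:var=value
        if var in conf_methods:
            return conf_methods[var]()  # @conf-decorated method
        if var in env_conf:
            return env_conf[var]        # global configuration
        return prompt(var)              # ask the user as a last resort

    value = lookup(
        'db_name',
        task_kwargs={},
        conf_methods={},
        env_conf={'db_name': 'mydb'},
        prompt=lambda var: input('%s = ' % var),
    )
    print(value)  # mydb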

Customization
=============

To upload a project as a tar archive, you can use the ``tar`` task with default arguments::

    fab staging tar.push

You can also write a task that uploads only your static dir using the same machinery::

    import os
    import posixpath

    from fabric.api import env, task
    from fabdeploy import tar

    @task
    def push_static():
        tar.push.run(src_dir=os.path.join(env.conf.django_ldir, 'static'),
                     target_dir=posixpath.join(env.conf.django_dir, 'static'))

Writing your task
=================

Your task is a class-based Fabric task, except that fabdeploy manages the configuration for you::

    from fabric.api import puts
    from fabdeploy.api import Task, conf

    class MessagePrinter(Task):
        @conf
        def message(self):
            if 'message' in self.conf:
                return self.conf.message
            return 'Hi!'

        def do(self):
            if self.conf.secret == '123':
                puts(self.conf.message)
            else:
                puts('huh?')

    message_printer = MessagePrinter()

You can then run this task like this::

    $ fab message_printer
    > secret = 123
    Hi!
    $ fab message_printer:message='Hello world!'
    > secret = 123
    Hello world!

Fabfile example
===============

A typical fabfile may look like this::

    from collections import OrderedDict
    from fabric.api import env, task, settings
    from fabdeploy.api import *


    setup_fabdeploy()

    BASE_CONF = OrderedDict([
        ('django_dir', 'projectname'),
        ('supervisor_programs', [
            (1000, 'group', ['gunicorn']),
        ]),
    ])


    @task
    def prod():
        conf = BASE_CONF.copy()
        conf['address'] = 'user@prodhost.com'
        env.conf = setup_conf(conf)
        env.hosts = [env.conf.address]


    @task
    def install():
        users.create.run()
        ssh.push_key.run(pub_key_file='~/.ssh/id_rsa.pub')

        system.setup_backports.run()
        system.install_common_software.run()

        with settings(warn_only=True):
            postgres.create_role.run()
            postgres.create_db.run()
            postgres.grant.run()

        nginx.install.run()

        for app in ['supervisor']:
            pip.install.run(app=app)


    @task
    def setup():
        fabd.mkdirs.run()

        gunicorn.push_config.run()
        nginx.push_gunicorn_config.run()
        nginx.restart.run()


    @task
    def deploy():
        fabd.mkdirs.run()
        postgres.dump.run()

        git.init.run()
        git.push.run()
        django.push_settings.run()
        supervisor.push_configs.run()

        virtualenv.create.run()
        virtualenv.pip_install.run(app='gunicorn')

        django.syncdb.run()
        django.migrate.run()
        django.collectstatic.run()

        supervisor.d.run()
        supervisor.restart_programs.run()