Overview

Codespeed2

This is the source code of Codespeed2.

About

Codespeed2 is loosely based on Codespeed.

Codespeed2 aims to provide a solid API for handling benchmark timings. A frontend allows viewing the benchmark data through graphs and other drill-downs.

Technologies

Backend

We are using cyclone (and, by extension, Twisted), as well as MongoDB.

Frontend

We are using AngularJS and D3.js.

Setup

Prerequisites:

  • MongoDB >= 2.4.3
    (The Windows 2008+ build uses newer features of Windows to enhance performance. Use this build if you are running 64-bit Windows Server 2008 R2, Windows 7, or greater.)

  • OpenSSL >= 1.0.2a
    For Windows, download the full Win32OpenSSL installer, NOT the Light version. You also need the specific version of the Visual C++ 2008 Redistributable listed on that page.

  • The packages listed in requirements.txt (we recommend setting up a virtualenv; see the note after this list)

  • pywin32 (Windows only: download the binary and easy_install it into the virtualenv)
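
For example (the exact commands are assumptions about your environment): create a virtualenv with virtualenv venv, activate it, and then install the packages with pip install -r requirements.txt from the repository root.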

MongoDB Configuration

Modify the 'mongodb' section in your configuration file(s) to match your setup (for example, the site.conf file used by the cyclone application; see the --conf option below).
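
As a quick, optional sanity check that the database is reachable with the values you configured, a short Python script along these lines can be used (it assumes the pymongo driver is available in your virtualenv; the host and port shown are only examples):

    from pymongo import MongoClient

    # Adjust host and port to match the 'mongodb' section of your config.
    client = MongoClient("localhost", 27017)

    # server_info() raises an error if the server cannot be reached.
    info = client.server_info()
    print("Connected to MongoDB " + info["version"])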

Windows Development

Additional Requirements:

Running

Shell & init.d

See /start.sh or the init script at /cyclone_site/scripts/init.d/codespeed2.sh

From an IDE / running directly:

Run /cyclone_site/web.py to start the server.

Options:

Short  Long      Description
-p     --port    The port to listen on [default: 8888].
-l     --listen  The interface to listen on [default: 0.0.0.0].
-c     --conf    The config file that will be used by the Codespeed2 cyclone application [default: site.conf].
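
For example, starting the server from the repository root with explicit values for all three options would look something like: python cyclone_site/web.py --port 8888 --listen 0.0.0.0 --conf site.conf (the exact invocation may differ depending on your environment).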

Testing

Tests are written using Python's unittest module and can be run with anything that handles that (e.g. pytest), or your IDE's built-in runner.

JSON sample files are in /samples/bench_results.tar.gz and must be extracted to /samples/ for some tests to pass.
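
If you prefer to extract the archive from Python rather than with an external tool, a snippet like this should work (paths are relative to the repository root):

    import tarfile

    # Unpack the JSON sample files into samples/ so the tests can find them.
    with tarfile.open("samples/bench_results.tar.gz", "r:gz") as archive:
        archive.extractall(path="samples/")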

Usage

Run the server as described above. By default it will run on port 8888 (and listen on all interfaces).

Next you will need to submit some data to the API's upload endpoint(s). The expected format is described in /doc/new_json_format.txt.

The API URLs you'll be interested in are:

  • /api/upload/benchmark_runner_raw_output
  • /api/upload/benchmark_runner_raw_baseline_output
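
The exact payload must follow /doc/new_json_format.txt, so the sketch below is illustrative only: it posts one extracted sample file to the raw-output endpoint, assumes a Python 2 environment (matching the cyclone stack) and a server on localhost:8888, and uses a placeholder file name.

    import urllib2

    # Placeholder file name: substitute a real JSON file extracted to samples/
    # that follows the format described in doc/new_json_format.txt.
    with open("samples/example_benchmark_result.json") as f:
        payload = f.read()

    request = urllib2.Request(
        "http://localhost:8888/api/upload/benchmark_runner_raw_output",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    response = urllib2.urlopen(request)
    print response.getcode()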

Alternatively, you can populate the DB with some test data by running the tests in the test_populate_with_sample_data module.
Remember to comment out the skip line, as these tests are usually excluded from test runs.
If you do this, you will also need to hit the /api/admin/get_revision_dates_from_bitbucket API endpoint to populate the necessary date information from Bitbucket; the data won't display correctly otherwise.
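
That admin endpoint can be hit from a browser or any HTTP client; for example, from the same assumed Python 2 environment with the server running locally on the default port:

    import urllib2

    # Ask the server to pull revision dates from Bitbucket for the imported data.
    response = urllib2.urlopen(
        "http://localhost:8888/api/admin/get_revision_dates_from_bitbucket")
    print response.getcode()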