Nightly Benchmarks: Tracking Results with Codespeed

Background

Codespeed is a project for tracking performance results. I discovered it when the PyPy project started using it to track performance. Since then, development has made it easier to set up and has added more display options.

Anyway, two posts ago I talked about running nightly benchmarks with Hudson, and in the previous post I discussed passing parameters between builds in Hudson. Both posts are worth reading before trying to set up Hudson with Codespeed.

Codespeed Installation/Configuration

Django Quickstart

Codespeed is built on Python and Django, so some basic knowledge of Django is needed to get everything up and running. Don’t worry, the bit that is needed is not hard to learn: manage.py is all you need to know about to set up and view Codespeed. There is documentation on deploying Django to a real web server, but I won’t be covering that here.

Here are the commands to get Django running:

syncdb

syncdb is used to initialize the database with the necessary tables; it will also set up an admin account. With the sqlite3 database selected, running this command creates the database file.

The command is:

python manage.py syncdb

runserver

The next command is runserver, which runs the built-in Django development server. The documentation states that it is not meant for production use, so deploy to a real web server if you plan to host Codespeed on the Internet or on a high-traffic network.

The command is:

python manage.py runserver 0.0.0.0:9000

By default the server will run on 127.0.0.1:8000. Setting the IP to 0.0.0.0 allows connections from any computer, which works well if you’re on a local area network and want to set Codespeed up on a VM over SSH while still reaching the web interface from your own machine. The number after the colon is the port to listen on. To view Codespeed, point your browser at 127.0.0.1:9000, or at the machine’s IP address followed by :9000.

Django has many settings that may or may not need to be tweaked for your environment. They can be set through the speedcenter/settings.py file.
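One tweak you are likely to make is the database backend, such as the sqlite3 choice mentioned earlier. As a sketch only, assuming the older single-database settings style Django used around this time (the exact setting names depend on your Django version; Django 1.2 and later use a DATABASES dictionary instead):

```python
# speedcenter/settings.py (sketch; setting names depend on your Django version)
DATABASE_ENGINE = 'sqlite3'   # use the built-in SQLite backend
DATABASE_NAME = 'data.db'     # database file that syncdb will create
```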

Codespeed Setup/Settings

Now for setting up the actual Codespeed server. First check it out using git. The clone command is:

git clone http://github.com/tobami/codespeed.git

The settings file is speedcenter/codespeed/settings.py.

Most of the default values will work fine; they mostly control default selections for various things in the web interface.

One thing that does need to be configured is the environment. An environment represents the machine the benchmarks run on, and it must be created manually.

Start by running the syncdb command, then start the server with runserver. With the server running, browse to the admin interface; if you ran the server on port 9000, point your browser at http://127.0.0.1:9000/admin. Log in using the username and password you created during the syncdb call. Click Add next to the Environment label, fill in the various fields, and save. Remember the environment’s name; it will be used later when submitting benchmark data to Codespeed.

Submitting Benchmarks

This will pick up where my last tutorial left off. The benchmarks were running as a nightly job in Hudson. Sending benchmark data to Codespeed will take a bit of programming. I’m going to continue the example with JRuby, so the benchmarks and submission process are written in Ruby.

In order to submit benchmarks, information must be transferred from the JRuby build job to the Ruby benchmarks job. My last post discussed how to transfer parameters between jobs: using the Parameterized Trigger Plugin and passing extra parameters through a properties file will get all the necessary values to the benchmarks job.

The required information for submitting a benchmark result to Codespeed includes:

- commitid
- project
- branch
- executable
- benchmark
- environment
- result_value

This information can be included but is optional:

- result_date
- std_dev
- min and max values

The above information is passed to Codespeed through an encoded URL. Have the URL point to http://127.0.0.1:9000/result/add/ and encode the parameters for sending. For the JRuby benchmarks, the following parameters are sent from the JRuby job to the Ruby benchmarks job.

COMMIT_ID=$(git rev-parse HEAD)
COMMIT_TIME=$(git log -1 --pretty="format:%ad")
RUBY_PATH=$WORKSPACE/bin/jruby
REPO_URL=git://github.com/jruby/jruby.git

The other fields are derived from the benchmarks job itself.
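Inside the benchmarks job, those properties arrive as environment variables. A minimal sketch of reading them in Ruby; the hudson_param helper and its 'unknown' fallback are my own, not part of Hudson or Codespeed:

```ruby
# Hypothetical helper: read a parameter that Hudson injected into the
# job's environment, falling back to 'unknown' so a local run outside
# Hudson doesn't crash on a missing variable.
def hudson_param(name, env = ENV)
  env.fetch(name, 'unknown')
end

commitid    = hudson_param('COMMIT_ID')
commit_time = hudson_param('COMMIT_TIME')
```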

Here is the source code for submission through Ruby:

require 'net/http'
require 'uri'

# Collapse doubled slashes in the benchmark's path-style name.
canonical_name = doc["name"].gsub '//', '/'

output = {}
output['commitid']     = commitid        # e.g. $COMMIT_ID from Hudson
output['project']      = BASE_VM
output['branch']       = branch
output['executable']   = BASE_VM
output['benchmark']    = File.basename(canonical_name)
output['environment']  = environment     # the name created in the admin interface
output['result_value'] = doc["mean"]
output['std_dev']      = doc["standard_deviation"]
output['result_date']  = commit_time     # e.g. $COMMIT_TIME from Hudson

res = Net::HTTP.post_form(URI.parse("#{server}/result/add/"), output)
puts res.body

It’s a good idea to always print out the response, as it will contain debugging information. There is also an example of submitting benchmarks to Codespeed using Python in the tools directory of the Codespeed repository.
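Printing the body is enough for debugging by hand, but in a nightly job you probably also want the build to fail when Codespeed rejects a result. A small sketch, assuming only that a 2xx status means success; check_response is my own helper, not part of Codespeed:

```ruby
require 'net/http'

# Hypothetical helper: fail loudly on a non-2xx response so the Hudson
# job turns red instead of silently dropping the result.
def check_response(res)
  unless res.is_a?(Net::HTTPSuccess)
    raise "Codespeed rejected result: #{res.code} #{res.message}"
  end
  true
end
```

Calling check_response(res) right after Net::HTTP.post_form makes a rejected submission abort the script with a nonzero exit status, which Hudson reports as a failed build.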

Viewing Results

After results are in the Codespeed database, you can view them through the web interface. Direct a browser at http://127.0.0.1:9000. The changes view shows the trend over the last few revisions, the timeline view graphs recent revisions, and the newly added comparison view compares different executables running the same benchmark.

Posted on July 19, 2010 at 10:16 am by Joe · Permalink