Bees with machine guns | Load testing

Another interesting PyCon talk to watch was Best practices for impossible deadlines. One of the tools mentioned in the talk is Bees with machine guns, a load testing tool that uses the power of EC2. With a name like that, I was gleeful thinking about the day I would have an opportunity to use it.

This week the day came :D Despite the useful README, I had to tweak a few things to get it working; here are my notes for later, and for anyone else who might run into the same questions.

Bringing up the swarm

The documentation mentions some of the defaults you should override, but not all of them, and you might find yourself scratching your head wondering why things are timing out.

Pre-requisites:

  • Create a 'bees' (or whatever) security group with SSH/port 22 open (-g)
  • Make sure the AMI you're trying to use is available under that ID in the zone you're in (-i)
  • Make sure the region in your Boto config file matches the region you're trying to connect to (-z)
  • Mind the default username! If you don't specify one, it defaults to newsapp and the bees won't be able to connect (-l)
  • For additional paranoia I also created a key pair just for the bees, to make sure there's no way in hell I can bring down an instance with something important on it (-k)
./bees up -s 4 -g bees -k bees -i ami-12345678 -z eu-west-1a -l root
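Once the swarm is up, the rest of the lifecycle looks like this. The target URL and request numbers below are placeholders; attack, report and down are the standard bees subcommands:

```shell
# Hypothetical target; -n is the total number of requests, -c the concurrency.
./bees attack -n 10000 -c 250 -u http://example.com/
# Check on the swarm.
./bees report
# Terminate the instances when done -- they cost money while they're up!
./bees down
```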

Still timing out!

Now... The bees use Apache's ab benchmarking tool to hammer your site. If the tool doesn't produce any output, the script assumes there was a timeout. Which means that if the tool isn't installed on the AMI, the bees will always time out, without an appropriate error message. There are a few options. You could create a custom AMI that has ab installed, but that's a bit overkill. Or, if the AMI you're using supports it (not a given!), you can launch a user data script by tweaking bees.py around line 100 and adding something like this to the run_instances call. (This assumes a Debian-based system; add sudo if necessary.)

user_data = '#!/bin/bash\napt-get -y install apache2-utils'

thus becoming:

reservation = ec2_connection.run_instances(
        image_id=image_id,
        min_count=count,
        max_count=count,
        key_name=key_name,
        security_groups=[group],
        instance_type=EC2_INSTANCE_TYPE,
        placement=zone,
        user_data='#!/bin/bash\napt-get -y install apache2-utils')
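Before launching an attack, it's worth confirming the user data script actually ran. Something like this works, assuming the bee is reachable; the hostname and key path below are hypothetical:

```shell
# If ab was installed by the user data script, this prints its path;
# no output means the bees will "time out" exactly as described above.
ssh -i ~/.ssh/bees.pem root@ec2-XX-XX-XX-XX.compute-1.amazonaws.com 'which ab'
```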

Other tips

If testing, say, a Django-based site, don't forget the trailing slash at the end of the URL: ab doesn't follow 301 redirects.
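A quick way to see the pitfall (the URL here is just an example): without the trailing slash, ab only collects redirects, which show up under "Non-2xx responses" in its summary instead of real page timings.

```shell
# Missing slash: Django answers 301 for every request, so ab counts
# them all as Non-2xx responses and never exercises the view.
ab -n 10 http://example.com/admin
# Trailing slash: the requests actually hit the page.
ab -n 10 http://example.com/admin/
```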

If changing regions to test from elsewhere:

  • Find the new AMI ID for the image; it has to be EBS-backed.
  • You may encounter errors if the zone doesn't offer t1.micro instances, but the error message will clearly say so.
  • ~/.boto must be updated:
[Boto]
#ec2_region_name = eu-west-1
#ec2_region_endpoint = eu-west-1.ec2.amazonaws.com
ec2_region_name = us-east-1
ec2_region_endpoint = us-east-1.ec2.amazonaws.com

[Credentials]
aws_access_key_id = ......
aws_secret_access_key = .....
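Since a region mismatch between ~/.boto and the -z zone is the most silent way for this to go wrong, a small sanity check before bringing up the swarm can save a head-scratch. This is just a sketch assuming the standard [Boto] section layout shown above:

```python
# Verify that the region name and endpoint in the Boto config agree,
# e.g. us-east-1 should be the prefix of us-east-1.ec2.amazonaws.com.
import os
from configparser import ConfigParser

def boto_region(path=os.path.expanduser('~/.boto')):
    cfg = ConfigParser()
    cfg.read(path)
    name = cfg.get('Boto', 'ec2_region_name')
    endpoint = cfg.get('Boto', 'ec2_region_endpoint')
    if not endpoint.startswith(name):
        raise ValueError('region/endpoint mismatch in %s' % path)
    return name
```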
