
Amazon + Blender 2.74

I've been branching out into some 3D modelling at work recently and wanted to share the rendering pipeline I settled on. I began on my Dell XPS 15 laptop (XPS15-6846sLV): GPU rendering worked straight away on Windows 7, but took some setting up under Arch Linux. This setup was fine for modelling, but I quickly realised I needed more serious heat dissipation for rendering. I then moved to my workstation, which has much more CPU power and a dedicated NVS 310 graphics card - not too shabby, but sample renders were still taking too long. Finally I borrowed a work colleague's GeForce GTX TITAN with 12GB GDDR5 RAM - unsurprisingly, this beast sped things up considerably.

I still had more frames to render than made sense on one machine, especially in the time I had to work with, so I set up Blender on Amazon Web Services to take advantage of the EC2 GPU instances (g2.2xlarge). With favourable spot prices in the Northern Virginia region, you can run GPU instances for between 5 and 10 cents per hour. Note that the EC2 Management Console has a Limits section where you may need to submit a request before you can run more than 5 g2.2xlarge instances; Amazon approved my request the same day.

Getting Blender to render remotely from the command line with the GPU took some fiddling, so I've created a public AMI which I hope will save you some hassle.

I've had small jobs of between 5 and 20 machines running at once for up to 48 hours. I couldn't think of an easier way to get short-term access to 20 high-end machines with $2000 graphics cards in them!
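A quick back-of-envelope check shows why this is so attractive. Taking the worst case from the spot prices above (10 cents per instance-hour) and ignoring S3 storage and data-transfer charges:

```python
# Rough cost of the largest job described above, at the upper spot price.
instances = 20
hours = 48
spot_price = 0.10  # USD per g2.2xlarge instance-hour (upper bound quoted above)

total = instances * hours * spot_price
print(f"${total:.2f}")  # → $96.00
```

Under a hundred dollars for two solid days on twenty GPU machines - far less than buying even one of those graphics cards.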

To get started with an AWS Blender render farm, follow the excellent instructions in the Brenda documentation, but substitute the AMI with my image, which has CUDA enabled and Blender 2.74 installed:

North Virginia (us-east-1): ami-7207351a
Sydney (ap-southeast-2): ami-9f5220a5

Locally:

You can happily launch a spot machine, SSH in and render from the command line.
For example, to run the BMW27 benchmark:
$ blender -b -P /opt/cuda_setup.py BMW27.blend -F PNG -o out.png -s 1 -e 1 -a
...
Time: 02:47:48
Note: if you don't include -P /opt/cuda_setup.py the render will still run, but on the CPU instead of the GPU.
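For the curious, the essence of a script like /opt/cuda_setup.py is only a few lines of Blender Python: switch Cycles' compute device to CUDA in the user preferences and tell the scene to render on the GPU. The sketch below is a reconstruction rather than the exact script on the AMI, using the 2.7x-era property names; the logic is pulled into a helper so it reads the same with or without bpy available:

```python
def enable_cuda(system_prefs, scene):
    """Point Cycles at the GPU (Blender 2.7x-era property names)."""
    # Select CUDA as the compute backend in the user preferences...
    system_prefs.compute_device_type = 'CUDA'
    system_prefs.compute_device = 'CUDA_0'  # first CUDA device
    # ...and make the scene's Cycles renderer use it.
    scene.cycles.device = 'GPU'

# Inside Blender this would be driven from bpy, roughly:
#   import bpy
#   enable_cuda(bpy.context.user_preferences.system, bpy.context.scene)
```

Running it via -P at render time means no manual preference-setting is needed on a freshly launched instance.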

Remotely with brenda:

Use a frame-template with the addition of -P /opt/cuda_setup.py:
brenda-work -T frame-template -s 1 -e 1 push
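If you haven't made one before, a frame-template is just the render command Brenda runs for each work item, with $START and $END substituted per frame range. Treat the following as a sketch of what such a template can look like (the flags besides the -P addition follow the examples in the Brenda documentation, not my exact file):

```
# frame-template: Brenda substitutes $START, $END and $STEP for each task
blender -b -P /opt/cuda_setup.py *.blend -F PNG -o $OUT_DIR/frame_###### -s $START -e $END -j $STEP -t 0 -a
```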
Tip: run nvidia-smi to see GPU utilization during rendering.
Note: the AMI is based on StarCluster, which uses Ubuntu 12.04 as the underlying operating system. Blender 2.73 is also available under /opt/blender-2.73.

