
Dynamic Reflection

After three years I'm leaving my engineering job at Dynamic Controls! I started as a summer intern back in 2010 and was subsequently hired as a Systems Engineer. Most of my time at Dynamic Controls was spent on the systems architecture team, designing, prototyping and specifying protocols for powered wheelchairs.

I've really enjoyed, and grown in, my first real-world job. Dynamic was a great proving ground; while there I had good mentors and made great friends. One thing that stood out from the first day is that the company has a strong vision of where it wants to be. The culture strongly promotes innovation, openness, sustainability and a desire for excellence. I've certainly enjoyed telling people that I help design powered wheelchairs as well!

I was very proud to be on the team that helped design and implement the LiNX architecture. Of course there was plenty of pressure as we released the first in a series of innovative wheelchair controllers. I can't wait to see a LiNX chair driving down the street, knowing I helped design and program it.

The culture around learning and furthering personal development was particularly notable at Dynamic. After being exposed to a couple of the training courses I decided to run an "Introduction to Programming Course"; naturally I taught the course in Python. I took four groups of about a dozen colleagues through a twelve week course. Just before leaving I ran a condensed version of the course for our neighbours down at Allied Telesis. This emphasis on training extended to sending engineers to conferences around the country.

Dynamic Controls is increasingly embracing open source software. I took over maintenance of a useful set of internal tools used for talking to a Controller Area Network and, after much refactoring, I released the source to the world. It was very exciting to have contributors actually using the library, and even more exciting when it got forked and someone contributed an entire new backend. The suite of tools used internally went from working with only one hardware platform to working with a multitude.
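The library itself isn't linked above, so as a rough illustration of how that kind of backend pluggability tends to work, here's a minimal Python sketch. Everything in it — the `Backend` base class, the `"loopback"` backend name, and the `open_bus` helper — is invented for this example, not taken from the actual tools:

```python
from abc import ABC, abstractmethod

class Backend(ABC):
    """Abstract interface that each hardware backend implements."""

    @abstractmethod
    def send(self, arbitration_id: int, data: bytes) -> None: ...

    @abstractmethod
    def recv(self):
        """Return the next (arbitration_id, data) frame, or None if empty."""

# Registry mapping a backend name to its implementing class.
BACKENDS = {}

def register(name):
    def wrap(cls):
        BACKENDS[name] = cls
        return cls
    return wrap

@register("loopback")
class LoopbackBackend(Backend):
    """In-memory backend handy for tests; a real backend would wrap a driver."""

    def __init__(self):
        self._queue = []

    def send(self, arbitration_id, data):
        self._queue.append((arbitration_id, bytes(data)))

    def recv(self):
        return self._queue.pop(0) if self._queue else None

def open_bus(name, **kwargs):
    """Construct a bus by backend name; new hardware just registers a class."""
    return BACKENDS[name](**kwargs)

bus = open_bus("loopback")
bus.send(0x123, b"\x01\x02")
print(bus.recv())  # (291, b'\x01\x02')
```

With this shape, a contributor adding support for a new hardware platform only writes and registers one class; none of the callers change, which is roughly how one library can grow from a single platform to a multitude.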

One of the earlier internal website projects that I worked on was a Requirements and Traceability tool. This PHP site was written with two other engineers, and its database was one of the hardest setups I've ever had to design. I never appreciated how difficult versioning of deeply linked items could be! After using this "all singing, all dancing" site through one product release, we decided that the maintenance cost would just be too large, so we ended up getting an off-the-shelf solution called Jama. After using Jama successfully for more than a year, we released an open source library for interfacing with it from Python. It's still early days on that one, but I hope others will find it useful.
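To give a flavour of why versioning deeply linked items is hard: a trace link pins a specific version of a requirement, so the moment someone edits the requirement, every link into it is potentially stale. This toy Python sketch (the `ItemVersion`/`Trace` names and the whole data model are invented here, not the actual tool's schema) shows the problem in miniature:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ItemVersion:
    """One immutable revision of a requirement or test."""
    item_id: str
    version: int
    text: str

class Trace:
    """Toy traceability store: links pin an exact version, so edits make links stale."""

    def __init__(self):
        self.versions = {}   # item_id -> list of ItemVersion, oldest first
        self.links = []      # (source ItemVersion, target ItemVersion)

    def add(self, item_id, text):
        history = self.versions.setdefault(item_id, [])
        v = ItemVersion(item_id, len(history) + 1, text)
        history.append(v)
        return v

    def link(self, src, dst):
        self.links.append((src, dst))

    def latest(self, item_id):
        return self.versions[item_id][-1]

    def stale_links(self):
        # A link is stale if either end no longer points at the latest version.
        return [(s, d) for s, d in self.links
                if s is not self.latest(s.item_id) or d is not self.latest(d.item_id)]

db = Trace()
req = db.add("REQ-1", "Chair shall stop within 1 m")
test = db.add("TST-7", "Brake distance test")
db.link(test, req)
db.add("REQ-1", "Chair shall stop within 0.5 m")  # edit creates version 2
print(len(db.stale_links()))  # 1
```

Every edit ripples through the link graph like this, and deciding whether each stale link should auto-follow the new version or force a re-review is exactly the kind of policy question that makes the schema so painful.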

Any true reflection includes all the warts as well as the good. What hasn't been fantastic about my experience? First, there were some contradictions: it seemed very odd that a research and development department aspiring to be a center of excellence would require the use of archaic technology. Checking email was actually difficult from Linux (which the entire software team uses), as IMAP and POP weren't enabled, and the web client provided had stopped receiving updates sometime around the release of Internet Explorer 5. The engineering department enters their project time into a program designed for Windows 95. And I don't even want to start on the overly restrictive internet proxy.

Occasionally the bureaucracy of multiple reviewers, approvers and all the various hoops imposed on anyone in the medical industry got annoying, but I do fully understand why it is necessary.

Lastly, and definitely a big positive to end on: the fantastic people I had the opportunity to work with. I learnt so much about myself, about teamwork, and about engineering professionalism.

It's going to be a few months' break now as I travel to South America with my partner Sarah. You can follow our journey with us there.
