Adrian Mouat on ‘Understanding Docker and Containerisation’

Sometimes what seems like just a useful innovation in IT infrastructure can have a significant effect higher up the stack. Containerisation is one of those things, and one of its experts outlined the how and why in a Software Development Community of Practice industry talk.

One of the advantages of being in Edinburgh is that we have quite the tech scene on our doorstep. Sometimes literally, as when one of the pioneers of the now ubiquitous Docker container technology turns out to work out of the Codebase side of Argyle House. And that’s not the only connection Adrian has with us; he used to work at EPCC, part of the University. That made the idea of inviting him over for a general talk on Docker and containerisation both compelling and doable.

Being in IS, but somewhat removed from actually running server software, I was about as aware of the significance of containers as I was hazy on the details. Fortunately, I was the sort of audience Adrian’s talk was aimed at.

Specifically, he answered the main questions:

What is a container?

A portable, isolated computing environment that’s like a virtual machine, but one that shares its operating system kernel with its host. That makes it much more efficient than a virtual machine in image size, start-up time and so on. Docker is a technology for building and running such containers.
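
To make that concrete (my own minimal sketch, not Adrian’s), a Docker image is typically described in a Dockerfile. This hypothetical example assumes a small Python script called app.py:

    # Build on a shared, pre-built base image; only the layers added below are unique to this app
    FROM python:3.12-slim
    WORKDIR /app
    # Copy the application into the image
    COPY app.py .
    # The single process the container runs when started
    CMD ["python", "app.py"]

Everything the script needs, down to the Python version, travels inside the image rather than being assumed of the host.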

What problem does it solve?

Containers solve the “it works for me” problem, where a developer gets some software to work perfectly on her own machine, only to see it fail elsewhere because of any of a myriad of differences in the computing environment.
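
Again as an illustrative sketch (the image name myapp is made up): once the environment is captured in an image, the same two commands reproduce it on any machine with Docker installed, sidestepping those environmental differences:

    # Package the code together with its whole environment into an image
    docker build -t myapp .
    # Run it; the container behaves the same on every host
    docker run --rm myapp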

Why is it important?

Because it enables two significant trends in software development and architecture. One is the shift to microservices, which encapsulate functionality in small services that do just one thing, but do it well. Those microservices ideally live in their own environment, with no dependencies on anything outside their service interface. Container environments such as Docker are ideal for that purpose.
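
For a flavour of what that looks like in practice, here is a minimal, entirely hypothetical Compose file in which two made-up microservices each get their own container, talking to each other only over their published interfaces:

    # docker-compose.yml: one isolated container per microservice
    services:
      orders:
        image: example/orders-service:1.0    # hypothetical image
        ports:
          - "8080:8080"    # the service interface is the only shared surface
      payments:
        image: example/payments-service:1.0  # hypothetical image
        ports:
          - "8081:8081"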

The other trend is devops: blurring the distinction between software development and operations, or at least bringing the two much closer together. By making the software environment portable and ‘copy-able’, containers make it much easier and quicker to develop, test and deploy new versions of running software.
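
As a hedged sketch of that workflow (the registry name is invented, and it assumes the test runner is baked into the image): the very same image moves through development, testing and deployment unchanged:

    # Developers and the CI system build the identical image
    docker build -t registry.example.com/myapp:1.2 .
    # Run the test suite inside the same environment the software will ship in
    docker run --rm registry.example.com/myapp:1.2 pytest
    # Ship the tested image itself, not a rebuild of it, towards production
    docker push registry.example.com/myapp:1.2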

What’s the catch?

No technology is magic, so it was good to hear Adrian point to the limitations of Docker as well. One is the danger of merely shifting the complexity of a modern software system from the inside of a monolithic application to a lot of microservices on the network. This can be addressed by good design and by making use of container orchestration technology such as Kubernetes.
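
To give a flavour of what orchestration adds (my minimal, hypothetical example rather than anything from the talk): with Kubernetes you declare the state you want, and it keeps the right number of containers running, replacing any that fail:

    # deployment.yaml: declare the desired state; Kubernetes converges the cluster towards it
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: orders
    spec:
      replicas: 3            # keep three copies of the container running at all times
      selector:
        matchLabels:
          app: orders
      template:
        metadata:
          labels:
            app: orders
        spec:
          containers:
            - name: orders
              image: example/orders-service:1.0   # hypothetical image from the sketch above
              ports:
                - containerPort: 8080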

The other drawback is that containers are, by their nature, not great at sharing complex state. Because each small piece of software lives in splendid isolation in its own container with its own lifecycle, making sure that every one of them is on the same page when they work together in a process is a challenge.
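
The usual mitigation, as far as I understand it (my gloss, not Adrian’s), is to keep the containers themselves stateless and push shared state out to a volume or an external store such as a database:

    # Create a named volume that outlives any individual container
    docker volume create app-data
    # Any container mounting app-data sees the same files, whatever its own lifecycle
    docker run --rm -v app-data:/data myapp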

Overall, though, Docker’s ability to make software manageable is very attractive indeed, and, along with the shift to the cloud, could well mean that our Enterprise Architecture will look very different in a few years’ time.
