The rapid adoption of container technologies has produced an equally rapid rise in unanticipated and confusing consequences, such as a loss of oversight and processes that move faster than teams can control. With a little planning -- and the right ecosystem of tooling -- CTOs and development managers can make sure their adoption of containers is not a short-lived, messy one.
In this article, I explore why container technologies can become more problematic than beneficial. I'll also delve into ways to anticipate issues, leverage available tooling to avoid them, and stop saying "containers just did not work for us."
Containers isolate portions of a host Linux operating system in such a way that they behave like their own instances, with unique configurations and applications. Containers grew out of LXC technology, which has been available in several popular Linux distributions since 2008. The approach is similar to virtual machines (VMs), but containers run at near-native performance on the host operating system and are much smaller. They make it faster for developers to spin up a full stack to work on, and they allow application releases to be faster and more predictable.
The primary container technology used today is Docker. To further the container-driven pipeline, the Docker team has added a set of command-line interfaces, management tools, public and private image libraries, and configuration options for areas such as networking and security. All of these make containers a more viable solution for entire development teams.
Put a ribbon on top
The idea of compartmentalizing applications as entire stacks (operating system, system configurations, and code) is not new. Virtualization technology grew out of this idea. However, the downfall of virtualization is that VMs are huge and tied to their hypervisors. If you wanted to snapshot VMs and move them around like containers, you would need serious bandwidth and heavy server setups. Yet that is not the chief problem; the chief problem is organizational. Hypervisors belong to IT, and IT has the habit of treating VMs like physical servers: provision them once and forget about them until there is a problem. This approach does not work for software development.
The motivation for better compartmentalization is also driven by increasing application complexity: more components, a heightened security climate, and tighter integration between code and system-level components. Together, these trends highlight the gap between development and infrastructure -- a gap containers help to close.
This means that releasing applications as code alone is less than ideal. If you could release the application as the entire stack, however, then pinpointing and addressing system-to-code issues or exploits is easier. In the development space, full-stack deployments are the future. And for development teams, containers mean that developers, QA and IT can have more concrete conversations about applications, and get to the bottom of issues faster.
When the benefits become the problems
Within a few minutes, a developer can pull a container image from the public library and provision an instance on a local machine. Within an hour, that developer can make changes to the instance, transform it into a new image, and publish it to 10 other host machines, or as 10 new instances on the same machine. Extend this ability beyond the individual developer, and the pulling, modification and provisioning of containers has a viral effect. Very quickly, the team has container instances it never knew existed -- or what they were for.
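The pull-modify-republish loop described above can be sketched with a minimal Dockerfile. The base image, team name and file paths here are hypothetical, chosen only to illustrate how few lines it takes to derive and share a new image:

```dockerfile
# Hypothetical example: a developer turns a public base image
# into a team-specific image. Names and paths are illustrative.

# Start from an image pulled down from the public library
FROM nginx:1.25

# Layer in the developer's local changes
COPY site/ /usr/share/nginx/html/

EXPOSE 80
```

Building and publishing it is equally quick -- something like `docker build -t myteam/web .` followed by `docker push myteam/web` -- which is exactly why unmanaged images can proliferate across a team so fast.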
This is where the benefit quickly becomes the problem: an ecosystem of unmanaged containers. Beyond more trivial issues, such as wasted resources, unmanaged containers create a slew of much more serious problems:
- Change management
- IT security
- Application integrity
- Resource planning
- Governance and auditing
These types of problems can turn a new approach into a new enemy, and even lead to company-destroying issues such as hacks or poor application quality. Containers were built to move, not for oversight and management. As a result, container adoption remains limited today, and very few organizations are realizing the dream of pipelines driven by container technologies.
The leading use case for containers is with developers, who use them in a very ad hoc way. They provision containers for quick testing and for the ability to rip and replace their stack on a whim. There is one other notable element to the most common container use case: where it fits into the application. Today, containers are most useful for application front ends, because one of their big advantages is staying small and nimble -- adding a database to a container would contradict this.
Containers are well worth the investment
It has become clear that container technologies must be paired with a collection of practices and supporting tools to work in a sustainable, team-wide way. For organizations that see the benefits, the reality has remained just out of reach, which also puts them behind on broader modern development practices. Containers are not required for full-stack deployments or for continuous integration, delivery and deployment -- but they do make those practices easier.
The opportunity that containers afford organizations makes them well worth the investment in solving the above issues as well as pushing application delivery to the point where you are shipping containers, not code. And, fortunately, the technology and the ecosystem of strategies and tools to help are advancing rapidly.
So don't lose faith. The age of mature container adoption is coming, and it promises to change the way we view applications and software development.