What do application containers have to do with service-oriented architecture? In a word: everything. Containers are the building blocks of next-generation infrastructure that makes it easier than ever both to run new applications on an SOA and to migrate existing ones.
Let's start by explaining how containers work. A container is a software-defined environment in which code runs. Because the container abstracts the code running inside it from the system that hosts it, the code is easily portable between different host systems.
Application containers also provide isolation features for the code running inside them. That helps to control resource allocation between different containers and provides a layer of security.
The containers you probably hear most about these days are Docker containers. Docker lets you run individual applications inside containers. Docker also supports orchestration of those containers with tools like Swarm, Kubernetes and a long list of other container orchestrators.
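As a minimal sketch of what this looks like in practice (assuming Docker is installed, and using the public `nginx` image purely as a stand-in application; the container name and port mapping are illustrative):

```shell
# Start an nginx web server in a detached container, mapping
# host port 8080 to container port 80.
docker run -d --name web -p 8080:80 nginx

# Confirm the container is running.
docker ps --filter name=web
```

Because the image bundles the application together with its dependencies, the same `docker run` command behaves the same on any host that runs Docker.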
There are other types of containers, too. Some container platforms, like OpenVZ and LXD, let you run an entire operating system inside a container. Although the underlying mechanism is different, this is similar to running an OS inside a traditional virtual machine. Other types of containers, called unikernels, are designed to host minimalist, self-sustaining application environments that do not depend on an external operating system in any way.
However, Docker application containers are the most useful container technology for enterprises today. The rest of this article will focus primarily on Docker containers.
So, what do containers have to do with SOA? First, we need to explain what SOA means.
Broadly defined, SOA is an approach to application design that breaks an application down into discrete parts. Those parts are generally distributed across a system and communicate with each other over the network or through APIs.
Traditionally, SOA was implemented by writing sockets or plug-ins that allowed the different components of an application to exchange information. The services over which each part of the application communicated were then orchestrated in some way through a management layer.
Containers and SOA
Now, we can finally explain how containers make SOA easier.
Docker containers offer a new way of implementing SOA that is more efficient in many respects. With Docker, you can run each part of an application inside a container. Then you can use a container orchestrator to manage communication between the containers and ensure that each service inside your application is running smoothly.
Containers are better building blocks for SOA than traditional, noncontainerized services for the following reasons:
- Containers are easy to move between hosts. With traditional SOA, services are dependent on particular host environments. For example, if a Network File System (NFS) share is set up on a CentOS Linux server to provide the storage service for your SOA, migrating the NFS host to an Ubuntu server would take some work. In contrast, moving a container from one host to another requires very little work, because the container host environment is always the same.
- Containers are highly scalable. Because application containers can move easily between hosts, a containerized infrastructure is also easy to scale. If more instances of a certain service are needed, more containers can be spun up. Scaling is not necessarily so easy with traditional distributed systems.
- Containers are easy to update. Want to update the application running inside a container? Simply rebuild the container image on which the application is based and then restart the container. This process is simpler and safer than performing a traditional upgrade to a service on a distributed system, because it is easy to roll back quickly if needed.
- Containers are reusable. One of the main goals of SOA is to make services reusable and shareable, reducing the amount of redundant services. Containers are highly reusable because container images can be easily copied and containers can move quickly between hosts.
- Containers carry less overhead. Containers provide the flexibility of running virtual services inside software-defined environments without the need to run a full virtual machine to host the services. Instead, containers share the host's kernel and compute resources. This means you get the benefits of traditional software-defined services without the heavy resource overhead.
Migrating to containers
The last big question is how you get from a traditional SOA to a containerized SOA. The answer depends on what your application needs look like.
If you are designing and deploying new applications, adopting containers to host them is easy. You simply make sure the applications are written to run inside containers, then set up Docker, deploy your applications and use a container orchestrator to manage them.
Generally speaking, porting a legacy application to containerized infrastructure is more complicated. That's because most legacy applications are monolithic. Porting to containers, therefore, requires changes to the application itself, because the application needs to be broken down to run as a set of microservices.
However, the good news is that if an application was designed to run on SOA in the first place, it will probably be relatively easy to migrate it to containers. SOA applications are already written to run as a series of microservices, rather than as a monolith. To migrate to containers, then, the main effort will involve changing the way the different microservices communicate, but not changing the nature of the services themselves.
With Docker, application containers generally exchange information over private virtual networks. You'll need to make sure your application is configured to work with container networks. This will probably require adjustments to port mappings, host discovery and so on. But it likely will not mean that you have to rewrite your application's API.
In most cases, then, migrating from a traditional SOA to a containerized one takes some doing. But the work is not unreasonable, and it pays off in the form of a more efficient, more easily managed infrastructure for your microservices-based application.