Merging SOA and web services to improve microservices management

Learn how merging SOA and web services will impact cloud-deployment thinking and make room for microservices.

Microservices, meaning small application features designed for broad reuse and supporting ready deployment and scaling, have taken application planners by storm, but they're still only a small component of deployed applications. Part of this is because they are cloud-specific features in search of a shift in cloud-deployment thinking, and part because microservices are caught between two worlds: the world of SOA and the world of the web. In 2017, all of these issues will be resolved, and microservices management will mature to take its place in development toolboxes.

Proponents of the SOA model of applications often see microservices as an evolution of SOA to accommodate cloud computing. Microservices, correctly implemented, have most of the characteristics of RESTful, web-like, functional components. This group expects SOA governance, directory structures, security and binding principles implemented in a cloud-friendly way in microservices. For this group, SOA lives on in microservices management.

Cloud- and web-savvy architects see microservices as a formalization of "web services," meaning the building of functional components based on web principles, bound to applications through the ultra-simple, proven conventions of HTTP carrying HTML or JSON. Simplification and efficiency are the goals here, and with these changes, the web and the cloud absorb SOA.

Microservices management and cloud

A casual read of microservices best-practice documents suggests the SOA side has been winning. Microservices are usually depicted as being accessed through an API manager function that imposes security and compliance policies and meters usage; Kong is an open source example. These tools sit between the microservice user and the microservice, providing metering, security and compliance monitoring. API managers can make microservices very SOA-like, but every call they mediate incurs additional processing delay. Frequently called microservices are particularly affected, and at some point the accumulated delay can actually deter further microservice use.
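The mediation pattern described above can be sketched in a few lines. This is an illustrative model only, not Kong's or any vendor's actual configuration; the key names and policies are invented.

```python
# Illustrative sketch of API manager mediation; the key names and
# policies are invented, not any real product's configuration model.
API_KEYS = {"team-a-key"}          # security policy: known consumers
call_counts = {}                   # metering: calls per consumer

def backend_service(payload):
    """Stand-in for the microservice sitting behind the manager."""
    return {"echo": payload}

def api_manager(api_key, payload):
    """Mediate every call: enforce the security policy and meter
    usage before forwarding. This extra hop is the source of the
    added per-call delay discussed above."""
    if api_key not in API_KEYS:
        return {"error": "forbidden"}
    call_counts[api_key] = call_counts.get(api_key, 0) + 1
    return backend_service(payload)
```

Every request pays for the policy check and the metering update in addition to the backend call itself, which is why the overhead compounds for frequently called microservices.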

Amazon and Microsoft both provide a cloud-hosted "API Gateway" service that has similar capabilities. The functionality of a traditional API manager is distributed in these cloud tools. If the microservices are hosted in the cloud, the Gateway introduces lower delay than using a premises-hosted API manager would. However, latency is still an issue for frequently used microservices.

Monolithic vs. microservice

It's always been possible to present microservices as simple RESTful APIs, without an intermediate manager or gateway, but this poses a problem if microservice instances are scaled under load. Scaling requires some form of load balancing among the instances, which an API manager or gateway could provide if present. An alternative that will be developed further in 2017 is to use the Domain Name System (DNS) servers that resolve URLs to IP addresses. There are load-balancing DNS services (Cedexis and Dyn's Traffic Director are examples) as well as commercial and open source products; NGINX Plus, for example, is widely used by over-the-top companies.
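The idea behind DNS-based load balancing can be illustrated with a toy round-robin resolver. The service name and addresses below are made up; real load-balancing DNS weighs health checks and geography, not just rotation.

```python
import itertools

class RoundRobinResolver:
    """Toy DNS-style resolver: one service name maps to several
    instance addresses, and successive lookups rotate through them,
    spreading load across scaled microservice instances without an
    API manager in the data path."""

    def __init__(self, records):
        # One independent rotation per service name.
        self._cycles = {name: itertools.cycle(addrs)
                        for name, addrs in records.items()}

    def resolve(self, name):
        return next(self._cycles[name])

resolver = RoundRobinResolver({
    "orders.internal": ["10.0.0.1", "10.0.0.2", "10.0.0.3"],
})
```

Because the balancing decision happens at name resolution, the caller then talks to the chosen instance directly, avoiding the per-call gateway hop.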

Amazon and Microsoft both offer DNS and load-balancing tools as part of their clouds' web service inventory. While these don't provide perfect load balancing or security, they come close enough to support microservices management without the risk of exploding latency. Both companies are said to be working on microservice-specific or microservice-tuned tools for 2017.

Designing microservices management

Another pathway for resolution that will impact microservice design in 2017 is a "serverless" approach. API managers and gateways in microservice applications provide a means of registering and locating services, which can be essential when microservices are moved around in the cloud or scaled to accommodate work. One large microservice user has now combined the Apigee API gateway with Amazon's function-based, or "Lambda," programming. AWS Lambda can build microservices that aren't literally hosted anywhere in particular; they are instantiated as needed and then removed. Because there's no persistent service instance to reuse, there's no need to keep track of where a microservice lives. Most interestingly, Amazon Web Services (AWS) Lambda is often linked with Amazon's cloud workflow tools, and workflows may well be the future of the cloud.
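A Lambda-style microservice reduces to a single entry-point function. The `handler(event, context)` shape below follows the AWS Lambda Python convention, but the order-pricing logic is purely illustrative.

```python
def handler(event, context=None):
    """Lambda-style entry point: no server to manage and no
    persistent instance to locate. The platform instantiates the
    function for each invocation and tears it down afterwards, so
    all state arrives in the event payload."""
    items = event.get("items", [])
    order_total = sum(item["price"] * item["qty"] for item in items)
    return {"statusCode": 200, "total": order_total}
```

Because nothing survives between invocations, there is no instance address to register, which is exactly why the locating role of an API manager becomes optional.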

To break the cloud away from the limitations of traditional programming and data centers, you have to forget resources and hosting and instead think of applications as components linked by workflows. If microservices are viewed as traditional application components, distributing many of them is certain to create major performance issues, because each additional component multiplies network delay. If applications are instead viewed as workflow-stitched components, Lambda-based microservices can be instantiated along the workflows, introducing minimal incremental delay.
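The workflow-stitched view can be sketched as a pipeline of stateless functions, where each step's output flows into the next rather than returning to a central caller. The step names and discount figure are invented for illustration.

```python
from functools import reduce

def validate(order):
    """Stateless step: reject bad input, pass good input along."""
    if order["qty"] <= 0:
        raise ValueError("quantity must be positive")
    return order

def price(order):
    """Stateless step: compute the order total."""
    return {**order, "total": order["qty"] * order["unit_price"]}

def apply_discount(order):
    """Stateless step: apply a flat 10% discount (illustrative)."""
    return {**order, "total": round(order["total"] * 0.9, 2)}

def run_workflow(steps, payload):
    """Stitch stateless functions into a workflow: data flows
    forward through the pipeline instead of bouncing back and
    forth between a caller and each component."""
    return reduce(lambda data, step: step(data), steps, payload)

result = run_workflow([validate, price, apply_discount],
                      {"qty": 3, "unit_price": 10.0})
```

Each step touches the data once as it passes, so distributing the steps along the path of the work adds one hop per step rather than a round trip per call.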

To do this requires thinking of microservices less as a call/return function than as a pipeline function, which is why the idea connects so well with the Lambda service of Amazon's AWS. Recall that, this year, Amazon introduced Greengrass, a software component that lets AWS Lambda run outside Amazon's cloud, close to the edge. Since Lambda already allows functions to be distributed along a path (it actually doesn't even associate a function with a specific hosting location or service) and since all Lambda programming is pipeline-oriented, it's ideal for a workflow-support mission, and a facilitator of the transformation to workflows overall.

The use of Lambda, or functional, programming to build microservices may end up being the most significant trend in microservices management in the coming year. Lambdas are inherently stateless, easy to build and debug, and can be used in either the traditional call/return model or the pipeline model, and Amazon has now proved the same Lambdas can run in its cloud or on premises. Microsoft offers comparable support through lambda expressions in .NET and serverless functions in Azure. Since some special development discipline is essential to building an effective microservice, and since Lambda development imposes exactly that discipline, adopting Lambdas will promote microservices overall.

Most truly significant advances in software architecture and design practices require basic changes in applications, and early examples of these advances are limited by the fact that those changes haven't yet happened. We're now on the verge of making major, cloud-centric changes in how we build applications, and microservices management will benefit significantly from those changes in the coming year.
