
The essentials of refactoring a monolith to microservices

Enterprise apps are complex, long-lived and vital to the business. In short, they're excruciatingly difficult to modernize. If you address these prerequisites, it will hurt less.

Refactoring an existing application into microservices has clear benefits. Developers can make updates without affecting the overall application, which reduces testing and release burdens. They can reuse code across different projects and adjust the software's features as business demands dictate.

However, refactoring a monolith to microservices is not as simple as writing new microservices to replace old monolithic code blocks. Development teams must take the time to assemble the right infrastructure, implement a strong deployment and provisioning pipeline, employ intensive monitoring systems, configure service independence, closely manage service communication, and cultivate the right team structure.

Let's run through each of these required conditions.

Infrastructure

Think about the existing technical infrastructure before you start refactoring a monolith into microservices. What are the plans for disaster recovery, service discovery, load balancing and scalability? Will all of the microservices need to scale, or only a few of them? A monolith must scale as one component, but microservices can scale independently, which makes resource consumption more economical.

Logging should also be part of the platform infrastructure. Plan to log each inbound and outbound microservice call into a preconfigured repository, such as a file or database.
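As a sketch of this logging requirement, the helper below records each inbound or outbound service call as a structured log entry. The function name, field names and the example services are hypothetical; a real setup would ship these entries to a centralized log store rather than standard output.

```python
import json
import logging
import time
import uuid

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("svc-call-log")

def log_call(direction, service, operation, payload):
    """Record one inbound or outbound service call as a structured log entry."""
    entry = {
        "id": str(uuid.uuid4()),        # unique ID for this call record
        "ts": time.time(),              # timestamp of the call
        "direction": direction,         # "inbound" or "outbound"
        "service": service,             # the remote service involved
        "operation": operation,         # the operation invoked
        "payload": payload,             # request or response body
    }
    logger.info(json.dumps(entry))
    return entry

# Example: log an outbound call from an order service to an inventory service.
entry = log_call("outbound", "inventory", "reserve_stock", {"sku": "A-100", "qty": 2})
```

Structured (JSON) entries like this are easy to aggregate and query later, which matters once many services write to the same repository.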

Rapid provisioning and deployment

Refactoring a monolith into microservices modernizes the application. But modern code is only part of the picture. Can you fire up a new server automatically and deploy services rapidly there? Automation is a must to enable rapid provisioning, and it often requires investment in new tools and training. If the organization has a long way to go toward operations automation, this could take time.

Code refactored into microservices can deploy independently, rather than with the whole application. Several services at a time should be able to roll out, while others in various stages are en route to deployment, without tremendous effort. Set the modernization project up for success with a code deployment pipeline.

Without CI/CD in place, it would be difficult -- if not impossible -- to manage integration and deployment of multiple microservices. CI/CD pipelines sometimes rely on a degree of manual intervention to move code from creation -- and through test -- to delivery and then to production. As with resource provisioning, develop a plan for full automation as budgets and staff skills allow.
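The gating behavior of such a pipeline can be sketched in a few lines: stages run in order, and a failure stops the code from moving any further toward production. The stage names and the `run_pipeline` helper are illustrative; real stages would shell out to build, test and deployment tools.

```python
def run_pipeline(stages):
    """Run pipeline stages in order; stop at the first failing stage."""
    for name, stage in stages:
        if not stage():
            return f"failed: {name}"
    return "deployed"

# Hypothetical stages; each returns True on success.
stages = [
    ("build", lambda: True),
    ("test", lambda: True),
    ("deploy", lambda: True),
]
print(run_pipeline(stages))  # → deployed
```

The manual interventions mentioned above correspond to a stage that waits for human approval instead of returning immediately; full automation replaces those stages one by one.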

Thorough monitoring


Microservices interact and work together in many ways, not all of which come up in the test environment. Good basic monitoring in production improves application reliability and performance. Ensure the monitoring setup can quickly detect CPU, network and application failures across the entire infrastructure. It should also be able to directly track messages as they flow through the system.
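Tracking a message through the system is commonly done by attaching a correlation ID at the edge and propagating it on every downstream call, so the monitoring system can stitch one request's path together. The sketch below assumes hypothetical helper names.

```python
import uuid

def new_message(body, correlation_id=None):
    """Attach a correlation ID so the message can be traced across services."""
    return {"correlation_id": correlation_id or str(uuid.uuid4()), "body": body}

def handle_and_forward(msg, transform):
    """Each service propagates the incoming correlation ID to downstream calls."""
    return new_message(transform(msg["body"]), correlation_id=msg["correlation_id"])

# One logical request: the ID survives as the message passes through a service.
original = new_message({"order": 42})
downstream = handle_and_forward(original, lambda b: {**b, "validated": True})
```

Because every log entry carries the same correlation ID, a search for that ID reconstructs the message's full path, even across services owned by different teams.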

Monitor and assess application performance periodically. Establish acceptable system-level metrics, such as CPU and RAM utilization, for the microservices deployment, as well as application performance metrics. Some examples include average response time, transactions per second and application availability. These numbers might change when the application is refactored from a monolith to microservices, so ensure the team is in agreement on new thresholds.
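Agreed thresholds are easiest to enforce when they live in one place that monitoring checks against. The threshold values and metric names below are hypothetical examples, not recommendations.

```python
# Hypothetical thresholds the team agreed on after the refactor.
THRESHOLDS = {
    "cpu_percent": 80.0,           # max sustained CPU utilization
    "avg_response_ms": 250.0,      # max average response time
    "availability_percent": 99.9,  # min availability
}

def check_metrics(sample):
    """Return the names of metrics that breach their agreed thresholds."""
    breaches = []
    if sample["cpu_percent"] > THRESHOLDS["cpu_percent"]:
        breaches.append("cpu_percent")
    if sample["avg_response_ms"] > THRESHOLDS["avg_response_ms"]:
        breaches.append("avg_response_ms")
    if sample["availability_percent"] < THRESHOLDS["availability_percent"]:
        breaches.append("availability_percent")
    return breaches

print(check_metrics({"cpu_percent": 91.0, "avg_response_ms": 120.0,
                     "availability_percent": 99.95}))  # → ['cpu_percent']
```

When the refactor shifts the numbers, the team updates `THRESHOLDS` in one review rather than hunting through dashboards for stale alert rules.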

Degree of independence

In a microservices architecture, services can either be completely independent of one another or coupled to some degree. Choose either the database-per-service or shared database approach.

If each microservice has its own database, the persistent data is private to that microservice and accessed by its API only. Although this approach promotes loose coupling, it adds complexity, because the organization must manage multiple databases -- perhaps a combination of SQL and NoSQL databases. It's also difficult to write queries that join data spanning multiple databases, and data updates must take into account the multiple services dependent upon them.
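The database-per-service rule can be illustrated with two toy services: the inventory service owns its store, and the order service never touches that store directly, only the public API. The service and method names are invented for the example, and a dict stands in for a real database.

```python
class InventoryService:
    """Owns its data store; other services may use only the public API."""

    def __init__(self):
        self._db = {}  # private persistent store (stand-in for a real database)

    def add_stock(self, sku, qty):
        self._db[sku] = self._db.get(sku, 0) + qty

    def available(self, sku):
        return self._db.get(sku, 0)


class OrderService:
    """Holds no inventory data; reads it through InventoryService's API."""

    def __init__(self, inventory):
        self._inventory = inventory

    def can_fulfill(self, sku, qty):
        return self._inventory.available(sku) >= qty


inv = InventoryService()
inv.add_stock("A-100", 5)
orders = OrderService(inv)
print(orders.can_fulfill("A-100", 3))  # → True
```

The cross-database join problem mentioned above shows up here too: a report spanning orders and inventory must call both APIs and combine the results itself.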

When services share a database, it ensures data consistency across the services. However, the services are not completely decoupled. If you update the schema of the database for one service, it can affect other services. Moreover, a shared database in a distributed microservices architecture can pose deployment problems and database contention issues.

Interservice collaboration

How will microservices communicate and collaborate with each other? Define well-established interfaces in the coding language that act as contracts between microservices. Best practice dictates that these contracts remain stable as implementations change. Be aware of real-time coupling issues for microservices that communicate over REST or gRPC calls.
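One way to express such a contract in code is an abstract interface that callers depend on, independent of any implementation behind it. The `PaymentContract` name and its method are hypothetical examples of this pattern.

```python
from abc import ABC, abstractmethod

class PaymentContract(ABC):
    """Contract between the order and payment services. Callers depend only
    on this interface; implementations can change without breaking them."""

    @abstractmethod
    def charge(self, order_id: str, amount_cents: int) -> bool:
        """Return True when the charge succeeds."""


class StubPaymentService(PaymentContract):
    # Hypothetical implementation; a real one would call a payment gateway.
    def charge(self, order_id, amount_cents):
        return amount_cents > 0


svc: PaymentContract = StubPaymentService()
print(svc.charge("ord-1", 1999))  # → True
```

Keeping the abstract interface stable is what lets teams swap or rewrite the implementation without coordinating a simultaneous release with every consumer.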

Ideally, refactoring a monolith to microservices will create autonomous services that don't need to communicate with each other in real time. With autonomous services, an application can keep working even if some services fail. Take advantage of event-based messaging to implement autonomous collaboration between microservices.
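The shape of event-based collaboration can be sketched with a minimal in-process event bus; a production system would use a message broker such as Kafka or RabbitMQ instead, so that publishing succeeds even while a subscriber is down. The `EventBus` class and topic names here are illustrative.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process stand-in for a message broker."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # The publisher knows nothing about who consumes the event.
        for handler in self._subscribers[topic]:
            handler(event)


bus = EventBus()
shipments = []
# A shipping service reacts to order events without the order service calling it.
bus.subscribe("order.placed", lambda e: shipments.append(e["order_id"]))
bus.publish("order.placed", {"order_id": "ord-7"})
print(shipments)  # → ['ord-7']
```

Because the publisher never addresses a specific consumer, new services can subscribe to existing events without any change to the services that emit them.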

Refactor the team

Proper team structure is of paramount importance for the success of microservices. When the application is legacy, it might come with legacy team structure.

Identify the services that the modernized application needs and build vertical teams for each of them. Typically, the teams own the API, business logic and data layers, and they are independent of other groups. Build-and-run teams own everything from development through operations. Ultimately, teams should be able to scale appropriately as the business grows. Collaboration and cooperation, espoused by the DevOps movement, facilitate rapid provisioning, close monitoring and the other requisites listed above.
