Our teams leverage a microservice design approach to develop new systems and migrate existing services without interruption.
Software applications have followed the same basic design approach for decades: teams build and configure databases, implement server-side services and features, and develop a user interface through which users interact with the application. As applications evolve and software teams experience attrition over the years, these systems often turn into monoliths that are difficult to maintain and upgrade. Common challenges include:
- Difficulty tracking how components interact, which means modifying one area of the application can cause unexpected behavior in another
- Challenges adding new features and scaling the application when the responsibilities of application components overlap
- Complex dependency management requirements that add significant time to development and release cycles
Lean microservice-powered systems for federal customers
At Tetra Tech, we build microservices for multiple clients, both developing new systems and migrating monolithic applications to lean, microservice-powered systems.
In a recent client project, we migrated an existing monolithic system whose dozens of interdependencies had made maintenance and upgrades cumbersome. Our team used the Strangler Pattern, a design pattern for incrementally migrating legacy architecture components to microservices, to develop, test, and deploy dozens of new microservices that power system messaging, alert aggregation and notification, and data formatting across multiple data formats. This enabled us to test each new microservice alongside the existing service it replaces and transition to it without any interruption to users.
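At the heart of the Strangler Pattern is a routing facade that sends already-migrated requests to the new microservices while everything else continues to reach the legacy system. The sketch below illustrates the idea in Python using only the standard library; the service names, ports, and path prefixes are hypothetical, not the client's actual configuration.

```python
# Minimal sketch of a Strangler Pattern routing facade (illustrative only).
# Paths that have already been migrated are forwarded to the new microservice;
# everything else still goes to the legacy monolith. Hostnames, ports, and
# prefixes below are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request

LEGACY_BASE = "http://legacy-monolith:8000"         # existing monolithic system
MIGRATED_BASE = "http://alert-service:8080"         # new microservice
MIGRATED_PREFIXES = ("/alerts", "/notifications")   # routes already strangled out


class StranglerFacade(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route to the new service only for paths that have been migrated.
        base = MIGRATED_BASE if self.path.startswith(MIGRATED_PREFIXES) else LEGACY_BASE
        with request.urlopen(base + self.path) as upstream:
            body = upstream.read()
        self.send_response(upstream.status)
        self.send_header("Content-Type", upstream.headers.get("Content-Type", "application/json"))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Callers keep hitting the facade; services are swapped behind it over time.
    HTTPServer(("", 8080), StranglerFacade).serve_forever()
```

As more routes are migrated, the list of strangled prefixes grows until the legacy system can be retired, with users never seeing the cutover.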
Microservice solutions designed with interdependencies in mind
For our government customers, we employ a microservice design approach in which the components of an application are broken down into lightweight, independent services that communicate through application programming interfaces (APIs). Each service can maintain its own state and manage its own database, or remain stateless. Microservices should be granular, each focused on a specific domain or business capability (a minimal sketch of one follows the list below). The benefits of this approach include:
- Ensuring a modular design
- Reducing the risk that a failure in one service impacts another
- Making services easy to update and enhance
- Allowing services to be deployed independently and easily
- Allowing teams to select the technology that best fits the requirements of each service
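The sketch below, a hypothetical alert service written with Flask, illustrates what such a granular service can look like: it owns its own state, exposes a small API, and can be deployed and updated independently of every other service. The framework, endpoint names, and in-memory store are assumptions for illustration, not a specific client implementation.

```python
# Minimal sketch of a single-purpose microservice (illustrative; Flask and the
# endpoint names are assumptions, not a specific client implementation).
# The service owns its own state and exposes a small REST API; other services
# interact with it only through these endpoints.
from flask import Flask, jsonify, request

app = Flask(__name__)

# In a real deployment this would be the service's own database;
# an in-memory list keeps the sketch self-contained.
_alerts = []


@app.route("/alerts", methods=["POST"])
def create_alert():
    alert = request.get_json()
    _alerts.append(alert)
    return jsonify(alert), 201


@app.route("/alerts", methods=["GET"])
def list_alerts():
    return jsonify(_alerts)


if __name__ == "__main__":
    # Each microservice runs as its own process and can be deployed independently.
    app.run(host="0.0.0.0", port=8080)
```

Because the service's data and logic sit behind its API, it can be rewritten, rescaled, or replaced without touching the rest of the application.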
In contrast, traditional monolithic applications must be completely rebuilt and redeployed when components change, tend to lose their modular structure over time, force teams to scale the entire application rather than individual components, and limit flexibility in technology choices.
Integration and benefits of microservices and containers
Containers have become ubiquitous in software development and deployment, and our federal clients increasingly embrace containers over traditional virtual machines. Containers let development teams build features and services that work the same in every environment, including development, testing, and production, on both virtual and physical servers. Containers remain definitively separated from one another while sharing the host's resources, so multiple containers can run on the same server in isolation and do not impact each other if one has a technical issue.
Containers are also ephemeral: they can be created or destroyed easily. This enables teams to deploy and test new features in isolation, in any environment, without impacting another developer’s workflow or other components of the application.
A container maintains its own runtime environment, tools, databases, and APIs, creating a completely isolated environment for service development. This provides a natural approach for creating and deploying microservices while incorporating microservice development into a team’s DevSecOps pipeline and workflow. A developer can use Docker or OpenShift to create a container in seconds to run, test, debug, and deploy their microservice, and once finished, destroy the container instance just as quickly with no impact on other team members or other features within the application. This process speeds up the development cycle and shortens the time to market for new features and enhancements.
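As one example of that workflow, a Dockerfile along the lines below could package the hypothetical alert service sketched earlier into its own container, assuming the sketch were saved as alert_service.py; the base image and file names are illustrative.

```dockerfile
# Illustrative Dockerfile for the hypothetical alert service sketched above.
FROM python:3.11-slim

WORKDIR /app

# Install only this service's dependencies; nothing is shared with other services.
RUN pip install --no-cache-dir flask

COPY alert_service.py .

EXPOSE 8080
CMD ["python", "alert_service.py"]
```

Building the image (`docker build -t alert-service .`) and running it (`docker run --rm -p 8080:8080 alert-service`) each take seconds, and the `--rm` flag removes the container as soon as it stops, so nothing lingers to affect other team members.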
With tools like Docker Compose, our teams can define each microservice as a Docker container within a single file and run the resulting multi-container application in any environment, such as staging or testing. The containers can then be deployed to Docker Swarm or Kubernetes for orchestration and deployment management, including the automatic creation and teardown of containers as demand changes, also known as scaling. Using Docker in conjunction with Docker Compose provides complete container and microservice integration, as each service is configured and managed within the container ecosystem.
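As a simple illustration, a Compose file along these lines could define two such services as separate containers; the service names, build paths, and ports are hypothetical.

```yaml
# Illustrative docker-compose.yml: each microservice is its own container.
services:
  alert-service:
    build: ./alert-service          # built from the service's own Dockerfile
    ports:
      - "8080:8080"
  formatting-service:
    build: ./formatting-service
    ports:
      - "8081:8080"
```

A single `docker compose up` then starts the full set of services in any environment; for orchestration and scaling, the containers themselves can be deployed to Docker Swarm or Kubernetes, as noted above.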