Microservices, Containers and Kubernetes: Which Applications Benefit?
Application design always presents developers with a series of trade-offs. No approach is perfect for every deployment. While the market is abuzz about microservices, containers and Kubernetes—for many good reasons—there are still cases where a traditional monolithic approach is the better option. Let's take a look at what microservices, containers and Kubernetes do so we can clarify which applications mesh with these emerging technologies.
For decades, software architects and engineers have been trying to break monolithic applications into smaller, agile components. Why? Monoliths are hard to build, often inefficient and challenging to update.
Microservices and containers have emerged as building blocks for next-generation application development. The two are often paired, even though they can operate independently of each other. Kubernetes has become the standard for managing containers at scale for agile, secure software development.
While many have anointed microservices and containers as the foundation for future application design, the reality is that these ideas are not completely new. Instead, they are refinements of traditional designs. In fact, many legacy Rails/Django/Node.js applications already rely on microservices. A container is a method of packaging, deploying and running a specific Linux program/process; in that sense, containers are nearly as old as the venerable operating system itself.
What is a Microservice?
"Microservice" is a broad term for a small, independently deployable block of code that performs a specific task. For example, a business intelligence application is responsible for processes such as data ingestion, data management, reporting, sharing and dashboarding. Each functional piece can be designed as a separate microservice.
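To make the idea concrete, here is a minimal sketch of a "reporting" microservice: one small process with one narrowly scoped job, exposed over HTTP. The route, payload and service name are illustrative assumptions, not taken from any particular product, and a real service would add configuration, logging and error handling.

```python
# Minimal "reporting" microservice sketch using only the standard library.
# The /report route and its payload are hypothetical placeholders.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class ReportHandler(BaseHTTPRequestHandler):
    """Serves exactly one concern: report summaries."""

    def do_GET(self):
        if self.path == "/report":
            body = json.dumps({"rows": 3, "status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Silence per-request logging for this demo.
        pass

# Port 0 asks the OS for any free port; run the server in a background thread.
server = HTTPServer(("127.0.0.1", 0), ReportHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# A client (another service, perhaps) calls the narrow API.
resp = json.loads(urlopen(f"http://127.0.0.1:{server.server_port}/report").read())
server.shutdown()
```

Because the service does one thing behind a small API, it can be rewritten, scaled or redeployed without touching the ingestion or dashboarding pieces.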
Benefits
Microservices deliver many benefits. The design suits frequent deployments: each microservice has a narrow scope and modest system requirements, so it can be redeployed independently of the rest of the system. A monolithic application, by contrast, requires coordination among multiple teams for every deployment, and management of infrastructure resources must be centralized.
Also, modern development and testing tools feature high levels of automation, making the testing process faster, less complex and more effective. Developers run more tests, so the code becomes more robust and resilient than in a monolithic system.
Since updates occur so rapidly, pairing microservices with lightweight, portable containers makes sense. The symbiotic nature of microservices and containers enables developers to quickly provision infrastructure services, let a microservice run, then de-provision the container and retire it cleanly.
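A hedged sketch of what that pairing looks like in practice: a Dockerfile that packages a single microservice into its own image, so the whole unit can be provisioned and retired as one. The base image, file names and entry point are illustrative assumptions.

```dockerfile
# Hypothetical packaging of one Python microservice; names are placeholders.
FROM python:3.12-slim
WORKDIR /app

# Install only this service's dependencies.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# One small process per container: just this microservice.
COPY service.py .
CMD ["python", "service.py"]
```

Because the image contains nothing but the service and its dependencies, starting, replacing or discarding the container affects only that one microservice.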
Challenges
But microservices are not a panacea. In some cases, they may be overkill. If the application and development team are small and the workload isn't heavy, a more traditional monolithic application may be all that's required.
In addition, microservices add complexity to the development and management processes. Breaking software into smaller components means that more pieces need to be stitched together, so companies must plan carefully. Businesses can incur high upfront research, development and training costs, especially if the enterprise has little to no experience with the technology. In some cases, developers get carried away with microservices capabilities and over-engineer their systems.
New Management Needs Emerge
Microservices and containers create management and resource planning challenges. Developers now have to solve problems such as predicting how much computing resources each service will need, understanding how those requirements change under load, carving out infrastructure partitions and dividing them between microservices, and enforcing resource restrictions.
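In Kubernetes, that planning is typically expressed as resource requests and limits on each container. The fragment below is an illustrative sketch; the service name, image and numbers are placeholders, not recommendations.

```yaml
# Illustrative Deployment fragment; names and figures are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: reporting
spec:
  replicas: 2
  selector:
    matchLabels:
      app: reporting
  template:
    metadata:
      labels:
        app: reporting
    spec:
      containers:
      - name: reporting
        image: example.com/reporting:1.0   # placeholder image
        resources:
          requests:         # what the scheduler reserves for this container
            cpu: "250m"
            memory: "128Mi"
          limits:           # hard caps enforced at runtime
            cpu: "500m"
            memory: "256Mi"
```

Requests drive scheduling decisions (which node has room), while limits enforce the restrictions, so a misbehaving microservice cannot starve its neighbors.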
With applications changing so rapidly, automated service discovery becomes a requirement. Hard-coding IP addresses and hostnames is far too brittle and time-consuming.
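Kubernetes addresses this with Services: a Service gives a set of pods a stable, DNS-resolvable name, so clients never hard-code pod IPs. The sketch below assumes the hypothetical "reporting" service name and ports.

```yaml
# Illustrative Service; the name "reporting" and ports are placeholders.
apiVersion: v1
kind: Service
metadata:
  name: reporting
spec:
  selector:
    app: reporting    # routes traffic to pods carrying this label
  ports:
  - port: 80          # the port clients call
    targetPort: 8080  # the port the container listens on
```

Inside the cluster, clients simply call `http://reporting` (or the fully qualified `reporting.<namespace>.svc.cluster.local`), and Kubernetes keeps the name pointing at healthy pods as they come and go.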
Kubernetes and Microservices
Kubernetes was designed to share computing resources across multiple processes and has emerged as a good complement for applications composed of many microservices and containers. Kubernetes excels at dynamically allocating computing resources to meet demand, allowing organizations to avoid paying for capacity they are not using.
Kubernetes, however, is a complex technology, and many organizations have little experience with it. Handing management and deployment to a hosted Kubernetes service provider such as Amazon EKS is one option. But this approach is not always viable. For compliance and performance reasons, certain organizations need to run their own Kubernetes clusters across multiple cloud providers and enterprise data centers.
While there is no universal approach to application development, many organizations are realizing the significant cost savings and other benefits of microservices. Once an organization begins to adopt microservices, the need for containers and Kubernetes quickly follows. Because Kubernetes is a complex system that can be difficult to oversee, enterprises may consider using hosted services. When business requirements demand that an enterprise run its own Kubernetes clusters, or when it needs to publish Kubernetes applications as downloadable appliances, open source-based solutions are the best alternative.