Serverless and Containers: Not an Either/Or Decision

Containers and serverless computing are revolutionizing how cloud apps are built, and both are quickly moving into developers’ toolboxes. This has led to the hotly debated question: Which is better to use when creating an app?

This is the wrong question to ask. Instead, one should ask when to use serverless or containers for a given workload or use case within an application. Containers and serverless are two sides of the same coin, and together they accomplish more than either does alone.

First, a bit of background on each. Containers, which package up applications and their data so they can move across platforms and systems, are quickly becoming the underlying building blocks of the cloud. Container architectures provide a consistent means to package, secure, distribute and manage the life cycle of applications across environments. This eliminates the need to start from scratch if a team wants to run an app in more than one location.

Serverless has transformed how cloud services are used. One application of serverless, functions as a service (FaaS), is a programming model for building event-driven apps. Think of it as invisible glue, binding together relevant events and actions within apps to trigger infrastructure and services. For example, FaaS can call on an AI visual recognition service when an image is uploaded to a mobile app, drawing intelligence from it to improve the user experience. Because that AI tool is used only when needed, it doesn’t run constantly, which would consume enormous resources and add operational complexity.
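The event-driven pattern described above can be sketched as a small handler. This is a minimal illustration in Python, not any particular FaaS platform’s API: the event fields and the recognition service are hypothetical, with the AI call stubbed out in place of a real cloud service.

```python
# Sketch of an event-driven FaaS handler (hypothetical event shape).
# On a real platform, the runtime would invoke handle_upload automatically
# whenever an image-upload event fires, then tear the function down again.

def classify_image(image_url):
    """Stand-in for a visual recognition service call."""
    # A real implementation would send the image to a recognition API
    # and return the labels it detects.
    return {"labels": ["sunset", "beach"], "source": image_url}

def handle_upload(event):
    """Runs once per upload event -- only when needed, never constantly."""
    image_url = event.get("image_url")
    if image_url is None:
        return {"status": "ignored", "reason": "no image in event"}
    result = classify_image(image_url)
    # Downstream, these labels could feed search indexes or user feeds.
    return {"status": "processed", "labels": result["labels"]}
```

Because the function exists only for the duration of the event, the heavy recognition service is paid for per invocation rather than kept running around the clock.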

Working together on the cloud, containers and serverless can be leveraged for different components and interactions within the app, depending on where they fit best. This allows developers to worry less about overall infrastructure, and opens up access to powerful cloud services such as AI, blockchain, IoT and quantum computing.

Evolving the Cloud Stack

The rise of cloud has accelerated the pace at which software is built and delivered, as well as how easily companies experiment with new technologies.

Specifically, we’re seeing two trends. First, there is a demand to decompose large applications into smaller parts that can be updated quickly and independently of the rest of the application. Second, teams are becoming smaller and more agile so they can operate independently and deliver ideas to the market faster.

Containers and serverless are technologies that enable this new era of development. Developers only have to worry about the code they are writing within one element of an app, or in response to an action or event. This minimizes concerns around infrastructure and frees time to focus on delivering code. When used on the right platform, containers and serverless also bring scalability, load balancing and high availability, laying the groundwork for apps to roll out changes globally without any downtime.

The Harmonious Relationship Between Containers and Serverless

Developers today are dealing with increasingly challenging workloads. This could include feeding datasets into AI, or ensuring a data-intensive machine learning function has access to GPUs. Together, containers and serverless can help solve some of these pain points.

Consider an example in fintech. A financial services company could use AI to build a new risk management app, which evaluates investment options for customers. Using machine learning on a foundation of containers, the development team can build algorithms that analyze an investment’s performance and its associated risk, based on live data streaming in from current markets and historical market data. Prompted by serverless functions, these algorithms then could be fed back into risk models when needed, helping customers evaluate the potential of an investment.

So why do containers make sense for one part of the app and serverless for others? Machine learning (ML) models are stateful, meaning they need to keep track of data to learn and improve. ML services generally need special hardware such as GPUs, which can be accessed through containers. The risk evaluation and user interface services of the app, by contrast, only need the most current intelligence on demand. This makes them a perfect fit to be written as serverless functions, and it enables the microservices behind these parts of the app to scale up or down as information is needed.
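The split described above can be made concrete with a short sketch. Here, a hypothetical `model_service` stands in for the long-running, containerized, GPU-backed model, while `evaluate_risk` is the stateless serverless function that fetches a score on demand; all names and the scoring formula are illustrative assumptions, not the app described in the article.

```python
# Sketch of the container/serverless split: a stateful model service
# (containerized, always warm) versus a stateless evaluation function
# (serverless, scales to zero between requests). Names are hypothetical.

def model_service(portfolio):
    """Stand-in for a containerized, GPU-backed ML model endpoint."""
    # A real service would hold trained model state and serve
    # predictions over the network; here we use a toy score.
    return sum(p["amount"] * p["volatility"] for p in portfolio)

def evaluate_risk(event):
    """Stateless serverless function: fetch a score, apply a threshold."""
    score = model_service(event["portfolio"])
    # Holding no state of its own, this function can be created and
    # destroyed per request without losing anything.
    return {"risk": "high" if score > 1000 else "low", "score": score}
```

The design point is that only `model_service` needs to stay resident (it owns the state and the special hardware); everything calling it can remain ephemeral.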

Additionally, building the app’s foundation on containers means it can be delivered securely and consistently across countries. A fully managed container service in the cloud can verify that each instance is patched against emerging vulnerabilities and keep certain datasets within geographic borders. This is especially important given data compliance requirements such as the EU’s General Data Protection Regulation (GDPR).

The Future of Containers and Serverless 

Many developers approach serverless versus containers as a question of what is best for the foundation of an app, but we need to reshape this into what is ideal for different parts of one well-running experience.

Developers need a range of tools in their toolbox for developing applications, just as carpenters need different tools for specific tasks. A carpenter wouldn’t use a hammer to drive a screw into a board: just because a screw sort of looks like a nail doesn’t mean you should hit it with a hammer. Adopting cloud has never been about restricting ourselves to one technology, and we shouldn’t start now. Instead, we must continue to design cloud platforms that embrace and encompass the benefits of emerging technologies.

Daniel Berg

Daniel is an IBM Distinguished Engineer responsible for the container and service mesh technical strategy within IBM Cloud. He has direct responsibility for the technical architecture and delivery of the IBM Cloud Kubernetes Service, which provides managed Kubernetes clusters worldwide. Daniel has deep knowledge of container technologies, including Docker and Kubernetes, and has extensive experience building and operating highly available cloud-native services. Daniel is a member of the Technical Oversight Committee for the open source Istio service mesh project and is responsible for driving the technical integration of Istio into IBM Cloud.