Microservices and Serverless Computing

Lee Hylton / February 25, 2020

In talking about serverless computing, we should first dispense with a common source of confusion: the word “serverless.” Despite what the name suggests, it does not mean there are no servers. It means you don’t have to own any servers, and, beyond that, you don’t have to provision or configure the servers you use.

Though this may sound like ordinary cloud computing, serverless computing takes the model a step further. Serverless, as it is commonly shortened, does not consume the always-on cloud server instances you’ve purchased; rather, your cloud provider spins up short-lived instances, on an as-needed basis, to invoke fine-grained computing tasks that you define.

To see how we get to that point in the evolution of cloud computing, let’s discuss it in the context of current approaches.

A short refresher on cloud computing

The cloud computing paradigm has accustomed enterprises to outsourcing the IT infrastructure they use. They can rent computing cycles, storage and application software as utilities from a cloud provider, on a pay-for-use basis. In this model, users benefit from the elasticity offered by cloud-based solutions, but still take on certain responsibilities depending on which “level” of cloud resources is chosen: “bare metal”, infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS), or software-as-a-service (SaaS). The cloud user and cloud provider share varying responsibilities, as the following figure shows.

The cloud provider assumes responsibility for running and maintaining the infrastructure for everything below the dark lines in the figure. At the extreme left is the non-cloud solution, where the enterprise data center hosts and maintains the entire IT stack. Moving to the right, an Infrastructure as a Service (IaaS) offering relieves the enterprise of buying and maintaining the computing hardware, but still requires it to deploy, run and maintain the OS, middleware and applications. Next, a Platform as a Service (PaaS) solution leaves the enterprise to maintain and run only the applications of its choice, on a computing infrastructure wholly owned by the cloud provider. Finally, at the far right is a Software as a Service (SaaS) solution, where even the applications are hosted by the cloud provider.

Many enterprises have found a sweet spot in the PaaS deployment model, where they can maintain control over their applications while taking advantage of the metered, elastic scaling of the underlying cloud services. The trend now is to design new applications to be cloud native, capable of running on clouds offered by different providers. This is achieved by using virtual machines and/or containers, a topic we’ll discuss in a later blog post. Cloud providers have obliged by evolving the PaaS model to provide container support for their customers.

For this post, though, we want to explore how applications in a PaaS deployment can be structured as microservices and work in a serverless environment — where parts of running the application are outsourced to the cloud provider.

Microservices

The goal of shorter time to market and greater agility in software development has led to the creation of many computing paradigms: client-server remote procedure calls, message-based middleware, service-oriented architectures, and so on. Microservices is a step in that evolution and the latest attempt to make software both modular and reusable.

Microservices is a software development paradigm that breaks up an application’s functionality into small parts, so that each part can be developed, deployed and maintained independently, ideally by small teams using the technology, language and framework of their choice. To fulfill this promise, each microservice must be well chosen, with the right level of granularity to perform a clearly defined task that other microservices can reuse as they work together to form the application. Microservices communicate using language-neutral APIs and standard protocols such as HTTP.
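
To make that concrete, here is a minimal sketch, in Python, of what one such service might look like: a hypothetical product-catalog microservice that exposes a single HTTP/JSON endpoint. The route, the sample data and the port are illustrative assumptions, not a prescription.

    # A minimal sketch of a single microservice: a hypothetical product-catalog
    # service exposing one language-neutral HTTP/JSON endpoint.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Stand-in for this service's own data store (illustrative data only).
    CATALOG = {"sku-123": {"name": "Coffee Mug", "price_usd": 12.50}}

    class CatalogHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Expect paths like /products/sku-123
            parts = self.path.strip("/").split("/")
            if len(parts) == 2 and parts[0] == "products" and parts[1] in CATALOG:
                body = json.dumps(CATALOG[parts[1]]).encode()
                self.send_response(200)
            else:
                body = json.dumps({"error": "not found"}).encode()
                self.send_response(404)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Each microservice runs and scales independently; here it just listens locally.
        HTTPServer(("localhost", 8080), CatalogHandler).serve_forever()

Run locally, a request to http://localhost:8080/products/sku-123 returns the JSON record. The important point is that the service owns its own data and talks to the rest of the system only through this API.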

A simple example might be an e-commerce application, where the end-user app, authentication, product catalog, product search, inventory database queries and updates, and shopping cart are each defined as a microservice, with a choreography between these services to fulfill a particular user request.
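
The sketch below shows how one such user request might be choreographed across those services. The service URLs, paths and payloads are assumptions for illustration; in practice each would be a separately deployed microservice reachable over HTTP.

    # A sketch of choreographing one "place order" request across hypothetical
    # auth, inventory and shopping-cart services. All endpoints are made up.
    import json
    import urllib.request

    def call(url, payload=None):
        """POST JSON if a payload is given, otherwise GET, and decode the JSON reply."""
        data = json.dumps(payload).encode() if payload is not None else None
        req = urllib.request.Request(url, data=data,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())

    def place_order(user_token, sku, quantity):
        # 1. The authentication service validates the user.
        user = call("http://auth.internal/validate", {"token": user_token})
        # 2. The inventory service confirms stock before the sale.
        stock = call(f"http://inventory.internal/items/{sku}")
        if stock["available"] < quantity:
            return {"status": "out_of_stock"}
        # 3. The shopping-cart service records the purchase.
        return call("http://cart.internal/checkout",
                    {"user_id": user["id"], "sku": sku, "quantity": quantity})

Each of those calls crosses a network boundary, which is why the granularity discussion that follows matters: every extra hop adds latency.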

Designing a microservice-based application is an art of choosing the right level of granularity when deconstructing an application’s functionality. Too many very small microservices require too many interactions between them, leading to higher latency, while too few coarse-grained services create stronger coupling between the parts and reduce the benefits of smaller, agile and independent development teams. Done right, microservices add a valuable tool to an enterprise’s software development arsenal.

Serverless computing

Using the elasticity of cloud computing does not relieve the cloud user of decisions about how much computing power to purchase. Depending on the application’s traffic pattern, you still have to plan for how many server instances you might need, based on your anticipated peak usage. Waiting to spin up new servers until demand spikes doesn’t work, because the latency of that process can hurt user experience (and your revenues). On the flip side, when there is no traffic spike, many server instances lie idle while still being paid for.

Serverless addresses the problem of over-provisioning and underutilization, especially for computational tasks that are high volume, short lived and bursty. With serverless, you hand off such tasks to the cloud provider. The key to understanding where serverless works best is a well-defined API to a microservice that does one specific task: it takes an input, performs a computation and returns a result, much like a function call. That’s why serverless is often called Function-as-a-Service (FaaS). It allows individual tasks (function calls) within an application in a PaaS environment to be outsourced to the cloud provider.
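
Here is a minimal sketch of what a FaaS function looks like in Python: a stateless handler that takes an input event, performs one well-defined computation and returns a result. The handler signature follows the AWS Lambda Python convention as one example; other providers use a similar shape, and the tax rate is a made-up value.

    # A stateless FaaS-style handler: input event in, computed result out.
    def handler(event, context):
        # The event carries the function's input, e.g. {"prices": [12.5, 3.0]}.
        prices = event.get("prices", [])
        # The computation itself: total the cart and apply a flat 8% tax
        # (an example rate, not a real one).
        subtotal = sum(prices)
        return {"subtotal": subtotal, "total": round(subtotal * 1.08, 2)}

Because the function keeps no state of its own between invocations, the provider is free to run as many or as few copies as incoming traffic requires.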

Let’s say you need to check the inventory level of a best-selling product in your catalog as part of serving a customer. You offload this frequently invoked “read()” function call to your cloud provider, without needing to choose or create a server instance to run it, and you pay only for the number of times you invoke the function. The cloud provider dynamically manages the underlying resources, leaving the developer free of concerns about idle capacity, peak loads and other server utilization issues. In that sense, the computation is “serverless.” Under the hood, the cloud provider typically runs each function in its own short-lived container, discarding it after the code completes execution.
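
A hedged sketch of that inventory “read()” as a FaaS handler might look like the following. The event shape and the in-memory dictionary standing in for a managed datastore are assumptions; the point is that no server instance is provisioned and billing is per invocation.

    # Hypothetical inventory "read()" written as a FaaS handler.
    def read_inventory(event, context):
        sku = event["sku"]
        # In a real deployment this would query a datastore managed by the
        # cloud provider; a dict stands in for it here.
        inventory = {"sku-123": 42}
        # The container running this code is created on demand and discarded
        # afterwards, so nothing should rely on local state between calls.
        return {"sku": sku, "available": inventory.get(sku, 0)}

The same handler could be wired to an HTTP endpoint, a message queue or a database trigger, depending on which event sources the provider supports.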

Serverless is not a magic bullet for every computing situation. It is cloud provider-specific: all the leading clouds offer the capability for a range of languages and runtimes, but the resulting vendor lock-in is a concern for some. For well-designed microservices, however, a serverless solution can cut operational costs and speed up time-to-market for new features that reuse those microservices.

How Blue Sentry Can Help

Today, Blue Sentry is helping companies move application development to the cloud with our Kubernetes DevOps Accelerator, a turnkey solution in which we architect, build, and deploy a customized, enterprise-grade environment for microservices adoption. We also provide full infrastructure management, including security, monitoring, disaster recovery, and scalable DevOps support, to ensure continuous uptime, continuous delivery, and the productivity of your team.

Companies embracing microservices are transforming IT into lean teams aligned with business priorities. Blue Sentry will help you ease the process and pave the way to cloud-native development success.

Getting started is easy, and you will see returns in less than 30 days.

Want to get started? Click here to reach out.