Serverless Computing

GUPTA, Gagan · Published: September 22, 2021

The world is changing rapidly, and the technology and architecture underlying it are evolving even faster. In this article, we will discuss serverless computing.

What is Serverless Computing?

Serverless computing is a method of deploying and running code in the cloud without dealing with server provisioning and infrastructure management. Despite its name, serverless still relies on cloud or physical servers for code execution. However, developers are not concerned with the underlying infrastructure. This is left to the serverless provider which dynamically allocates the necessary compute resources and manages them on behalf of the user.

To answer the question "what is serverless?" it helps to understand how it works. In a serverless architecture, applications are distributed to meet demand and scale requirements efficiently. Customers are only billed when an application is used, so this is particularly cost-effective in environments where applications run on-demand.

AWS Lambda functions are an example of how a serverless framework works:
- Developers write a function in a supported language or runtime.
- The developer uploads the function, along with configuration describing how to run it, to the cloud.
- The platform containerizes the function.
- The platform sets up the trigger that initiates the app.
- Every time the trigger fires, the function runs on an available resource.

When an application is triggered after a period of inactivity, there can be latency as it starts up.
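The steps above can be made concrete with a minimal sketch of a Lambda-style function. The handler below follows the standard AWS Lambda Python signature; the event payload and the greeting logic are illustrative assumptions, not part of any real application.

```python
import json

def lambda_handler(event, context):
    # 'event' carries the trigger payload (e.g. from API Gateway or a queue);
    # 'context' carries runtime metadata supplied by the platform.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"})
    }
```

After uploading code like this with its configuration, the platform containerizes it and invokes `lambda_handler` each time the configured trigger fires.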

Components of Serverless Computing

There are three components of serverless computing.

1. API Gateway: This gateway acts as the mode of communication between the frontend and the FaaS (Function as a Service). It exposes API endpoints that invoke functions running the business logic. Moreover, with servers out of the equation, there is no need to deploy and manage load balancers.

2. FaaS (Function as a Service): Some common examples of FaaS are AWS Lambda, Azure Functions, Google Cloud Functions, etc. The FaaS solution is the critical enabler in serverless computing and forms the core of the deployment framework.

3. BaaS (Backend as a Service): A cloud-hosted, distributed backend (often built on a NoSQL database) that removes the burden of database administration from the developer.
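To show how the first two components fit together, here is a hedged sketch of a function handling an API Gateway proxy-style event. The `path` and `httpMethod` keys follow the shape of AWS API Gateway's Lambda proxy integration; the `/hello` route and its response body are hypothetical.

```python
import json

def lambda_handler(event, context):
    # With the proxy integration, API Gateway delivers the whole HTTP
    # request as a dict; the function plays the role of a route handler.
    path = event.get("path", "/")
    method = event.get("httpMethod", "GET")
    if path == "/hello" and method == "GET":
        status, body = 200, {"message": "Hello from FaaS"}
    else:
        status, body = 404, {"error": "Not found"}
    return {
        "statusCode": status,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }
```

The gateway maps each incoming HTTP request to an invocation like this, which is why no load balancer needs to be managed by the developer.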



Our On-Premise Corporate Classroom Training is designed for your immediate training needs


How Does Serverless Computing Compare to BaaS, PaaS, and IaaS?

As with any software trend, there is no official definition that describes what serverless is and what it is not. That's why serverless computing is often confused with other cloud services and deployment models. The concept of serverless computing revolves around two similar areas:

Backend-as-a-Service - BaaS enables developers to focus on writing frontend interfaces while offloading all backend operations to a service provider. These behind-the-scenes tasks usually involve ready-to-go user authentication, storage, database management, and hosting services. Also, developers don't have to manage servers running their backend, enabling faster app deployments.

Functions-as-a-Service - This serverless cloud service model does away with infrastructure management. The service provider deploys compute resources on demand to execute the user's code whenever an event or request triggers it. Serverless functions run in stateless containers, meaning compute resources are allocated only when the function is invoked.

The main point of confusion is between Backend-as-a-Service and Platform-as-a-Service (PaaS). The former is a building block of serverless computing, while the latter is a cloud deployment model. Even though they share some basic characteristics, PaaS does not meet the requirements of serverless.

Platform-as-a-Service - With PaaS, users rent the hardware and software solutions necessary for development workloads from a service provider for a subscription fee. It allows developers to spend more time coding without worrying about infrastructure management. On the other hand, BaaS offers additional features such as out-of-the-box user authentication, managed databases, email notifications, and the like. BaaS also allows developers to focus solely on building the frontend while integrating various backend services on demand.

Infrastructure-as-a-Service - IaaS refers to a self-service cloud solution where the provider hosts the infrastructure on behalf of the user. All server provisioning and management operations including software installation are handled by the user. Some IaaS providers also offer serverless solutions but as distinctly different products.

What are the top Serverless providers?

– AWS Lambda
– Azure Functions
– Google Cloud Functions
– Oracle Functions
– IBM Cloud Functions


Serverless: pros and cons

Pros

There are four primary advantages of serverless computing:

- It enables developers to focus on code, not infrastructure. (It's also a polyglot environment, enabling developers to code in whichever language or framework - Java, Python, Node.js - they're comfortable with.)

- When implemented correctly, serverless computing is generally very cost-effective. Pricing is per request, meaning customers pay only for the resources they use during execution.

- For certain workloads, such as ones that require parallel processing, serverless can be both faster and more cost-effective than other forms of compute. The serverless vendor handles all of the scaling on demand.

- Serverless application development platforms provide almost total visibility into system and user times and can aggregate that information systematically.
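The per-request pricing mentioned above can be sketched as back-of-envelope arithmetic. The rates below are hypothetical placeholders, not any provider's actual price list; real providers also apply free tiers and rounding rules omitted here.

```python
def monthly_cost(requests, avg_duration_ms, memory_gb,
                 price_per_million=0.20, price_per_gb_s=0.0000167):
    """Illustrative serverless cost model: a flat per-request charge
    plus a compute charge proportional to memory-seconds consumed."""
    request_charge = requests / 1_000_000 * price_per_million
    gb_seconds = requests * (avg_duration_ms / 1000) * memory_gb
    compute_charge = gb_seconds * price_per_gb_s
    return request_charge + compute_charge
```

Under these assumed rates, one million 100 ms invocations at 128 MB would cost well under a dollar, while an idle month costs nothing at all, which is the crux of the pay-per-execution advantage.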

Cons

With so much to like about serverless computing, organizations are using it for a wide variety of applications. However, there are certain applications for which serverless is not favorable, and technical and business trade-offs to consider:

- Stable or predictable workloads: FaaS and serverless workloads are designed to scale up and down perfectly in response to workload, offering significant cost savings for spiky workloads. But serverless doesn't offer these savings for workloads characterized by predictable, steady or long-running processes, and managing a traditional server environment might be simpler and more cost-effective.

- Cold starts: Because serverless architectures forgo long-running processes in favor of scaling up and down to zero, they also sometimes need to start up from zero to serve a new request. For certain applications, this delay isn't much of an impact, but for others - for example, a low-latency financial application - this delay wouldn't be acceptable.

- Monitoring and debugging: These operational tasks are challenging in any distributed system, and the move to both microservices and serverless architectures (and the combination of the two) has only exacerbated the complexity associated with managing these environments carefully.

- Vendor lock-in: Serverless architectures are designed to take advantage of an ecosystem of managed cloud services and, in terms of architectural models, go the furthest to decouple a workload from something more portable, like a virtual machine (VM) or Docker container. For some companies, deeply integrating with the native managed services of cloud providers is where much of the value of cloud can be found; for other organizations, these patterns represent material lock-in risks that need to be mitigated.

What is next for serverless?

Serverless computing continues to evolve as serverless providers come up with solutions to overcome some of its drawbacks. One of these drawbacks is cold starts.

Typically when a particular serverless function has not been called in a while, the provider shuts down the function to save energy and avoid over-provisioning. The next time a user runs an application that calls that function, the serverless provider will have to spin it up fresh and start hosting that function again. This startup time adds significant latency, which is known as a "cold start".

Once the function is up and running it will be served much more rapidly on subsequent requests (warm starts), but if the function is not requested again for a while, the function will once again go dormant. This means the next user to request that function will experience a cold start. Up until fairly recently, cold starts were considered a necessary trade-off of using serverless functions.
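The cold/warm distinction described above shows up directly in how function code is structured. In the sketch below, work placed at module scope runs once per container start (the cold start) and is reused by warm invocations; the timing bookkeeping is illustrative, and `EXPENSIVE_CONFIG` is a hypothetical stand-in for loading configuration or SDK clients.

```python
import time

# Module-scope work runs once, when the container is first spun up.
_started_at = time.time()
EXPENSIVE_CONFIG = {"loaded_at": _started_at}  # stand-in for costly setup

def lambda_handler(event, context):
    # Warm invocations reuse EXPENSIVE_CONFIG instead of rebuilding it,
    # so only the first request in a fresh container pays the setup cost.
    age = time.time() - EXPENSIVE_CONFIG["loaded_at"]
    return {"container_age_s": round(age, 3)}
```

This is why moving heavy initialization out of the handler body is a common way to soften, though not eliminate, cold-start latency.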

Conclusion

Even though the name suggests the absence of servers, serverless computing still relies on cloud or physical servers. It is a computing model that eliminates infrastructure operations, allowing developers to focus on writing and deploying apps. The serverless model revolves around two key areas: Backend-as-a-Service and Functions-as-a-Service. The former provides users with a ready-to-go backend architecture, while the latter enables running apps in stateless containers. These containers are provisioned automatically based on events or triggers. As such, serverless is not a silver bullet solution for all current development problems. It's mostly geared toward non-monolithic apps that employ a microservices-based architecture.

As more and more of the drawbacks of using serverless get addressed and the popularity of edge computing grows, we can expect to see serverless architecture becoming more widespread.

It is important to note that serverless computing is still young and unrefined, and some designs remain a better fit for conventional architectures. If you want to migrate to serverless, first validate your FaaS programs to ensure they run correctly. FaaS is a major cloud-based shift that is expected to stabilize in the coming years.




Looking forward to seeing you soon. Till then, keep learning!
