Limits to Serverless Computing

Serverless architecture has limitations, and it is crucial to understand when to use it, how to implement it, and how to address its drawbacks in advance so that an implementation succeeds. Some of these limits include:
  • You want control of your infrastructure.
  • You’re designing for a long-running server application.
  • You want to avoid vendor lock-in.
  • You are worried about the effect of “cold start.”
  • You want to implement a shared infrastructure.
  • There are a limited number of out-of-the-box tools to test and deploy locally.
FaaS is a service in which a third-party provider manages the servers, operating systems, and scaling, so teams can deploy individual functions without provisioning or operating infrastructure of their own.
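
As a minimal sketch of what such a function looks like, here is an AWS Lambda-style handler written in Python; the event fields and return shape are illustrative and depend on the trigger you attach.

    import json

    def handler(event, context):
        # The provider invokes this function with an event and a context
        # object; the developer writes and manages no server process.
        name = event.get("name", "world")
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}!"}),
        }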

Control of Infrastructure

If you need full control of your infrastructure, a serverless architecture can feel limiting: the cloud provider owns the provisioning of the underlying hardware and operating system. Developers do, however, keep control over specific components. Within the provider's function portal you can choose the runtime, memory allocation, permissions, and timeout, which gives you a degree of control without the burden of maintenance.
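
As a sketch of the control that remains, the snippet below uses the boto3 AWS SDK to adjust an existing Lambda function's runtime, memory, and timeout; the function name is hypothetical.

    import boto3

    lambda_client = boto3.client("lambda")

    # Adjust the knobs the developer still controls. The function name
    # "orders-processor" is illustrative.
    lambda_client.update_function_configuration(
        FunctionName="orders-processor",
        Runtime="python3.12",   # chosen runtime
        MemorySize=512,         # memory in MB
        Timeout=30,             # timeout in seconds
    )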

Long-Running Server Application

Serverless architectures are designed for fast, scalable, event-driven functions, not for long-running server processes or large batch jobs. Cloud providers typically impose a timeout of a few minutes on each invocation, so a single process cannot run longer than that. The goal is instead to move from batch processing toward real-time, quick, responsive functionality, and serverless can be a good way to achieve it. For example, a travel insurance company that once sent a batch of all flights for the day to an application that monitored them can instead react to individual flight events and inform the business of delays or cancellations as they occur.
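
One way to restructure such a batch job, sketched below with the boto3 SDK, is to fan the day's flights out onto a queue so each flight is handled by its own short-lived invocation; the queue URL and flight record shape are assumptions made for illustration.

    import json
    import boto3

    sqs = boto3.client("sqs")
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/flight-events"  # illustrative

    def fan_out_handler(event, context):
        # Instead of one long-running batch job, enqueue one message per flight.
        flights = event.get("flights", [])  # e.g. loaded from the daily feed
        for flight in flights:
            sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(flight))
        return {"queued": len(flights)}

    def monitor_handler(event, context):
        # Triggered once per queued message: each invocation checks a single
        # flight, which keeps the work well under the provider's timeout.
        for record in event["Records"]:
            flight = json.loads(record["body"])
            # ... check this flight's status and notify the business if it
            # is delayed or cancelled
        return {"processed": len(event["Records"])}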

Vendor Lock-In

Vendor lock-in is a common concern for companies moving to serverless: they fear committing to AWS and then struggling to move to another provider later. There are, however, ways to make a vendor switch simpler. One popular strategy is to keep cloud-provider-specific logic out of your handler files, so the handlers become thin adapters and the business logic can be reused unchanged on another provider, as the sketch below illustrates.
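
A minimal sketch of that separation, with file boundaries shown as comments and all names illustrative: the core logic imports no cloud SDK, and each provider gets only a thin adapter.

    # core.py -- provider-agnostic business logic: no cloud SDK imports here.
    def calculate_quote(trip):
        # Illustrative pricing rule: flat rate plus a per-day charge.
        return 20 + 5 * trip.get("days", 1)

    # aws_handler.py -- thin AWS Lambda adapter around the same core logic.
    import json
    from core import calculate_quote

    def handler(event, context):
        trip = json.loads(event["body"])
        return {"statusCode": 200,
                "body": json.dumps({"quote": calculate_quote(trip)})}

    # gcp_handler.py -- thin Google Cloud Functions (HTTP) adapter.
    from core import calculate_quote

    def quote(request):
        trip = request.get_json()
        return {"quote": calculate_quote(trip)}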

Cold Start

A cold start occurs when a function takes longer to respond to an event after a period of inactivity, because the provider must spin up a fresh execution environment. To mitigate this, you can establish a scheduler that wakes the function periodically. AWS offers CloudWatch scheduled events for this, and Azure offers timer triggers. Google does not have a direct scheduler for Cloud Functions, but you can build one with App Engine Cron, which publishes to a topic that the function subscribes to.
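
A sketch of the receiving side of such a keep-warm scheme: the handler returns immediately when the event looks like a scheduled ping, so the warm-up invocations cost almost nothing. The event fields checked here are assumptions based on typical CloudWatch scheduled events; verify them for your provider.

    def handler(event, context):
        # A CloudWatch scheduled rule invokes this function every few minutes.
        # For those pings, skip the real work and just keep the execution
        # environment alive.
        if event.get("source") == "aws.events":
            return {"warmed": True}

        # ... normal event handling continues here
        return {"statusCode": 200, "body": "processed real event"}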

Shared Infrastructure

Part of what keeps serverless pricing low is that the provider hosts and maintains shared infrastructure and hardware. This can be a concern for some businesses, because your functions may run alongside functions owned by anyone else, competitors included, on the same underlying infrastructure. In practice this does not affect your code, and every tenant gets the same availability and scalability.

Limited Number of Testing Tools

The growth of serverless architectures has been held back by a shortage of out-of-the-box tools for testing and deploying locally, although tooling is improving as the field matures. Cloud providers such as Azure and AWS are expanding local testing support for serverless applications, and npm packages such as node-lambda and aws-lambda-local let you exercise functions locally without deploying them to the provider. On the deployment side, one popular option is the Serverless Framework, an open-source application framework compatible with AWS, Azure, Google, and IBM that deploys auto-scaling, pay-per-execution, event-driven functions and simplifies their configuration, shortening development time.
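
Even without dedicated tooling, a function can be exercised locally with an ordinary unit test. The sketch below fabricates an event and calls the hypothetical aws_handler from the vendor lock-in example directly, with no deployment involved.

    import json
    from aws_handler import handler  # the illustrative adapter sketched above

    def test_handler_returns_quote():
        # Fabricate the event the provider would normally deliver.
        event = {"body": json.dumps({"days": 3})}
        response = handler(event, context=None)
        assert response["statusCode"] == 200
        assert json.loads(response["body"])["quote"] == 35  # 20 + 5 * 3

    if __name__ == "__main__":
        test_handler_returns_quote()
        print("local test passed")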



Happy Exploring!
