Are You Addressing These Serverless Challenges?
March 04, 2019
Nearly half of IT decision makers are either using or evaluating serverless, and adoption is growing more than 2.7 times as quickly as container adoption. Because serverless is still relatively new, there are plenty of misconceptions about what it is and what it means. Today we're unpacking these serverless myths one by one.
Serverless isn't some radical new approach to software development, but rather the next evolutionary step in cloud computing. Much like how containers abstracted away the underlying OS, serverless abstracts away the underlying environment. This lets developers focus less on environment maintenance and more on application development.
The difference now is that the hardware and operating system layers are handled by Function-as-a-Service (FaaS) providers. This could reduce the workload on operations engineers and make applications easier to develop, but it's not a radical change from containers.
Starting a serverless function from scratch (a "cold start") is relatively expensive and time-consuming. To cut down on cold starts, many FaaS providers keep functions running ("warm") for several minutes after the last request. How long a function stays warm varies by provider, ranging from minutes to an hour depending on conditions.
Developers can't always assume that each request runs on a fresh instance of a function. Any state the function creates (global variables, temporary files, open connections) may persist and be reused across multiple requests. The benefit is that subsequent requests complete faster, but it also means functions need to be written as if they were completely stateless. So the next time you invoke a function, keep in mind that its environment may have been running for quite some time.
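To see why statelessness matters, here's a minimal sketch (the handler name and return shape are hypothetical, not any provider's API): module-level state is initialized once per cold start and then survives every request served by the same warm container.

```python
# Module-level state: initialized once per cold start, NOT once per request.
invocation_count = 0

def handler(event, context=None):
    """Hypothetical FaaS entry point demonstrating warm-container reuse."""
    global invocation_count
    invocation_count += 1
    # On a warm container, invocation_count climbs past 1 even though
    # each request looks independent to the caller. Relying on state
    # like this (counters, caches, temp files) is a common serverless bug.
    return {"invocations_in_this_container": invocation_count}
```

Running the handler twice in the same process mimics two requests hitting the same warm container: the second call observes state left over from the first.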
Recurring tasks like cron jobs are a perfect fit for serverless. They are predictable, short-lived, and aren't as affected by cold starts. However, this doesn't mean serverless functions are only for running cron jobs. Serverless functions are commonly used to respond to events, send messages, process data, and, with Amazon API Gateway introducing support for WebSockets, run user-facing web apps. Whether serverless fits an application depends on that application's architecture rather than on limitations in serverless itself.
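A cron-style function can be sketched like this (the handler name and the commented-out cleanup call are hypothetical; the trigger would be a scheduled event from the platform's timer service, not shown here). The pattern is the same regardless of provider: do one short, predictable unit of work per invocation and exit.

```python
import datetime

def scheduled_cleanup(event, context=None):
    """Hypothetical cron-style handler invoked on a fixed schedule."""
    # Compute a retention cutoff; everything older would be purged.
    cutoff = datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(days=30)
    # purge_records_older_than(cutoff)  # assumed application logic, not shown
    return {"purged_before": cutoff.isoformat()}
```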
Just because cloud service providers manage serverless platforms doesn't mean DevOps teams are completely off the hook. Serverless follows the shared responsibility model, where FaaS providers secure the resources used to run cloud workloads and customers secure the workloads themselves. With serverless functions, DevOps and Security teams must now focus on vulnerabilities in application code and its dependencies.
Jeremy Daly, CTO of AlertMe and serverless veteran, explains "[serverless] introduces additional complexities in how we manage security and maintain our applications. [OWASP's Top Ten security risks] are still relevant and may even be harder to detect." Developers need to understand and account for the new structure of serverless applications when securing their applications.
The increased connectivity of serverless functions also puts applications at risk. Serverless functions connect to and receive data from multiple sources, including HTTP requests, other functions, and message queues. This significantly increases the number of vectors attackers can use, and without insight into the underlying OS, engineers need to be extra careful in how they secure their applications.
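Because a single function may be fed by HTTP requests, queues, and other functions, every payload should be treated as untrusted. A minimal sketch of that discipline, with a hypothetical event shape (`order_id`, `quantity`) chosen for illustration:

```python
def parse_order_event(event):
    """Hypothetical validation helper: reject malformed payloads
    before any business logic runs, regardless of the event source."""
    if not isinstance(event, dict):
        raise ValueError("event must be a JSON object")
    order_id = event.get("order_id")
    if not isinstance(order_id, str) or not order_id.isalnum():
        raise ValueError("order_id must be an alphanumeric string")
    quantity = event.get("quantity")
    if not isinstance(quantity, int) or isinstance(quantity, bool) or quantity <= 0:
        raise ValueError("quantity must be a positive integer")
    return {"order_id": order_id, "quantity": quantity}
```

Centralizing validation like this means the same checks apply whether the event arrived over HTTP, from a queue, or from another function.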
Serverless can be less expensive than running a traditional application, but that doesn't mean it will be. A lot of factors go into the cost of serverless, including the number of requests, how long each invocation runs, and how much memory is allocated to each function.
Going serverless can also have ancillary costs resulting from logging, monitoring, routing, and transferring data. For applications built on AWS Lambda, Lambda itself can account for as little as 2% of the total hosting costs. While serverless isn't always the cheaper option, the real cost savings will come from a lower TCO, smaller operations teams, and faster time-to-market.
Serverless architectures are designed for scalability. The fact that serverless functions are small and independent means they can scale up or down on-demand without consuming unnecessary resources and without affecting other functions. For large, unpredictable workloads, functions offer a level of speed and flexibility that even containers can't match.
For instance, the consumer robotics company iRobot uses serverless functions to schedule cleaning jobs, run unit tests, and deploy software updates to over 20 million robots worldwide. The system is completely event-driven and is hosted on AWS Lambda, making it much more flexible than a monolithic or even container-based application.
Securing serverless applications poses two challenges: engineers can't control the operating environment, and basic best practices (such as granting least privilege to IAM roles) are no longer adequate. Engineers need to approach functions as discrete operational units instead of one complete application and secure them as such. Coming from a monolithic or even container-based application, engineers might find this difficult.
The key difference from other architectures is that each function is a potential entry point for attackers. As a result, each function needs to be secured, assigned limited privileges, and tested independently. Teams still need to rely on the FaaS provider to report metrics and log events, but they're ultimately responsible for protecting their app against cyber threats.
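Treating each function as its own operational unit means giving each one its own narrowly scoped role. The sketch below builds an illustrative least-privilege IAM-style policy as a Python dict (the account ID, queue, and table ARNs are hypothetical): this function can only read from one queue and write to one table, nothing else.

```python
import json

# Illustrative least-privilege policy for a single function that
# consumes one queue and writes one table. All resource ARNs below
# are hypothetical placeholders.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["sqs:ReceiveMessage", "sqs:DeleteMessage"],
            "Resource": "arn:aws:sqs:us-east-1:123456789012:orders-queue",
        },
        {
            "Effect": "Allow",
            "Action": ["dynamodb:PutItem"],
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/orders",
        },
    ],
}

print(json.dumps(policy, indent=2))
```

Scoping each function's role to the exact resources it touches limits the blast radius if that one function is compromised, in contrast to sharing a single broad role across the whole application.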
Serverless and FaaS are often used interchangeably, but they're two different concepts. FaaS is an application design pattern where functions are triggered by events such as HTTP requests. Serverless is an architectural design pattern where the underlying OS and software are abstracted away. FaaS is one implementation of a serverless architecture, alongside data storage, analytics, messaging services, and more.
The switch to serverless has been incremental, and the debate between serverless and containers is far from over. Still, serverless offers real benefits in how organizations build applications. Unpacking each of these myths should help enterprises weighing whether serverless is the right next step.