Serverless application architectures, sometimes referred to as Function as a Service (FaaS), are rapidly growing in popularity. The attraction of an application that scales dynamically with load, charges only for actual usage, and avoids the challenges of server management is undeniable. FaaS reduces some security risks, but it can also increase others. Below are several examples to consider when assessing a serverless application's security posture.
Going Serverless: Risk Reduction and Risk Transfer
- Server Configuration and Security – Even though FaaS is touted as being “serverless,” there are still servers underlying the technology. Maintaining and securing those servers and their operating systems, however, is the responsibility of the cloud provider. As with many other cloud services, the risks involved with managing the underlying infrastructure are transferred to the provider. Teams utilizing FaaS, while no longer responsible for keeping the underlying servers vulnerability-free, still own the risk of relying on the service and of trusting that the provider is managing it appropriately. Particular consideration should still be given to vulnerabilities discovered in the platforms and runtimes used.
- Persistently-Compromised Resources – It is difficult for attackers to maintain persistent access to a purely serverless environment because of the ephemeral nature of the execution environment. Many common attack scenarios involve the threat actor compromising a system within the target’s network and then using it as a pivot point to discover more about the target environment. Serverless architectures provide short-lived, temporary execution environments, so even a successful attacker is unlikely to maintain access for long.
- Scaling – Traditional web applications hosted on web servers face the challenge of scaling with the number of requests received. Cloud technologies such as AWS Auto Scaling allow more traditional web applications to adjust capacity dynamically based on load, but FaaS takes scaling to an entirely different level: each piece of the web application can scale transparently simply by invoking more serverless function instances as the number of requests grows. With serverless there is also no need to run a persistent base of web or application servers, since charges accrue only for the number and duration of actual FaaS invocations. Beware, though, of attacks that aim to drive up the monthly bill through rapid and continuous invocation of exposed functions; one mitigation is sketched below.
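One way to bound the cost of such an attack is to cap how far a single function can scale. A minimal sketch, assuming AWS Lambda and the boto3 SDK with credentials already configured; the function name "process-order" and the limit of 50 are hypothetical:

```python
# Sketch: cap how far a single Lambda function can scale, so an attacker
# hammering an exposed endpoint cannot drive the bill up without limit.
# Assumes boto3 is installed and AWS credentials are configured;
# "process-order" is a hypothetical function name.
import boto3

lambda_client = boto3.client("lambda")

# Reserve a fixed concurrency pool for the function. Invocations beyond
# this limit are throttled rather than spawning additional instances.
lambda_client.put_function_concurrency(
    FunctionName="process-order",
    ReservedConcurrentExecutions=50,
)
```

Pairing a cap like this with API-level throttling and billing alerts helps keep a flood of requests from turning into a flood of charges.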
Going Serverless: Increased Risk
- Privilege Management – The management of privileges for a serverless application is paramount. Applying the principle of least privilege to the various components of a serverless application can be a challenging endeavor, and the resulting frustration often ends in over-privileged access to resources. Done well, serverless application security can exceed that of a traditional web application because access to resources can, and as a leading practice should, be limited on a function-by-function basis; a sketch of a per-function policy follows this list. Given the complexity of the task, permissions are most successfully implemented in serverless applications when they are considered as part of the application's design and built in from the start.
- Credential Management – Managing credentials within a serverless application, such as database credentials, is another challenge. FaaS providers often define leading practices for configuring and maintaining these secrets, but it is the job of the development team to apply those practices consistently. Leveraging a service like Azure Key Vault, HashiCorp Vault, or the recently announced AWS Secrets Manager can help developers avoid embedding credentials in their application code; a retrieval example appears after this list.
- Potential Increased Attack Surface – Serverless design patterns call for web applications to be broken into constituent functions that are then reassembled in a serverless context. Applications built this way can expose a larger attack surface. Developers must make many decisions about which functions within their applications to expose to end users, and must then expose them securely and accurately.
- Increased Difficulty in Logging and Monitoring Applications – Web application logging is already a challenging area. Web applications create reams of data that must be analyzed to understand how users interact with the application, let alone to detect suspicious behavior. Serverless applications can make capturing and monitoring logs even more difficult because each constituent part of the application generates its own log messages, and wading through this sea of data can be a harrowing experience. Cloud-native logging features such as CloudWatch Logs may not currently meet the needs of some teams, so it may be necessary to consider other log aggregation or data collection tools such as the ELK stack or vendors like Datadog and Sumo Logic. Emitting structured log entries, as sketched after this list, can make that aggregation easier.
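As a rough illustration of the function-level least privilege mentioned above, the sketch below attaches an inline policy to the role backing a single function, granting only the read actions that function needs on one DynamoDB table. The role name, policy name, account ID, and table ARN are hypothetical, and the calls assume boto3 with credentials configured:

```python
# Sketch of per-function least privilege: the role backing one function is
# granted only the DynamoDB actions that function actually needs, on one table.
# Role name, policy name, account ID, and table ARN are placeholders.
import json
import boto3

iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:Query"],
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/Orders",
        }
    ],
}

# Attach the policy inline to the role used by exactly one function.
iam.put_role_policy(
    RoleName="get-order-function-role",
    PolicyName="read-orders-table-only",
    PolicyDocument=json.dumps(policy),
)
```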
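For credential management, here is a minimal sketch of retrieving a database credential at runtime from AWS Secrets Manager rather than hardcoding it. The secret name and its JSON layout are assumptions:

```python
# Sketch: fetch a database credential from AWS Secrets Manager at runtime
# instead of embedding it in code. "prod/orders-db" is a hypothetical
# secret name; assumes boto3 with credentials configured.
import json
import boto3

secrets = boto3.client("secretsmanager")

def get_db_credentials():
    response = secrets.get_secret_value(SecretId="prod/orders-db")
    # SecretString holds the JSON payload stored with the secret,
    # e.g. {"username": "...", "password": "..."}.
    return json.loads(response["SecretString"])
```

In a real function the retrieved credential would typically be cached outside the handler so it is not fetched on every invocation.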
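And for logging, one common approach is to emit structured (JSON) entries so that whatever aggregation tool is in use can filter and correlate events across many functions. A sketch assuming an AWS Lambda Python handler; the field names are illustrative, not a required schema:

```python
# Sketch: emit structured (JSON) log lines from a function handler so that
# log aggregation tools can filter and correlate events across functions.
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def handler(event, context):
    # Log one JSON object per event; field names here are illustrative.
    logger.info(json.dumps({
        "function": context.function_name,
        "request_id": context.aws_request_id,
        "event_type": "order_lookup",
        "user": event.get("user_id", "unknown"),
    }))
    # ... application logic ...
    return {"statusCode": 200}
```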
Going Serverless: Web Application Risks Remain
- Code Injection – Various types of code injection (including SQL injection) remain a risk within serverless applications. Large datasets often serve as the back ends of web applications, and attackers aim to acquire that data. No matter the architecture, secure coding practices, such as the parameterized query sketched after this list, and static/dynamic application testing are key to limiting the introduction of such vulnerabilities.
- Authentication – Properly authenticating end users and managing those authenticated sessions are often where application weaknesses exist. Serverless architectures rely heavily on session information being passed accurately between components, and web developers often struggle to handle this correctly; a token-validation sketch follows this list. Broken authentication is number two on the OWASP Top 10 list of the most critical web application security risks, and OWASP offers guidance on preventing these risks, including cheat sheets on handling authentication and session management within web applications.
- Cross-Site Scripting – Cross-site scripting takes many forms and can affect a web application regardless of the underlying architecture; encoding output, as sketched after this list, is a primary defense. Static and dynamic testing of the application can often identify this issue for prompt remediation.
- Application Flow Manipulation – Developers must take care to manage each step of their application flow to prevent attackers from skipping steps or performing them out of order; a server-side check is sketched below. Such attacks can have effects ranging from authentication bypass to exposure of sensitive data.
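To illustrate the injection point in the first item above, the sketch below contrasts string concatenation with a parameterized query. It assumes the psycopg2 PostgreSQL driver and a hypothetical orders table; other DB-API drivers differ only in placeholder style:

```python
# Sketch: parameterized query vs. string concatenation. The %s placeholder
# style matches psycopg2/PostgreSQL; other drivers use ? or :name.
import psycopg2  # assumed driver; connection details are illustrative

conn = psycopg2.connect("dbname=orders")
cur = conn.cursor()

order_id = "42; DROP TABLE orders;--"  # attacker-controlled input

# Vulnerable: input is spliced directly into the SQL text.
# cur.execute("SELECT * FROM orders WHERE id = " + order_id)

# Safer: the driver passes the value separately from the SQL text.
cur.execute("SELECT * FROM orders WHERE id = %s", (order_id,))
```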
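For authentication, here is a minimal sketch of validating a signed session token as it is passed between components, using the PyJWT library. The HS256 algorithm, the shared secret, and the claim names are assumptions; in practice the secret would come from a secrets manager:

```python
# Sketch: validate a signed session token passed between serverless
# components. Algorithm, secret handling, and claim names are assumptions.
import jwt  # PyJWT

def validate_session(token, secret):
    try:
        claims = jwt.decode(token, secret, algorithms=["HS256"])
    except jwt.ExpiredSignatureError:
        return None  # session expired; force re-authentication
    except jwt.InvalidTokenError:
        return None  # tampered or malformed token
    return claims.get("sub")  # authenticated user identifier
```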
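For cross-site scripting, the core defense is encoding user-supplied data before it is reflected into markup. A sketch using only the Python standard library; a real application would more often rely on its templating engine's auto-escaping:

```python
# Sketch: encode user-supplied data before reflecting it into HTML.
import html

def render_greeting(user_supplied_name):
    safe_name = html.escape(user_supplied_name)
    return f"<p>Hello, {safe_name}!</p>"

# "<script>alert(1)</script>" is rendered as harmless text,
# not executable markup.
print(render_greeting("<script>alert(1)</script>"))
```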
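And for flow manipulation, a sketch of enforcing step order on the server side rather than trusting the client to call functions in sequence. The step names and the idea of storing workflow state server-side between invocations are assumptions for illustration:

```python
# Sketch: reject requests that try to skip ahead in a multi-step flow.
# The checkout steps are hypothetical; current_step would be stored
# server-side (e.g., keyed by session) between function invocations.
ORDERED_STEPS = ["cart", "payment", "confirmation"]

def advance_step(current_step: str, requested_step: str) -> str:
    """Allow only the immediate next step in the flow."""
    if current_step not in ORDERED_STEPS or requested_step not in ORDERED_STEPS:
        raise ValueError("unknown workflow step")
    if ORDERED_STEPS.index(requested_step) != ORDERED_STEPS.index(current_step) + 1:
        raise PermissionError("step skipped or performed out of order")
    return requested_step
```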
Serverless architectures are an exciting way to approach web applications. Developers can rapidly build and deploy new capabilities without having to set up the underlying infrastructure. But security risks remain, and teams must be aware of the potential pitfalls when leveraging serverless platforms.