AWS Solutions Architect Associate Certification
Designing for Security
Turning up Security on Compute Services Part 6
In this lesson, we dive into the serverless domain with a focus on AWS Lambda. Introduced in 2014, Lambda has quickly become a key component in modern applications by enabling serverless computing and event-driven architectures.
Consider the following architecture diagram. A CloudFormation stack creates multiple resources where a user interacts with APIs (for product, basket, and order management) through an API Gateway. Depending on the URL or endpoint accessed, the gateway routes the request to the corresponding Lambda function, which communicates with its dedicated DynamoDB backend. Within this microservices setup, individual functions (e.g., product handling, basket management, ordering) trigger events in EventBridge or dispatch messages to SQS queues. The diagram illustrates a straightforward microservices application powered predominantly by AWS Lambda.
Key Concept
AWS Lambda simplifies deployment and management by handling much of the heavy lifting such as infrastructure management, allowing you to focus on your application's business logic.
Lambda in Media Applications
Consider a media company using Lambda to process and transcode large video files. One major challenge is ensuring that Lambda functions complete within the maximum execution time of 15 minutes (raised from the original 5-minute limit). When transcoding jobs risk exceeding this limit, several strategies can be applied:
- Increase the Lambda memory allocation (which boosts CPU and memory but may not guarantee completion).
- Split the video file into smaller chunks, processing each chunk with a separate Lambda invocation.
- Set the Lambda function timeout to the maximum allowed limit and hope for timely completion (this is not advisable).
- Store video files in Amazon S3 and trigger processing via S3 events (useful but does not guarantee completion within 15 minutes).
The optimal strategy is to split the video into smaller chunks, ensuring individual invocations complete within the execution window.
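The chunk-splitting strategy can be sketched as a small planning step: given the video length and the largest segment one invocation can safely transcode, compute the segment boundaries to fan out to parallel invocations (illustrative helper, not an AWS API):

```python
# Sketch: plan video chunks so each Lambda invocation stays well under
# the 15-minute execution limit. Durations are in seconds; the function
# name and sizes are illustrative, not part of any AWS API.

def plan_chunks(video_duration_s: int, max_chunk_s: int = 600) -> list[tuple[int, int]]:
    """Return (start, end) second offsets covering the whole video."""
    chunks = []
    start = 0
    while start < video_duration_s:
        end = min(start + max_chunk_s, video_duration_s)
        chunks.append((start, end))
        start = end
    return chunks

# A 25-minute (1500 s) video split into 10-minute segments:
print(plan_chunks(1500))  # [(0, 600), (600, 1200), (1200, 1500)]
```

Each (start, end) pair would then be passed as the payload of a separate Lambda invocation, keeping every invocation comfortably inside the execution window.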
AWS Lambda Shared Responsibility
AWS Lambda is a fully managed service. As with other managed services, AWS shoulders much of the work around data encryption, integrity, authentication, application management, internet access, and monitoring/logging, minimizing the customer's responsibilities in those areas. The diagram below illustrates the shared responsibility model for Lambda:
For example, a fintech startup building a serverless application to handle sensitive financial transactions must focus on implementing proper IAM roles and permissions for their Lambda functions. Customers are responsible for access control configurations while AWS manages physical security and infrastructure updates.
Best Practices for Environment Variables
Lambda lets you set environment variables (e.g., database host, database name) on each function. However, it is best not to place sensitive information directly in these variables. Instead, consider retrieving sensitive data securely at runtime from services like AWS Secrets Manager or the Systems Manager Parameter Store. Even though many databases now offer IAM integration, meaning permissions are defined in the Lambda execution role, storing sensitive data directly in environment variables should still be avoided.
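The retrieve-at-runtime pattern can be sketched as follows. To keep the example self-contained, the fetcher is injectable rather than a real AWS call; in an actual Lambda function it would wrap `boto3.client("secretsmanager").get_secret_value(...)`, and the secret would be cached per container so warm invocations skip the network round trip:

```python
# Sketch: resolve secrets at cold start instead of baking them into
# environment variables. The fetcher is a stand-in for Secrets Manager.

from typing import Callable

_cache: dict[str, str] = {}

def get_secret(name: str, fetch: Callable[[str], str]) -> str:
    """Fetch a secret once per container and reuse it across invocations."""
    if name not in _cache:
        _cache[name] = fetch(name)
    return _cache[name]

# Stub fetcher standing in for a Secrets Manager lookup:
calls = []
def fake_fetch(name: str) -> str:
    calls.append(name)
    return f"value-of-{name}"

print(get_secret("db-password", fake_fetch))  # value-of-db-password
print(get_secret("db-password", fake_fetch))  # served from cache
print(len(calls))  # 1 -- the backing store was only hit once
```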
For instance, a healthcare company processing patient data with Lambda might choose to use encrypted environment variables to securely manage database credentials and API keys. Lambda’s built-in encryption mechanism ensures that stored data is encrypted at rest, providing an additional layer of security.
Securing Lambda Execution and Permissions
It is critical to assign minimal necessary permissions using IAM roles when configuring Lambda functions. Never embed access keys or credentials directly in your code. The recommended approach is to create a role with the least privileges required and associate it with your Lambda function. Enhance your Lambda’s security by enabling monitoring and logging with CloudWatch, CloudTrail, and X-Ray. The use of AWS X-Ray provides detailed tracing of individual components during function execution.
For example, a fintech company must configure roles with only the necessary permissions to access AWS services like S3 or DynamoDB, rather than using excessive privileges.
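Such an execution role might carry a policy like the following sketch, where the bucket name, table name, account ID, and region are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::example-media-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": ["dynamodb:GetItem", "dynamodb:PutItem"],
      "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/ExampleOrders"
    }
  ]
}
```

Note that each statement names specific actions and specific resources; there are no wildcards granting service-wide access.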
Leveraging AWS X-Ray and GuardDuty
Integrate AWS X-Ray with your Lambda functions by ensuring that the execution role has permission to send trace data. For basic invocation tracing there is no need to manually embed the X-Ray SDK in your code; Lambda handles much of the integration automatically. This native capability offers an end-to-end view of your application's performance and activity.
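Assuming a SAM-based deployment, active tracing can be enabled declaratively; the resource name, handler, and runtime below are placeholders:

```yaml
# Hypothetical SAM fragment: turn on X-Ray active tracing and grant
# the execution role permission to send trace data.
MyFunction:
  Type: AWS::Serverless::Function
  Properties:
    Handler: app.handler
    Runtime: python3.12
    Tracing: Active                  # X-Ray traces every invocation
    Policies:
      - AWSXRayDaemonWriteAccess     # managed policy for PutTraceSegments
```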
Additionally, Amazon GuardDuty analyzes data sources such as CloudTrail events, VPC flow logs, and DNS logs to detect potential threats. GuardDuty's protection extends to Lambda as well, flagging security anomalies such as suspicious network activity from your functions.
AWS Step Functions: Orchestration and Security
AWS Step Functions provide orchestration for workflows by coordinating multiple AWS services. Think of these as the modern evolution of the Simple Workflow Service (SWF). For example, a media company might utilize Step Functions to manage a video processing pipeline, where the state machine handles the sequence of tasks.
When designing secure state machines, avoid storing sensitive data directly within the state machine. Instead, implement roles with fine-grained permissions for each state to ensure that only authorized services are accessed.
For financial transactions and other event-driven architectures, consider using Amazon EventBridge to capture external events and trigger the appropriate Step Functions state machine. This push model is both reliable and cost-effective compared to continuous polling.
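As a sketch, an EventBridge rule targeting a state machine might look like the following CloudFormation fragment (the event source, detail type, and resource names are hypothetical):

```yaml
# Hypothetical CloudFormation fragment: push payment events from
# EventBridge into a Step Functions state machine, with no polling.
PaymentEventsRule:
  Type: AWS::Events::Rule
  Properties:
    EventPattern:
      source: ["example.payments"]
      detail-type: ["TransactionReceived"]
    Targets:
      - Arn: !Ref PaymentStateMachine          # state machine ARN
        Id: StartPaymentWorkflow
        RoleArn: !GetAtt EventsInvokeRole.Arn  # role limited to states:StartExecution
```

The invocation role attached to the rule needs only `states:StartExecution` on that one state machine, keeping the event path least-privileged end to end.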
AWS Serverless Application Model (SAM)
The Serverless Application Model (SAM) simplifies the deployment and management of serverless resources. SAM leverages AWS IAM for permission management, and many security features—such as data encryption, logging, and monitoring—are inherited from AWS services. For example, a healthcare startup managing patient records via SAM should avoid embedding sensitive configuration data directly into SAM templates. Instead, use SAM’s support for fine-grained IAM permissions to secure each resource.
When developing with the SAM CLI, ensure that your local testing environment mirrors production closely. This reduces the risk of deployment issues by preventing direct development-to-production pushes without proper staging and testing.
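For example, SAM's policy templates let you grant each function only the access it needs; in this hypothetical fragment, a function gets read-only access to a single DynamoDB table:

```yaml
# Hypothetical SAM fragment: scope a function to one table using a
# SAM policy template instead of broad, account-wide permissions.
GetPatientRecord:
  Type: AWS::Serverless::Function
  Properties:
    Handler: records.handler
    Runtime: python3.12
    Policies:
      - DynamoDBReadPolicy:            # SAM policy template (read-only)
          TableName: !Ref PatientRecordsTable
```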
AWS Serverless Application Repository (SAR) and Logging
The Serverless Application Repository (SAR) is a managed repository of prebuilt serverless applications and solution patterns. When deploying applications from SAR, especially in sensitive environments like healthcare, thoroughly review the source code, documentation, and associated permissions to ensure compliance with industry regulations. It is also crucial to log all activities for monitoring and auditing using CloudWatch, CloudTrail, and AWS Config.
AWS Amplify: A Managed Deployment Option
AWS Amplify offers a simplified deployment experience, especially for web and mobile applications backed by Lambda functions. Amplify automatically secures data both in transit and at rest, providing significant security advantages over self-managed systems. Additionally, its native integration with CloudWatch facilitates comprehensive logging of user activity and error metrics.
For startups developing mobile e-commerce apps, using AWS Amplify with its built-in CloudWatch logging ensures that user activities and errors are monitored effectively, enhancing overall application security.
Hybrid Computing with AWS Outposts
Hybrid computing integrates on-premises hardware with AWS services. AWS Outposts extends AWS infrastructure into your data center, allowing you to run AWS services locally while maintaining secure connectivity with the cloud through Direct Connect or VPN. Outposts requires connectivity for updates and configuration changes, ensuring continuous integration with AWS security tools.
For instance, when deploying a Kubernetes cluster with Amazon EKS on Outposts, it is vital to use IAM for worker nodes, ensuring that the nodes only access necessary AWS services securely.
When securing EKS on Outposts, follow these best practices:
- Use IAM roles that grant least-privilege permissions.
- Avoid creating full administrative users.
- Prevent storing credentials on hardware.
Furthermore, secure IAM integration on Outposts by assigning roles with specific, least-privilege permissions for each resource rather than relying on broad administrative rights.
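For instance, a worker-node role typically begins with a trust policy that lets EC2 instances assume it; only the specific permissions the nodes need are then attached, rather than any administrative policy:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```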
ECS, EKS Anywhere, and Container Management
For organizations looking to extend container workloads on-premises, ECS and EKS Anywhere offer integrated solutions. These services allow deployment and management of containers in traditional data centers while maintaining AWS-level security and IAM integration. For example, a global retail company running containerized applications with ECS must ensure that the ECS Anywhere agent has proper permissions for secure communication with AWS.
Similarly, when using EKS Anywhere, leverage AWS connectors and IAM to manage Kubernetes clusters securely so that integration with AWS services remains seamless.
VMware Cloud on AWS
VMware Cloud on AWS is a unique solution where most security responsibilities are managed through VMware’s suite of tools (vRealize, vCenter, vSAN, and vSphere) rather than AWS-specific configurations. This approach benefits organizations by providing a familiar IT environment alongside AWS’s global infrastructure. For multinational companies migrating on-premises VMware environments to the cloud, the seamless integration between VMware tools and AWS services eliminates the need for major refactoring.
AWS Snow Family
The AWS Snow Family, which includes devices like Snowcone and Snowball Edge, offers hardened solutions for edge computing and temporary on-site data processing under limited connectivity conditions. Although these devices support compute functionality similar to EC2, they offer only a subset of instance types. For a global research organization needing remote data processing, the compute-optimized Snowball Edge device is particularly well suited for on-site processing and secure data transfer back to the cloud.
Summary
Securing compute services on AWS spans various environments—from virtual machines and containers to serverless and hybrid setups. For fully managed services like AWS Lambda, Step Functions, SAM, Amplify, and hybrid solutions such as Outposts, ECS/EKS Anywhere, and the Snow Family, the focus is on leveraging AWS’s built-in security features (encryption, IAM, logging, and tracing) and following best practices. Although traditional infrastructure like EC2 offers more granular control (e.g., OS-level security and patching), managed services relieve you of such burdens, allowing you to concentrate on proper configuration, access management, and monitoring.
If you have followed along, you are now better equipped to design and secure diverse compute environments on AWS. In upcoming sections, we will explore additional services and design considerations to further enhance your cloud security expertise.
Learn more about these AWS services in the official AWS Documentation.