Automating AWS Lambda Layer Creation for Python with Makefile
Let's learn how to add the dependency code in your Lambda without making it a part of your deployment package using AWS Lambda Layers.
AWS Lambda is an event-driven, serverless computing service: a “done-for-you” offering that removes the hassle of server management, allowing you to focus on what matters: the code.
The concept of Lambda is simple: you write code in a supported language of your choice and upload it to a magic box, where it will (theoretically) be executed in response to defined triggers. However, veteran Lambda users will be familiar with the following error message:
deployment package of your Lambda function is too large to enable inline code
Yes, that’s right: you have deployed your code, but you can’t see it in your browser because it’s too large. For a service marketed as “done for you,” there’s always a catch.
The reason for this error is the third-party packages (libraries, in Python lingo) that your code pulls in. Your code may be just a couple of lines spread across two files, but together with the 15 packages it needs to tie everything together, it becomes what AWS calls a “deployment package.” If that deployment package is too big, you’re out of luck: AWS Lambda won’t be able to show your code in the browser.
Don’t worry, there’s a workaround: Lambda Layers. A Lambda Layer is a zip archive that contains these packages (or libraries). Layers let you add the dependency code to your Lambda without making it part of your deployment package.
If you use Lambda Layers, all 15 of your dependencies are tucked away in a layer, so your Lambda can show your whole code in the browser. It’s an annoying problem, but the solution is simple.
How to automate the creation of Lambda Layers
Important Note: If you create a zip package in a non-Linux environment (even macOS), it will NOT WORK with Lambda, because packages with compiled extensions must be built for Lambda’s Linux runtime. If you (like me) prefer developing on Mac or Windows, your best bet is to use an Ubuntu Docker container and build your layer there.
Your workflow could be as follows:
- Pull an Ubuntu Docker image
- Exec into the container
- Copy your requirements.txt file into the container
- Install the dependencies at a specific path
- Once everything looks good, zip up your layer
- Copy the .zip file back out of the container
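Sketched as shell commands, the manual workflow above might look like this (the image tag, mount path, and package names are my own assumptions, not an official recipe):

```shell
# Run on the host: start an Ubuntu container with the current folder mounted
docker run -it --rm -v "$PWD":/work -w /work ubuntu:20.04 bash

# Run inside the container: install pip and zip, then the dependencies
apt-get update && apt-get install -y python3-pip zip
pip3 install -r requirements.txt -t python/
zip -r packages.zip python/
exit
```

Because the folder is mounted, packages.zip lands back on your host machine when you exit. Note that Lambda expects a Python layer’s libraries under a top-level python/ directory inside the zip.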
If you have to do this often, or for several different Lambdas, the process gets tedious. Luckily, an automation script can save you loads of time.
Here’s how you do it.
Create a Makefile with the following content.
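A minimal sketch along these lines, assuming a python:3.6.8 base image, a python/ install path, and a packages.zip target name (all of which you can adapt):

```makefile
# Sketch only: builds Python dependencies inside a Linux container,
# then zips them into the layout Lambda layers expect (python/...).
IMAGE := python:3.6.8-slim

packages.zip: requirements.txt
	docker run --rm -v $(CURDIR):/work -w /work $(IMAGE) \
		pip install -r requirements.txt -t python/
	zip -r packages.zip python/

clean:
	rm -rf python/ packages.zip
```

Keep in mind that the recipe lines must be indented with a tab, not spaces, or make will reject them.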
Keep it in the same folder as your requirements.txt file. It will perform everything we listed in the steps above, sequentially and automatically.
Run the following command:
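Assuming the zip-building target is the first (default) target in your Makefile, this is just:

```shell
make
```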
This should create a file named packages.zip in your current directory.
Use this file to create your Lambda layer.
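One way to do that from the command line is the AWS CLI’s publish-layer-version command (the layer name here is a placeholder; pick your own, and match the runtime to your Python version):

```shell
aws lambda publish-layer-version \
    --layer-name my-python-deps \
    --zip-file fileb://packages.zip \
    --compatible-runtimes python3.6
```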
- This script is only for Python.
- It pulls Python 3.6.8 (as of this writing) and builds on top of that; the resulting layer should work for most Python 3+ versions.
- You need to have Docker installed.
This script can be used as a good starting point for anyone who loves automation.
For Further Automation:
- Create a Docker image that already has everything pre-installed.
- Add different scripts for each language.
- Go one step further: upload your zip to S3 and create the layer automatically.
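For that last idea, the AWS CLI can publish a layer straight from S3 (the bucket, key, and layer name below are placeholders):

```shell
aws s3 cp packages.zip s3://my-layer-bucket/layers/packages.zip
aws lambda publish-layer-version \
    --layer-name my-python-deps \
    --content S3Bucket=my-layer-bucket,S3Key=layers/packages.zip \
    --compatible-runtimes python3.6
```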
I hope this helps you automate the annoying process of creating Lambda Layers!