Automating AWS Lambda Layer Creation for Python with Makefile
Let's learn how to add dependency code to your Lambda without making it part of your deployment package, using AWS Lambda Layers.
AWS Lambda is AWS's event-driven, serverless computing service: a "done-for-you" offering. It removes the hassle of server management from developers, allowing you to focus on what matters: the code.
The concept of Lambda is simple: You write code in your supported language of choice and upload it to a magic box, where it will (theoretically) be executed in response to defined triggers. However, veteran Lambda users will be familiar with the following error message:
deployment package of your Lambda function is too large to enable inline code
Yes, that's right: you have deployed your code, but you can't see it in the browser because it's too large! For a service billed as "done for you," there's always a catch.
This error happens because of the third-party packages (in Python lingo) that your code depends on. Your code may be just a couple of lines spread across two files, but if it requires 15 different packages to tie everything together, the whole bundle of code plus dependencies is what AWS calls the "deployment package." If your deployment package is too big, you're out of luck: AWS Lambda won't be able to show your code in the browser.
Don't worry, there's a workaround: this is where Lambda Layers come into play. A Lambda Layer is a zip archive that contains these packages (or libraries). Layers enable you to add dependency code to your Lambda without making it part of your deployment package.
If you use Lambda Layers, all 15 of your dependencies are tucked away in a layer, out of the deployment package, so your Lambda can show the whole of your code in the browser. It's an annoying problem, but the solution is simple.
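Concretely, a Python layer is just a zip archive whose contents sit under a top-level python/ directory; Lambda unpacks the layer and adds that directory to the import path. The package names below are only illustrative:

```
packages.zip
└── python/
    ├── requests/
    ├── urllib3/
    └── ...
```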
How to automate the creation of Lambda Layers
Important Note: If you create a zip package in a non-Linux environment (even Mac), it will NOT WORK with Lambda. If you (like me) prefer developing on Mac or Windows, your best bet is to use an Ubuntu Docker container and build your layer there.
Your workflow could be as follows:
- Pull an Ubuntu Docker image
- Exec into the container
- Copy your requirements.txt file into the container
- Install the packages to a specific path
- Once everything looks good, zip up your layer
- Copy the .zip file back out
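Done by hand, the steps above look roughly like the following. This is a sketch, not the author's exact commands; the image tag, folder names, and package manager calls are illustrative:

```shell
# Start an Ubuntu container with the current folder mounted in:
docker run -it --rm -v "$PWD":/work -w /work ubuntu:20.04 bash

# Inside the container: install Python, pip, and zip, then install
# the dependencies into the python/ folder that Lambda Layers expect.
apt-get update && apt-get install -y python3 python3-pip zip
pip3 install -r requirements.txt -t python/

# Zip up the layer; the archive lands in the mounted folder on the host.
zip -r packages.zip python/
```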
If you have to do this often, or for several different Lambdas, the process gets tedious. Luckily, an automation script can save you loads of time.
Here’s how you do it.
Create a Makefile with the following content.
Keep it in the same folder as your requirements.txt file. It will do everything listed in the steps above, both sequentially and automatically.
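The article's original Makefile is not shown in this copy. The sketch below is one way it might look, assuming Docker is installed; it uses the official python Docker image for brevity (rather than a bare Ubuntu container), and the version, target names, and file names are illustrative:

```makefile
# Sketch of a layer-building Makefile (illustrative, not the
# article's original). Recipes must be indented with tabs.
PYTHON_VERSION ?= 3.6.8
ZIP_NAME      ?= packages.zip

.PHONY: build clean

build:
	# Run pip inside a Linux container so any compiled wheels match
	# Lambda's runtime, installing into the python/ folder that
	# Lambda Layers expect, then zip the result.
	docker run --rm -v "$(PWD)":/work -w /work python:$(PYTHON_VERSION) \
		pip install -r requirements.txt -t python/
	zip -r $(ZIP_NAME) python/

clean:
	rm -rf python/ $(ZIP_NAME)
```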
Run the following command:
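The command itself is not shown in this copy of the article; presumably it is just make (or make with the name of your Makefile's build target):

```shell
make
```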
This should create a file named packages.zip in your current directory.
Use this file to create your Lambda layer.
- This script is only for Python.
- It downloads Python 3.6.8 (as of this writing) and builds on top of that. This should work for most Python 3+ versions.
- You need to have Docker installed.
This script is a good starting point for anyone who loves automation.
For Further Automation:
- Create an image which already has everything pre-installed
- Add different scripts for each language.
- Go one step further: upload your zip to S3 and create a layer automatically.
I hope this helps you automate the annoying process of creating Lambda Layers!