A Serverless v1.x plugin to automatically bundle dependencies from `requirements.txt` and make them available in your `PYTHONPATH`.

Requires Serverless >= v1.12.

Install the plugin with:

```
sls plugin install -n serverless-python-requirements
```
If you use a Python installed via Mac Brew, see the Mac Brew installed Python notes below.
Compiling non-pure-Python modules or fetching their manylinux wheels is supported on non-Linux OSs via the use of Docker and the docker-lambda image. To enable Docker usage, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
```
In addition to booleans, the `dockerizePip` option supports the special value `'non-linux'`, which dockerizes pip only on non-Linux environments.
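For example, to dockerize pip only when deploying from macOS or Windows:

```yaml
custom:
  pythonRequirements:
    dockerizePip: non-linux
```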
To utilize your own Docker image instead of the default, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerImage: <image name>:tag
```

This must be the full image name and tag to use, including the runtime-specific tag if applicable.
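For example, you might pin the docker-lambda build image for your runtime explicitly (the image name here is an assumption based on the default docker-lambda images):

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    dockerImage: lambci/lambda:build-python3.6
```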
Alternatively, you can define your Docker image in your own Dockerfile and add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerFile: ./path/to/Dockerfile
```
`dockerFile` is the path to the Dockerfile, which must be in the current folder (or a subfolder). Please note that `dockerImage` and `dockerFile` are mutually exclusive.
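A minimal Dockerfile might extend a docker-lambda build image and install extra system packages that your wheels need; the specific base image and the `libxml2-devel` package here are only illustrations:

```dockerfile
FROM lambci/lambda:build-python3.6
RUN yum install -y libxml2-devel
```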
To install requirements from private git repositories, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    dockerSsh: true
```
The `dockerSsh` option will mount your `$HOME/.ssh/id_rsa` and `$HOME/.ssh/known_hosts` as volumes in the Docker container. If your SSH key is password protected, you can use `ssh-agent`, since `$SSH_AUTH_SOCK` is also mounted and the environment variable set. It is important that the host of your private repositories has already been added to your `$HOME/.ssh/known_hosts` file, as the install process will otherwise fail due to a host authenticity failure.
If you are running on Windows, see the Windows notes below.
If you include a `Pipfile` and have `pipenv` installed instead of a `requirements.txt`, this plugin will use `pipenv lock -r` to generate one. This is fully compatible with all options such as `dockerizePip`. If you don't want this plugin to generate it for you, set the following option:

```yaml
custom:
  pythonRequirements:
    usePipenv: false
```
To help deal with potentially large dependencies (for example: `scikit-learn`) there is support for compressing the libraries. This does require a minor change to your code to decompress them. To enable this, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    zip: true
```

and add this to your handler module before any code that imports your deps:

```python
try:
    import unzip_requirements
except ImportError:
    pass
```
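Putting it together, a handler module using zip support might look like the following sketch; the response contents are illustrative, not part of the plugin:

```python
# Minimal handler sketch for zip support. `unzip_requirements` only exists
# in the deployed package, so the ImportError branch runs during local testing.
try:
    import unzip_requirements  # noqa: F401
except ImportError:
    pass  # running locally; requirements are not zipped

import json


def handler(event, context):
    # After the import above, anything from requirements.txt is importable here.
    return {"statusCode": 200, "body": json.dumps({"message": "hello"})}
```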
You can omit a package from deployment with the `noDeploy` option. Note that dependencies of omitted packages must explicitly be omitted too. By default, this will not install the AWS SDKs that are already installed on Lambda. This example makes it instead omit `pytest`:

```yaml
custom:
  pythonRequirements:
    noDeploy:
      - pytest
```
You can specify extra arguments to be passed to pip like this:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    pipCmdExtraArgs:
      - --cache-dir
      - .requirements-cache
```

When using `--cache-dir`, don't forget to also exclude it from the package:

```yaml
package:
  exclude:
    - .requirements-cache/**
```
Some pip workflows involve using requirements files not named `requirements.txt`. To support these, this plugin has the following option:

```yaml
custom:
  pythonRequirements:
    fileName: requirements-prod.txt
```
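One common use is combining this with Serverless variables to pick a requirements file per stage; the file naming scheme here is an assumption:

```yaml
custom:
  pythonRequirements:
    fileName: requirements-${self:provider.stage}.txt
```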
If you have different Python functions with different sets of requirements, you can avoid including all the unnecessary dependencies of your functions by using the following structure:

```
├── serverless.yml
├── function1
│   ├── requirements.txt
│   └── index.py
└── function2
    ├── requirements.txt
    └── index.py
```

With the content of your `serverless.yml` being:

```yaml
package:
  individually: true

functions:
  func1:
    handler: index.handler
    module: function1
  func2:
    handler: index.handler
    module: function2
```

The result is two zip archives, with only the requirements for function1 in the first one, and only the requirements for function2 in the second one.
Quick notes on the config file:

- The `module` field must be used to tell the plugin where to find the `requirements.txt` file for each function.
- The `handler` field must not be prefixed by the folder name (already known through `module`), as the root of the zip artifact is already the path to your function.
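For instance, `function1/index.py` could be as simple as the following sketch (the response body is illustrative):

```python
# function1/index.py -- referenced as `index.handler` (not `function1/index.handler`)
# because `module: function1` already makes this folder the root of the artifact.
def handler(event, context):
    return {"statusCode": 200, "body": "function1"}
```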
Sometimes your Python executable isn't available on your `$PATH` as `python3.6` (for example, on Windows or when using pyenv). To support this, this plugin has the following option:

```yaml
custom:
  pythonRequirements:
    pythonBin: /opt/python3.6/bin/python
```
For certain libraries, default packaging produces too large an installation, even when zipping. In those cases it may be necessary to tailor-make a version of the module. You can store such modules in a directory and use the `vendor` option, and the plugin will copy them along with all the other dependencies to install:

```yaml
custom:
  pythonRequirements:
    vendor: ./vendored-libraries

functions:
  hello:
    handler: hello.handler
    vendor: ./hello-vendor # The option is also available at the function level
```
The `.requirements` and `requirements.zip` (if using zip support) files are left behind to speed things up on subsequent deploys. To clean them up, run `sls requirements clean`. You can also create them (and `unzip_requirements` if using zip support) manually with `sls requirements install`.
If you are using your own Python library, you have to clean up `.requirements` on any update. You can use the following option to clean up `.requirements` every time you package:

```yaml
custom:
  pythonRequirements:
    invalidateCaches: true
```
Mac Brew installed Python notes: Brew wilfully breaks the `--target` option, with no seeming intention to fix it, which causes issues since this plugin uses that option. There are a few easy workarounds for this, such as enabling `dockerizePip` or pointing the `pythonBin` option at a non-Brew Python. Also, Brew seems to cause issues with pipenv, so make sure you install pipenv using pip.
Windows notes: for usage of `dockerizePip` on Windows, do Step 1 only if running Serverless on Windows, or do both Steps 1 & 2 if running Serverless inside WSL.
Credits to the contributors of the `dockerFile` option to build a custom Docker image, real per-function requirements, and `noDeploy` support; of switching to adding files straight to the zip instead of creating symlinks; and of improved pip cache support when using Docker.