
Explanation:
The most efficient approach is to create a Lambda layer containing the common Python code and attach the layer to each function. Layers let you package shared libraries once, publish a new layer version when changes are needed, and have every function reference that version, minimizing redeployment work. A function can include up to five layers, and layers are supported only for .zip-based Lambda functions. Storing the modules in Amazon S3 and downloading them during initialization adds cold-start latency and extra operational overhead. Layers cannot be attached to container-image functions, so those dependencies must be baked into the image. Lambda extensions are intended for monitoring, observability, and security tooling, not for distributing business logic or shared libraries.
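The layer workflow described above can be sketched in Python. For a Python runtime, the layer archive must place code under a top-level python/ directory, which Lambda adds to sys.path for every function that references the layer. The module name helpers.py and the layer/function names in the commented boto3 calls are hypothetical stand-ins, not part of the scenario.

```python
# Sketch (with assumed names): package shared helper modules into a
# Lambda layer zip. Python-runtime layers must place code under a
# top-level "python/" directory so Lambda puts it on sys.path.
import io
import zipfile


def build_layer_zip(modules: dict) -> bytes:
    """Return zip bytes with each module stored under python/ as Lambda expects."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, source in modules.items():
            zf.writestr(f"python/{name}", source)
    return buf.getvalue()


# "helpers.py" is a hypothetical shared module for illustration.
layer_zip = build_layer_zip({"helpers.py": "def greet():\n    return 'hi'\n"})

# Publishing the layer and attaching it to a function would then use boto3
# (shown commented out, for shape only; names are assumptions):
# lambda_client = boto3.client("lambda")
# resp = lambda_client.publish_layer_version(
#     LayerName="heliocart-shared",
#     Content={"ZipFile": layer_zip},
#     CompatibleRuntimes=["python3.12"],
# )
# lambda_client.update_function_configuration(
#     FunctionName="one-of-the-18-functions",
#     Layers=[resp["LayerVersionArn"]],
# )
```

Rolling out an update then means publishing one new layer version and pointing each function's configuration at the new version ARN, rather than repackaging 18 deployment bundles.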
HelioCart, a retail analytics startup, runs about 18 Python-based AWS Lambda functions that all rely on the same in-house helper modules managed by the data engineering team. The team wants to centralize these shared dependencies so that a single update can be rolled out to every function with minimal maintenance effort and without repackaging each function separately. What is the most efficient approach to achieve this?
A. Store the shared Python modules in Amazon S3 and have each Lambda function download them during initialization
B. Create a Lambda layer containing the common Python code and attach the layer to each function
C. Containerize the functions and then add Lambda layers to the function configuration
D. Use Lambda Extensions to inject the shared code into every function