AWS Lambda Step-by-Step Tutorial (Console): Using an Amazon S3 Trigger to Resize Images 🖼️🚀


In Introduction to Serverless Services on AWS, I introduced AWS Lambda as one of the serverless services on AWS.

Now, let's go through a simple example to see Lambda in action. In this tutorial, you will create and configure a Lambda function that resizes images added to an Amazon Simple Storage Service (Amazon S3) bucket.

How will it work? 💡

When you add an image file to the original folder in your bucket, Amazon S3 invokes your Lambda function. The function then creates a resized version of the image and uploads it to the resized folder in the same bucket.

What will be the steps? 🪜

  1. Create S3 bucket with input example

  2. Create IAM role with S3 permissions

  3. Create Lambda function

  4. Test it out

Create S3 bucket with input example 🪣

Create S3 bucket

First, create an Amazon S3 bucket with two folders in it. The first folder, original, is the source folder into which you will upload your images. The second folder, resized, will be used by Lambda to save the resized thumbnail when your function is invoked. If you prefer to script the setup, a boto3 sketch follows the console steps below.

  1. Go to the AWS S3 console

  2. Click on "Create a bucket"

  3. Name the bucket - in my case I named it resize-image-demo-ns

  4. Keep everything else as is by default, and click on "Create bucket"

  5. Create a folder named original in this bucket

  6. Repeat the same steps to create a folder named resized
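For reference, here is a minimal boto3 sketch of the same bucket setup. The bucket name and region are just the examples used in this tutorial; replace them with your own.

     import boto3

     s3 = boto3.client('s3')
     bucket_name = 'resize-image-demo-ns'  # example name; bucket names must be globally unique

     # Outside us-east-1 you must pass a LocationConstraint matching your region;
     # in us-east-1 omit CreateBucketConfiguration entirely.
     s3.create_bucket(
         Bucket=bucket_name,
         CreateBucketConfiguration={'LocationConstraint': 'eu-central-1'},
     )

     # S3 has no real folders: creating zero-byte objects whose keys end with a
     # slash makes the original/ and resized/ prefixes show up in the console.
     for prefix in ('original/', 'resized/'):
         s3.put_object(Bucket=bucket_name, Key=prefix)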

Upload a test image

Later in the tutorial, you'll test your Lambda function by invoking it. To confirm that your function is operating correctly, your source bucket needs to contain a test PNG image (a boto3 upload sketch follows the steps below).

  1. Go to the AWS S3 console

  2. In your S3 bucket, go to the original folder

  3. Upload a test image - in my case I used the wedding.png image
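The same upload can be scripted with boto3; wedding.png is just the example file name used in this tutorial.

     import boto3

     s3 = boto3.client('s3')
     # Upload the local test image into the original/ folder of the bucket.
     s3.upload_file('wedding.png', 'resize-image-demo-ns', 'original/wedding.png')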

Create IAM role with S3 permissions 👮🏻‍♀️

Create a permission policy

The first step in creating your Lambda function is to create a permissions policy. This policy gives your function the permissions it needs to access other AWS resources. For this tutorial, the policy gives Lambda read and write permissions for Amazon S3 buckets and allows it to write to Amazon CloudWatch Logs.

  1. Go to the AWS IAM console

  2. Click on Policies at the left panel of IAM dashboard

  3. Click on "Create policy"

  4. Paste this into JSON tab and click on "Next"

     {
         "Version": "2012-10-17",
         "Statement": [
             {
                 "Effect": "Allow",
                 "Action": [
                     "logs:PutLogEvents",
                     "logs:CreateLogGroup",
                     "logs:CreateLogStream"
                 ],
                 "Resource": "arn:aws:logs:*:*:*"
             },
             {
                 "Effect": "Allow",
                 "Action": [
                     "s3:GetObject"
                 ],
                 "Resource": "arn:aws:s3:::*/*"
             },
             {
                 "Effect": "Allow",
                 "Action": [
                     "s3:PutObject"
                 ],
                 "Resource": "arn:aws:s3:::*/*"
             }
         ]
     }
    

  5. Name the policy - in this case I named it LambdaS3Policy, and click on "Create policy"

Create an execution role

An execution role is an IAM role that grants a Lambda function permission to access AWS services and resources. To give your function read and write access to an Amazon S3 bucket, you attach the permissions policy you created in the previous step.

  1. Go to AWS IAM console

  2. Click on Roles at the left panel of IAM dashboard

  3. Click on "Create role"

  4. Select AWS service as trusted entity type and select Lambda as the service, then click on "Next"

  5. Add the created LambdaS3Policy permission, then click on "Next"

  6. Name the role - in this case I named it LambdaS3Role

  7. Keep everything else as is by default, and click on "Create role" (a boto3 sketch of both IAM steps follows this list)
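For reference, here is a minimal boto3 sketch that creates the same policy and role programmatically. The policy and role names match the ones used above; it assumes your credentials are allowed to manage IAM.

     import json
     import boto3

     iam = boto3.client('iam')

     # Same permissions policy as the JSON document shown earlier.
     policy_document = {
         "Version": "2012-10-17",
         "Statement": [
             {
                 "Effect": "Allow",
                 "Action": ["logs:PutLogEvents", "logs:CreateLogGroup", "logs:CreateLogStream"],
                 "Resource": "arn:aws:logs:*:*:*",
             },
             {"Effect": "Allow", "Action": ["s3:GetObject"], "Resource": "arn:aws:s3:::*/*"},
             {"Effect": "Allow", "Action": ["s3:PutObject"], "Resource": "arn:aws:s3:::*/*"},
         ],
     }

     policy = iam.create_policy(
         PolicyName='LambdaS3Policy',
         PolicyDocument=json.dumps(policy_document),
     )

     # The trust policy lets the Lambda service assume this role.
     trust_policy = {
         "Version": "2012-10-17",
         "Statement": [
             {
                 "Effect": "Allow",
                 "Principal": {"Service": "lambda.amazonaws.com"},
                 "Action": "sts:AssumeRole",
             }
         ],
     }

     iam.create_role(
         RoleName='LambdaS3Role',
         AssumeRolePolicyDocument=json.dumps(trust_policy),
     )

     iam.attach_role_policy(
         RoleName='LambdaS3Role',
         PolicyArn=policy['Policy']['Arn'],
     )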

Create Lambda function 🚀

Create the function deployment package

To create your function, you first build a deployment package containing your function code and its dependencies.

  1. Create a folder - in my case I named it ResizeImage

  2. Create a lambda_function.py file in the folder and paste there this code:

     import boto3
     import os
     import uuid
     from urllib.parse import unquote_plus
     from PIL import Image
    
     s3_client = boto3.client('s3')
    
     def resize_image(image_path, resized_path):
         # Shrink the image to half of its original dimensions and save it as PNG.
         with Image.open(image_path) as image:
             image.thumbnail(tuple(x // 2 for x in image.size))
             image.save(resized_path, 'PNG')

     def lambda_handler(event, context):
         for record in event['Records']:
             bucket = '<YOUR-BUCKET-NAME>'
             key = unquote_plus(record['s3']['object']['key'])

             # Only process objects uploaded to the original/ folder.
             if not key.startswith('original/'):
                 print(f"Ignoring object {key} as it does not start with 'original/'")
                 continue

             tmpkey = key.replace('/', '')
             download_path = f'/tmp/{uuid.uuid4()}{tmpkey}'
             upload_key = key.replace('original/', 'resized/')
             upload_path = f'/tmp/resized-{tmpkey}'

             # Download the original, resize it locally in /tmp, then upload
             # the thumbnail to the resized/ folder of the same bucket.
             s3_client.download_file(bucket, key, download_path)
             resize_image(download_path, upload_path)
             s3_client.upload_file(upload_path, bucket, upload_key)
    
             # Cleanup
             os.remove(download_path)
             os.remove(upload_path)
    
         return {
             'statusCode': 200,
             'body': 'Images resized and uploaded successfully!'
         }
    

    Do not forget to replace <YOUR-BUCKET-NAME> in the lambda_handler function with the actual name of your bucket.

  3. In the same folder create a new directory named package and install the Pillow (PIL) library and the AWS SDK for Python (Boto3)

    🧠
    Although the Lambda Python runtime includes a version of the Boto3 SDK, it is recommended that you add all of your function's dependencies to your deployment package, even if they are included in the runtime.
     mkdir package
     pip install \
     --platform manylinux2014_x86_64 \
     --target=package \
     --implementation cp \
     --python-version 3.12 \
     --only-binary=:all: --upgrade \
     pillow boto3
    
  4. Create a .zip file containing your application code and the Pillow and Boto3 libraries

     cd package
     zip -r ../lambda_function.zip .
     cd ..
     zip lambda_function.zip lambda_function.py
    

Configure the Lambda function

  1. Go to the AWS Lambda console

  2. Click on "Create a function"

  3. We are going to author a function from scratch: name your Lambda function - in my case I named it ResizeImage and then select Python 3.12 as Runtime

    🧠
    The runtime is the language version that your Lambda function runs on. Earlier, we installed the package dependencies for Python 3.12, so we select the same runtime here.

  4. Then expand "Change default execution role", choose "Use an existing role", and select the LambdaS3Role we created earlier, then click on "Create function"

  5. Now we can add a trigger to our Lambda function by clicking on "Add trigger"

  6. Configure the trigger source and click on "Add":

    • We're going to select S3 for our trigger

    • Then we need to select which S3 bucket will be the event source - in my case it is resize-image-demo-ns

    • As the event type, I want to select only PUT

      🧠
      Events are operations that trigger this AWS Lambda function. We want to trigger Lambda each time we upload a new image, which translates to the PUT operation.
    • Then we can provide a prefix (a specific location inside the bucket) - in this case the original folder.

    • We then need to acknowledge that using the same S3 bucket for both input and output could cause recursive invocations.

      🧠
      We are using the same bucket; however, the trigger is limited to the original/ prefix and the function writes its output under the resized/ prefix (see lambda_function.py), so it will not invoke itself.

  7. Upload the code source from the .zip file that you created

  8. Go to "Configuration" and increase the timeout and memory in "General configuration" - the defaults (3 seconds and 128 MB) may not be enough for image processing (a boto3 sketch of the same function setup follows this list)

Test it out 😎

Before you test your whole setup by adding an image file to the original folder of your S3 bucket, first check that your Lambda function is working correctly by invoking it with a dummy event.

An event in Lambda is a JSON-formatted document that contains data for your function to process. When your function is invoked by Amazon S3, the event sent to your function contains information such as the bucket name, bucket ARN, and object key.

  1. Go to the AWS Lambda console

  2. Go to "Test", create new event, and name it - in my case I named it ResizeImage-test-event

  3. Choose the s3-put template, paste this JSON into it, and save it

     {
       "Records": [
         {
           "eventVersion": "2.0",
           "eventSource": "aws:s3",
           "awsRegion": "<YOUR-REGION>",
           "eventTime": "1970-01-01T00:00:00.000Z",
           "eventName": "ObjectCreated:Put",
           "userIdentity": {
             "principalId": "EXAMPLE"
           },
           "requestParameters": {
             "sourceIPAddress": "127.0.0.1"
           },
           "responseElements": {
             "x-amz-request-id": "EXAMPLE123456789",
             "x-amz-id-2": "EXAMPLE123/5678abcdefghijklambdaisawesome/mnopqrstuvwxyzABCDEFGH"
           },
           "s3": {
             "s3SchemaVersion": "1.0",
             "configurationId": "testConfigRule",
             "bucket": {
               "name": "<YOUR-BUCKET-NAME>",
               "ownerIdentity": {
                 "principalId": "EXAMPLE"
               },
               "arn": "arn:aws:s3:::<YOUR-BUCKET-NAME>"
             },
             "object": {
               "key": "original/<YOUR-EXAMPLE.PNG>",
               "size": 1024,
               "eTag": "0123456789abcdef0123456789abcdef",
               "sequencer": "0A1B2C3D4E5F678901"
             }
           }
         }
       ]
     }
    

    Do not forget to replace <YOUR-REGION>, <YOUR-BUCKET-NAME> and <YOUR-EXAMPLE.PNG> with the actual names in the code.

  4. Click on "Test"

  5. We can check the execution details afterwards - if everything is configured correctly, the ResizeImage-test-event execution succeeds (a boto3 sketch of the same test follows below)
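If you'd rather script this check, here is a minimal boto3 sketch that sends a stripped-down version of the same test event and then verifies that the thumbnail exists. The function, bucket, and key names are the examples used throughout this tutorial; replace them with your own.

     import json
     import boto3

     lambda_client = boto3.client('lambda')
     s3 = boto3.client('s3')

     bucket = 'resize-image-demo-ns'   # replace with your bucket name
     key = 'original/wedding.png'      # replace with your test image key

     # Only the fields our handler actually reads are required here.
     test_event = {
         'Records': [
             {
                 'eventSource': 'aws:s3',
                 'eventName': 'ObjectCreated:Put',
                 's3': {
                     'bucket': {'name': bucket, 'arn': f'arn:aws:s3:::{bucket}'},
                     'object': {'key': key},
                 },
             }
         ]
     }

     response = lambda_client.invoke(
         FunctionName='ResizeImage',
         Payload=json.dumps(test_event).encode('utf-8'),
     )
     print(json.loads(response['Payload'].read()))

     # If the function worked, the thumbnail now exists under the resized/ prefix;
     # head_object raises an error if it does not.
     s3.head_object(Bucket=bucket, Key='resized/wedding.png')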


Congratulations! 🥳 You've just built a cool serverless solution with AWS Lambda that resizes images automatically when they are uploaded to an S3 bucket.
