
    Deploy an S3 bucket and trigger an AWS Lambda function with AWS SAM

    Tomasz Łakomy
    AWS

    The strength of serverless applications lies in connecting various services together.

    "Uploading a file to S3 triggers a lambda function which sends data to a queue which is picked up by another service which triggers another function..."

    In this quick lesson we're going to learn how to deploy an S3 bucket using a SAM template and how to connect it to a serverless function so it'll get triggered whenever a new file gets uploaded.

    Transcript

    Tomasz Łakomy: 0:00 We have a serverless application model stack, which is currently deploying this Lambda function. We would like to, apart from this function, also deploy an S3 bucket and connect them together. Whenever a file gets uploaded to the S3 bucket, this Lambda function is going to get triggered.

    0:13 In order to do that, first we're going to create a bucket. I'm going to specify an additional resource in the Resources section. I'm going to call it MyFilesBucket, which is going to be of type AWS::S3::Bucket.
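In a SAM template, the bucket declaration described here might look like the following sketch (the logical ID MyFilesBucket comes from the lesson; the rest of the template, including the existing function, is assumed):

```yaml
Resources:
  # ...existing Lambda function resource omitted for brevity...

  # A plain S3 bucket; SAM passes this through to CloudFormation as-is
  MyFilesBucket:
    Type: AWS::S3::Bucket
```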

    0:26 Before moving on, let's build this stack and deploy it to AWS to see if this bucket is actually going to get created. First, I run sam build in our terminal. It's going to build our new stack. Next up, I run sam deploy.
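The build-and-deploy step above is just the two SAM CLI commands:

```shell
# Build the stack: resolves dependencies and prepares deployment artifacts
sam build

# Deploy the built stack; SAM shows the pending changeset and asks for confirmation
sam deploy
```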

    0:37 Before the actual deployment happens, we can see over here that this stack is going to add MyFilesBucket, which is an S3 bucket, to our resources. I'm OK with that. I'm going to say yes, please deploy this changeset. Let's give it a second.

    0:52 Now it's done. Let's take a look at our S3 bucket in the AWS console. Let's go to S3. This is our brand-new S3 bucket which was created just a minute ago.

    1:00 Now, we're going to connect this bucket to our Lambda function. Our Lambda function will receive an event, and we're going to console log the S3 portion of this event. We're going to see the file name, the file size, and so on.

    1:11 In order to connect this function to our S3 bucket, we're going to go back to the template and create a new event for this Lambda function. I'm going to call this event fileUpload. It is going to be of type S3, and it's going to have the following properties.

    1:27 First off, we would like to trigger this function only when a file gets uploaded to this bucket. In order to do that, we're going to specify bucket, and we're going to pass in a reference to this bucket.

    1:38 An S3 bucket can trigger many events, like whenever a file gets created, updated, or deleted. In our case, we only care about the s3:ObjectCreated events, that is, whenever a new file gets uploaded to this S3 bucket. Now, this bucket and this function are connected together.
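Putting the event configuration described above into the template might look like this sketch (the function's logical ID, handler path, and runtime are assumptions; MyFilesBucket and the fileUpload event name come from the lesson):

```yaml
Resources:
  MyFilesBucket:
    Type: AWS::S3::Bucket

  MyFunction:                        # logical ID assumed
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler           # handler path assumed
      Runtime: nodejs18.x            # runtime assumed
      Events:
        fileUpload:
          Type: S3
          Properties:
            # Reference the bucket defined in this same template
            Bucket: !Ref MyFilesBucket
            # Fire only when a new object is created (i.e., a file is uploaded)
            Events: s3:ObjectCreated:*
```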

    1:51 Let's go ahead, build this stack, and deploy it to AWS. Before the deployment happens, we'll notice that a new Lambda permission resource is going to be created, because S3 needs permission to invoke this function whenever a file gets uploaded. I'm OK with those changes. Let's deploy it.

    2:07 Now, it's done. Let's test it. First up, let's go to our bucket. Go to S3, go to our bucket, upload a new file, which in this case is my photo, click on upload, wait for it. Now, this file is uploaded to S3. It should have triggered our Lambda function.

    2:20 Let's go to Lambda, select our function, go to monitoring to view logs in CloudWatch. We can see a new log stream. If we open it up, we're going to see a log event saying that I just uploaded this file called tlakomy.jpg. This is the file size, and it was uploaded to this S3 bucket.

    2:36 Now, we have successfully connected our serverless function with an S3 bucket in order to create a serverless application.