Chapter 6. Serverless Functions in Depth, Part 1

In chapter two you learned how to create and interact with a serverless API using API Gateway and AWS Lambda. In this chapter, you’ll continue learning how to use serverless functions by creating two new types of functions.

The functions in this chapter differ in that, instead of serving strictly as a web server or an API, they will interact with other AWS services to aid in the application development process.

The two functions you’ll be creating in this chapter are the following:

  1. A function that dynamically adds a user to a group based on their email address.

In some applications, you will need to perform “coarse-grained” access control, which typically means granting authorization permissions broadly, based on the role or group a user is associated with. In our example, we’ll have an administrator group of users identified by their email addresses. If a user signs up with one of these email addresses, we will place them in a group called Admin.

  2. A function that automatically resizes an image after it has been uploaded to Amazon S3.

Many applications require dynamic image resizing on the server after a user has uploaded an image. This is done for many reasons, ranging from compressing images to make the web application more performant to dynamically creating smaller avatar or thumbnail versions of uploaded images.

In chapter 7 we’ll continue learning about serverless functions by creating an e-commerce application that interacts with a database and allows the user to create, read, update, and delete items from the database by invoking the function via an API call.

Event sources and data structure

In chapter two we briefly talked about event sources for serverless functions. The only event source we have actually implemented up to this point, though, has been API Gateway: an HTTP request triggered the function, which fetched data from an API and returned it in the response.

In this chapter we’ll be working with two other event types and sources, one from Amazon S3 and one from Amazon Cognito.

In order to understand the events coming into Lambda from the event sources, it’s important to underscore the following point: The shape of the event data will not be consistent across different event types. For instance, the HTTP event data structure coming from API Gateway will differ from the Amazon S3 event data structure, and the Amazon S3 event data structure will differ from the Amazon Cognito data structure.

Understanding the shape of the event data, as well as knowing the data available to you in the event, will help you understand what you can do in the Lambda function. To understand this better, let’s take a look at the different event data structures. For now you do not need to understand every field and value in these data structures. I will outline the values that will be important for us in the following examples.

API Gateway Event

The API Gateway event data is the data structure that will be passed into the function when invoking it from an API call, like GET, PUT, POST, or DELETE. This data structure holds information like the HTTP method that invoked the function, the path that was invoked, the body if one was passed in, and the identity of the user calling the API (inside the requestContext.identity field) if the user was authenticated.

{
    "resource": "/items",
    "path": "/items",
    "httpMethod": "GET",
    "headers": { /* header info */ },
    "multiValueHeaders": { /* multi value header info */ },
    "queryStringParameters": null,
    "multiValueQueryStringParameters": null,
    "pathParameters": null,
    "stageVariables": null,
    "requestContext": {
        "resourceId": "b16tgj",
        "resourcePath": "/items",
        "httpMethod": "GET",
        "extendedRequestId": "CzuJMEDMoAMF_MQ=",
        "requestTime": "07/Nov/2019:21:46:09 +0000",
        "path": "/dev/items",
        "accountId": "557458351015",
        "protocol": "HTTP/1.1",
        "stage": "dev",
        "domainPrefix": "eq4ttnl94k",
        "requestTimeEpoch": 1573163169162,
        "requestId": "1ac70afe-d366-4a52-9329-5fcbcc3809d8",
        "identity": {
          "cognitoIdentityPoolId": "",
          "accountId": "",
          "cognitoIdentityId": "",
          "caller": "",
          "apiKey": "",
          "sourceIp": "192.168.100.1",
          "cognitoAuthenticationType": "",
          "cognitoAuthenticationProvider": "",
          "userArn": "",
          "userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.82 Safari/537.36 OPR/39.0.2256.48",
          "user": ""
        },
        "domainName": "eq4ttnl94k.execute-api.us-east-1.amazonaws.com",
        "apiId": "eq4ttnl94k"
    },
    "body": null,
    "isBase64Encoded": false
}
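
To make this concrete, here is a minimal sketch of a handler reading a few of these fields. The field names come from the event shown above; the statusCode and body response shape assumes the Lambda proxy integration used with API Gateway in chapter 2.

exports.handler = async (event) => {
  // the HTTP method and path that invoked the function
  const { httpMethod, path } = event

  // the request body arrives as a string (or null), so parse it when present
  const body = event.body ? JSON.parse(event.body) : null

  // identity of the caller, populated when the request is authenticated
  const { identity } = event.requestContext
  console.log(`${httpMethod} ${path} from ${identity.sourceIp}`, body)

  return {
    statusCode: 200,
    body: JSON.stringify({ message: `received ${httpMethod} request` })
  }
}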

Amazon S3 Event

The Amazon S3 event data is the data structure that will be passed into the function when invoking it from a file upload to Amazon S3. This data structure holds an array of records from S3. The main information you’ll typically be working with in this event data is the s3 property, which holds information like the bucket name, the key, and the size of the item being stored.

{
  "Records": [
    {
      "eventVersion": "2.1",
      "eventSource": "aws:s3",
      "awsRegion": "us-east-2",
      "eventTime": "2019-09-03T19:37:27.192Z",
      "eventName": "ObjectCreated:Put",
      "userIdentity": {
        "principalId": "AWS:AIDAINPONIXQXHT3IKHL2"
      },
      "requestParameters": {
        "sourceIPAddress": "205.255.255.255"
      },
      "responseElements": {
        "x-amz-request-id": "D82B88E5F771F645",
        "x-amz-id-2": "vlR7PnpV2Ce81l0PRw6jlUpck7Jo5ZsQjryTjKlc5aLWGVHPZLj5NeC6qMa0emYBDXOo6QBU0Wo="
      },
      "s3": {
        "s3SchemaVersion": "1.0",
        "configurationId": "828aa6fc-f7b5-4305-8584-487c791949c1",
        "bucket": {
          "name": "lambda-artifacts-deafc19498e3f2df",
          "ownerIdentity": {
            "principalId": "A3I5XTEXAMAI3E"
          },
          "arn": "arn:aws:s3:::lambda-artifacts-deafc19498e3f2df"
        },
        "object": {
          "key": "b21b84d653bb07b05b1e6b33684dc11b",
          "size": 1305107,
          "eTag": "b21b84d653bb07b05b1e6b33684dc11b",
          "sequencer": "0C0F6F405D6ED209E1"
        }
      }
    }
  ]
}
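
Because the records arrive as an array, a handler can iterate over them rather than assuming a single entry. The following is a minimal sketch that simply logs the bucket, key, and size for each record; it is only meant to show where those values live in the event.

exports.handler = async (event) => {
  // the S3 event delivers its records as an array, so loop over each one
  for (const record of event.Records) {
    const { bucket, object } = record.s3
    console.log(`${record.eventName}: ${bucket.name}/${object.key} (${object.size} bytes)`)
  }
}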

Amazon Cognito Event

The Amazon Cognito event data is the data structure that will be passed into the function when invoking it from an Amazon Cognito action, such as a user signing up, confirming their account, or signing in. The triggerSource field identifies which action invoked the function (see the short sketch after the example event below).

{
    "version": "1",
    "region": "us-east-1",
    "userPoolId": "us-east-1_uVWAMpQuY",
    "userName": "dabit3",
    "callerContext": {
        "awsSdkVersion": "aws-sdk-unknown-unknown",
        "clientId": "2ects9inqraapp43ejve80pv12"
    },
    "triggerSource": "PostConfirmation_ConfirmSignUp",
    "request": {
        "userAttributes": {
            "sub": "164961f8-13f7-40ed-a8ca-d441d8ec4724",
            "cognito:user_status": "CONFIRMED",
            "email_verified": "true",
            "phone_number_verified": "false",
            "phone_number": "+16018127241",
            "email": "[email protected]"
        }
    },
    "response": {}
}
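
Since the same handler could be wired up to more than one Cognito action, it can be useful to guard your logic with a check on triggerSource. Here is a minimal sketch using the PostConfirmation_ConfirmSignUp value shown in the event above; any other values depend on which triggers you configure.

exports.handler = async (event) => {
  // only act on the post-confirmation trigger; pass any other event straight through
  if (event.triggerSource === 'PostConfirmation_ConfirmSignUp') {
    const { email } = event.request.userAttributes
    console.log(`New confirmed user: ${event.userName} <${email}>`)
  }

  // Cognito triggers expect the (possibly modified) event to be returned
  return event
}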

In this chapter you’ll be using these events, and the information contained within them, to perform different types of actions from within the functions.

Creating the base project

The first thing we’ll do is create a new React application and install the dependencies we’ll need for this chapter:

$ npx create-react-app lambda-trigger-example

$ cd lambda-trigger-example

$ npm install aws-amplify aws-amplify-react

Next, we’ll create a new Amplify project:

$ amplify init

# walk through the steps like we've done in the previous projects

Now that the project has been initialized, we can begin adding the services. The services we’ll need for this chapter will be Amazon Cognito, Amazon S3, and AWS Lambda. We’ll start by adding Amazon Cognito and testing out a post-confirmation Lambda trigger.

Adding a user to a group after signing up - post-confirmation Lambda trigger

The next thing we want to do is create an authentication service. We will then create and configure a post-confirmation Lambda trigger. This means we want a Lambda function to be invoked every time someone successfully signs up and confirms their account using our authentication service.

To do so, we need to first create the authentication service in our project:

$ amplify add auth

? Do you want to use the default authentication and security configuration? Default configuration
? How do you want users to be able to sign in? Username
? Do you want to configure advanced settings? Yes
? What attributes are required for signing up? Email
? Do you want to enable any of the following capabilities? Add User to Group
? Enter the name of the group to which users will be added. Admin
? Do you want to edit your add-to-group function now? Y

Now, update the function with the following code:

// amplify/backend/function/<function_name>/src/add-to-group.js

const aws = require('aws-sdk');

exports.handler = async (event, context, callback) => {
  const cognitoProvider = new aws.CognitoIdentityServiceProvider({ apiVersion: '2016-04-18' });

  let isAdmin = false
  const adminEmails = ['[email protected]']

  // if the user is one of the admins, set the isAdmin variable to true
  if (adminEmails.indexOf(event.request.userAttributes.email) !== -1) {
    isAdmin = true
  }

  const groupParams = {
    UserPoolId: event.userPoolId,
  }

  const userParams = {
    UserPoolId: event.userPoolId,
    Username: event.userName,
  }

  if (isAdmin) {
    groupParams.GroupName = 'Admin'
    userParams.GroupName = 'Admin'

    // first check to see if the group exists, and if not, create the group
    try {
      await cognitoProvider.getGroup(groupParams).promise();
    } catch (e) {
      await cognitoProvider.createGroup(groupParams).promise();
    }

    // the user is an administrator, place them in the Admin group
    try {
      await cognitoProvider.adminAddUserToGroup(userParams).promise();
      callback(null, event);
    } catch (e) {
      callback(e);
    }
  } else {
    // if the user is not one of the admins, proceed without taking any action
    callback(null, event)
  }
}

In this function there is one main piece of functionality: if the user’s email is one of the addresses specified in the adminEmails array, we automatically place them in the group called Admin.

To deploy the service, run the push command:

$ amplify push

Now that the back end is set up, we can test it out. To do so, we first need to configure the React project to recognize the Amplify dependencies. Open src/index.js and add the following below the last import:

import Amplify from 'aws-amplify'
import config from './aws-exports'
Amplify.configure(config)

Next, we’ll sign up a new user and display a greeting if they are an Admin. To do so, open src/App.js and add the following:

import React, { useEffect, useState } from 'react';
import './App.css';

import { Auth } from 'aws-amplify'
import { withAuthenticator } from 'aws-amplify-react'

function App() {
  const [user, updateUser] = useState(null)
  useEffect(() => {
    Auth.currentAuthenticatedUser()
      .then(user => updateUser(user))
      .catch(err => console.log(err));
  }, [])
  let isAdmin = false
  if (user) {
    const { signInUserSession: { idToken: { payload }} }  = user
    console.log('payload: ', payload)
    if (payload['cognito:groups'] && payload['cognito:groups'].includes('Admin')) {
      isAdmin = true
    }
  }
  return (
    <div className="App">
      <header className="App-header">
      { isAdmin && <p>Welcome, Admin</p> }
      </header>
    </div>
  );
}

export default withAuthenticator(App, { includeGreetings: true })

Next, run the app:

$ npm start

Now, sign up a new user with one of the admin email addresses. If the user is indeed one of the admins, you should see the Welcome, Admin greeting.

You can also view the Amazon Cognito authentication service and all of the users and groups by running the following command:

$ amplify console auth
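
If you’d prefer to verify the group membership from a script instead of the console, a small sketch like the following could list the members of the Admin group using the AWS SDK. The user pool ID and region here are placeholders you would replace with your own values (the pool ID is shown in the Cognito console and in the aws-exports.js file).

// list-admins.js: a minimal sketch, assuming the aws-sdk module is installed
// and your AWS credentials are configured locally
const aws = require('aws-sdk')

const cognito = new aws.CognitoIdentityServiceProvider({ region: 'us-east-1' })

async function listAdmins() {
  const result = await cognito.listUsersInGroup({
    UserPoolId: 'us-east-1_XXXXXXXXX', // placeholder user pool ID
    GroupName: 'Admin'
  }).promise()
  result.Users.forEach(user => console.log(user.Username))
}

listAdmins().catch(console.error)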

Dynamic image resizing with AWS Lambda and Amazon S3

In the next example, we will add functionality that allows users to upload images to Amazon S3. We will also configure an S3 trigger that will call a Lambda function every time a file is uploaded to the bucket. In this function, we’ll check the size of the image, and if it is above a certain width, we will resize it to be below the width threshold.

For this to work we will need to enable S3 to trigger the Lambda function in our project when a file is uploaded. We can do this using the Amplify CLI by just creating the S3 bucket and choosing the correct configuration. From the CLI, run the following commands:

$ amplify add storage

? Please select from one of the below mentioned services: Content
? Please provide a friendly name for your resource that will be used to label this category in the project: <your_unique_resource_name>
? Please provide bucket name: <your_unique_bucket_name>
? Who should have access: Auth and Guest users
? What kind of access do you want for Authenticated users? Choose all (create / update, read, & delete)
? What kind of access do you want for Guest users? Choose all (create / update, read, & delete)
? Do you want to add a Lambda Trigger for your S3 Bucket? Y
? Select from the following options: Create a new function
? Do you want to edit the local S3Trigger18399e19 lambda function now? Y

This will open the function in your text editor.

Adding the custom logic for resizing the image

Now, we can update the function to implement the image resizing.

What we will be doing in this function is fetching the image when the event comes through, checking to see if it is greater than 1000 pixels wide, and if so, resizing it to be 1000 pixels wide and then saving it back to the bucket. If the image is not larger than 1000 pixels wide, we exit from the function without taking any action.

// amplify/backend/function/<functionname>/src/index.js

// import the sharp library
const sharp = require('sharp')
const aws = require('aws-sdk')
const s3 = new aws.S3()

exports.handler = async function (event, context) { //eslint-disable-line
  // if the event type is delete, return from the function
  if (event.Records[0].eventName === 'ObjectRemoved:Delete') return

  // next, we get the bucket name and the key from the event.
  const BUCKET = event.Records[0].s3.bucket.name
  const KEY = event.Records[0].s3.object.key
  try {
    // fetch the image data from S3
    let image = await s3.getObject({ Bucket: BUCKET, Key: KEY }).promise()
    image = await sharp(image.Body)

    // get the metadata from the image, including the width and the height
    const metadata = await image.metadata()
    if (metadata.width > 1000) {
      // if the width is greater than 1000, the image is resized
      const resizedImage = await image.resize({ width: 1000 }).toBuffer()
      await s3.putObject({ Bucket: BUCKET, Body: resizedImage, Key: KEY }).promise()
      return
    } else {
      return
    }
  }
  catch(err) {
    context.fail(`Error getting files: ${err}`);
  }
};

For our function to work, we still need to update a couple of things. First, we are requiring the sharp library in our Lambda function, but so far we have not installed this module. To make sure this module is installed, update the function’s package.json file to add both the dependency for the package and a special install script that sharp needs in order to run correctly in the Lambda environment. The two fields we will be adding are scripts and dependencies:

// amplify/backend/function/<functionname>/src/package.json
{
  "name": "<your_function_name>",
  "version": "2.0.0",
  "description": "Lambda function generated by Amplify",
  "main": "index.js",
  "license": "Apache-2.0",
  "scripts": {
    "install": "npm install --arch=x64 --platform=linux --target=10.15.0 sharp"
  },
  "dependencies": {
    "sharp": "^0.23.2"
  }
}

Next, we need to make sure the function is running on Node.js 10.x. To do this, we can specify the runtime for our function. When the function was created by the CLI, we were given defaults that we can update to suit the application we are building. This configuration is located at amplify/backend/function/<functionname>/<functionname>-cloudformation-template.json. In this file, find the Runtime key and make sure it is set to nodejs10.x; if it isn’t, update it to the correct runtime.

...
"Runtime": "nodejs10.x",
...

Now, the service is ready to be deployed:

$ amplify push

Uploading images from the React application

Next, open src/App.js and add the following code to render the basic photo picker and photo album component:

import React from 'react';
import logo from './logo.svg';
import './App.css';

import { S3Album } from 'aws-amplify-react';

function App() {
  return (
    <div className="App">
      <header className="App-header">
      <S3Album
        picker
        path=""
      />
      </header>
    </div>
  );
}

export default App;

Next, run the app:

$ npm start

When you upload an image that is larger than 1000 pixels wide, you’ll notice that it initially loads at its original size, but if you reload the app you’ll see that the image has been resized to the 1000-pixel maximum width.
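
If you’d like to confirm the new width programmatically rather than by eye, a small sketch like the following could fetch the uploaded file through Amplify Storage and log its natural width in the browser. The 'photo.jpg' key is hypothetical and should match whichever file you uploaded.

import { Storage } from 'aws-amplify'

// a minimal sketch: fetch a pre-signed URL for the object and measure the image
async function logImageWidth(key) {
  const url = await Storage.get(key)
  const img = new Image()
  img.onload = () => console.log(`${key} is ${img.naturalWidth}px wide`)
  img.src = url
}

logImageWidth('photo.jpg') // hypothetical key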

Summary

Congratulations, you’ve now successfully implemented two types of Lambda triggers!

Here are a couple of things to keep in mind from this chapter:

  1. Lambda functions can be invoked from many different event types, including API calls, image uploads, database operations, and authentication events.

  2. The event data structure differs based on the type of event invoking the Lambda function.

  3. Understanding the data available in the event variable helps you evaluate what can be accomplished within the function.
