SMS OTP Authentication Using AWS Backend


Multifactor authentication is essential when developing a web application, and SMS-based OTP is one of the most commonly used MFA techniques. Interested in adding one to your web app? Let's go ahead and build one here.

In this example, we will use an AWS-based backend. Most of the popular clouds have services that enable us to do something similar. This is a cloud-native, serverless solution, and hence extremely scalable and low cost.

AWS Services Used

For implementing this, we will use five major AWS services:

  • SNS (Simple Notification Service)
  • DynamoDB
  • IAM
  • Lambda Functions
  • API Gateway

And of course, we will need a website to hold the front-end. For this, we can use two more AWS services:

  • S3 (Simple Storage Service)
  • CloudFront

All these services are "Free Tier Eligible", and this tutorial should not cost you anything - unless you badly goof up somewhere, which is quite unlikely.

Let us start by configuring each of these. We can configure the services on the AWS Console, the CLI, or the SDK; we will focus on the AWS Console. SNS does not need any explicit configuration, so we start with DynamoDB.


DynamoDB

DynamoDB is the cloud-native serverless database provided by AWS. It provides high performance at a low cost. In DynamoDB, we create a new table with specific configurations.

Create Table

First, we create a new table. On the DynamoDB console, click on the "Create table" button.

Capacity Planning

This will lead us to a page where we can provide the table name and the important configurations. Next, we go in for capacity planning for the table. Like most serverless services, DynamoDB allows two types of capacity planning: Reserved and On-Demand. On-Demand is free for limited usage and billable beyond that. Reserved capacity, on the other hand, has a concrete free tier, and it helps us restrict our usage.

Another problem with On-Demand usage is warm-up time. Serverless services tend to slow down when not used for a long time, so the first call after an idle period has high latency. That is not a problem for a small application like this, but when performance is critical, it is best to reserve capacity.

You can play around with the options and choose the one you like. Note that reserving too much capacity will cost money.

In the snapshot above, the estimated cost shows $0.59 per month. But if you are within the free tier, and have not reserved capacity elsewhere, this cost is waived.

With all these settings, click on Create at the bottom of the page. It will take a couple of minutes to create the table. Once it is created, we can do further configuration.
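For those who prefer the CLI/SDK route mentioned earlier, the same table can be sketched in Python with boto3. This is an illustrative sketch: the key names ("type" as partition key, "id" as sort key) are the ones the Lambda code later in this post uses, and the capacity units are assumptions.

```python
# Table definition for the OTP table. "type" (partition key) and "id" (sort key)
# match the key names used by the Lambda functions later in this post.
# The capacity units are illustrative assumptions.
OTP_TABLE = dict(
    TableName="OTP",
    AttributeDefinitions=[
        {"AttributeName": "type", "AttributeType": "S"},
        {"AttributeName": "id", "AttributeType": "S"},
    ],
    KeySchema=[
        {"AttributeName": "type", "KeyType": "HASH"},   # partition key
        {"AttributeName": "id", "KeyType": "RANGE"},    # sort key
    ],
    ProvisionedThroughput={"ReadCapacityUnits": 1, "WriteCapacityUnits": 1},
)

# With boto3 installed and AWS credentials configured, this would create the table:
# boto3.client("dynamodb").create_table(**OTP_TABLE)
```

The actual create_table call is left commented so the sketch stays runnable without AWS credentials.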

Time To Live

TTL, or Time To Live, is an interesting setting provided by DynamoDB. With it, we can easily automate data cleanup. In simple words, using the TTL, we specify the life of a given record in the DB, and DynamoDB deletes that record after that time. AWS gives a disclaimer that this cleanup may not be punctual, and the record may live longer than required. But I have seen it disappear well in time.

If you have a time-critical application that depends on the cleanup time, then TTL may not be the right tool for you. But for our application, TTL provides a simple way to clean up the old records.
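The TTL attribute holds an absolute expiry time as a Unix epoch in seconds, stored as a DynamoDB Number. A minimal sketch of how that value is computed (the 180-second window matches the OTP lifetime used later in this post):

```python
import time

def ttl_epoch(seconds_from_now=180):
    # DynamoDB TTL expects an absolute Unix timestamp in seconds, stored as a Number
    return int(time.time()) + seconds_from_now

# Example: a record written with this ttl value expires ~3 minutes from now
expiry = ttl_epoch(180)
```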

In the Table Details section, click on the Manage TTL link. In the TTL attribute field, provide the name of the field that we plan to use for the TTL. I call it ttl. After this, click Continue. Again, it will take some time to apply the change, but the table will be ready in a minute or two.

Lambda Function

Next, we create two Lambda Functions - one to generate the OTP, and the other to validate the OTP provided.

For creating a Lambda Function, we start with the AWS Console. Click on the Create Function button to create a new Lambda Function, and provide the function name and programming language in the Basic Information section:

Next, we assign a role to the Lambda Function. Note that this function has to access SNS and the DynamoDB table, so we have to create a Role with these permissions.

In the Permissions section, we start by letting AWS create a simple role for us. This role will have all that is required to run a basic Lambda Function - like logging to CloudWatch. AWS will also provide a name for this role.

With these settings, we click the create button. It creates a function and takes us to the configuration page for the new function.

We click on the Permissions tab. Here, we can see the Role assigned to the Lambda function. Since this is an auto-generated role, the name will differ from the one you see.

This auto-generated role has limited permissions. We have to extend it to include the additional permissions to access SNS and the DynamoDB table that we just created.

Configure the IAM Role

We could always grant it the highest level of permissions, to reduce our effort. But that is a very bad habit: it leaves behind obscure security loopholes that can prove disastrous.

Hence it is best to inculcate the habit of identifying and assigning just the required permissions to each role. At the same time, it is not feasible to create a new role for every single service - that is impossible to maintain, and you will soon get lost in the chaos.

Hence we have to maintain an optimal balance between the two. In any case, it is necessary that we spend some time in analyzing the permissions required for the different components of our application.

When we click on the role name above, we are directed to the IAM Console.

As of now, the role has permission to write to the given log group on CloudWatch. We have to add more to it. To do this, we include an Inline Policy - click the link as shown above.

That will let us define a new policy for our Role.


In here, we modify the policy defined in the JSON. Use the one below:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": [
                    "dynamodb:GetItem",
                    "dynamodb:PutItem"
                ],
                "Resource": "arn:aws:dynamodb:us-east-1:869871132949:table/OTP"
            },
            {
                "Sid": "VisualEditor1",
                "Effect": "Allow",
                "Action": [
                    "sns:Publish",
                    "sns:SetSMSAttributes"
                ],
                "Resource": "*"
            }
        ]
    }

Essentially, it gives the Role permission to get/put an item in the OTP table that we just created. Additionally, it allows the role to publish to SNS and to manage the SMS attributes.

Next, we click Review Policy. It asks for a policy name; provide any name you like. Since it is an inline policy, we can be a little casual about the name.

With this, our role is ready. Now we go back to our Lambda function.

Lambda Function

On the Lambda console, we go back to the Configuration tab. Scrolling down, we can see some generated placeholder code for the Lambda Function:

import json

def lambda_handler(event, context):
    # TODO implement
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
We have to replace this with our business logic. Paste the below code in there.

import json
import boto3
from random import randint
import time

def random_with_N_digits(n):
    range_start = 10**(n-1)
    range_end = (10**n)-1
    return randint(range_start, range_end)

def lambda_handler(event, context):
    try:
        phone = event["phone"]
        message = event["message"]
        digits = event["digits"]

        # Check if we have an OTP for this phone already
        db = boto3.client("dynamodb")
        try:
            response = db.get_item(
                TableName="OTP",
                Key={"type": {"S": "OTP"}, "id": {"S": phone}}
            )
            otp = response["Item"]["otp"]["S"]
        except KeyError:
            # No live OTP - generate a new one
            otp = random_with_N_digits(digits)

        # Send the SMS
        sns = boto3.client('sns')
        sns.publish(
            PhoneNumber=phone,
            Message=message + str(otp),
            MessageAttributes={
                'AWS.SNS.SMS.SMSType': {
                    'DataType': 'String', 'StringValue': 'Transactional'
                }
            }
        )

        # Add the record to DynamoDB, valid for 180 seconds
        db.put_item(
            TableName="OTP",
            Item={
                'type': {'S': 'OTP'}, 'id': {'S': phone},
                'otp': {'S': str(otp)},
                'ttl': {'N': str(180 + int(round(time.time())))}
            }
        )
        return {'statusCode': 200, 'body': json.dumps({'response': 'OK'})}
    except Exception:
        return {'statusCode': 200, 'body': json.dumps({'response': 'NO'})}

Essentially, this checks the OTP table to see if an OTP has already been generated for this number. If not, it generates a new random number with the required number of digits. Then, it invokes the SNS API to send an SMS message to the number provided. It specifies that the SMS type is Transactional - this ensures prompt delivery. The default is Promotional, which does not work when we want an OTP.
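The random_with_N_digits helper can be sanity-checked locally, independent of AWS - it returns a uniformly random integer with exactly n digits:

```python
from random import randint

def random_with_N_digits(n):
    # Same helper as in the Lambda: uniform over the smallest to the
    # largest n-digit number, inclusive
    range_start = 10 ** (n - 1)
    range_end = (10 ** n) - 1
    return randint(range_start, range_end)

otp = random_with_N_digits(6)  # always six digits, never a leading zero
```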

Note that when it inserts a record in the DB, it sets the TTL to 180 seconds more than the current time. Thus, the OTP is valid only for 3 minutes.

Similarly, we create the second Lambda Function - "otp-validate". This function validates the provided OTP against the phone number. Again, we create a new Lambda Function, but instead of creating a new Role, we pick the role we have already created.

Add the below code to the Lambda Function:

import json
import boto3

def lambda_handler(event, context):
    otp = event["otp"]
    phone = event["phone"]
    try:
        # Get the item from DynamoDB
        db = boto3.client("dynamodb")
        response = db.get_item(
            TableName="OTP",
            Key={
                "type": {"S": "OTP"},
                "id": {"S": phone}
            }
        )
        if otp == response["Item"]["otp"]["S"]:
            return {'statusCode': 200, 'body': json.dumps({'response': 'YES'})}
        else:
            return {'statusCode': 403, 'body': json.dumps({'response': 'NO'})}
    except Exception:
        return {'statusCode': 403, 'body': json.dumps({'response': 'NO'})}

This code tries to fetch the OTP stored in the database. If there is an OTP record and the value matches, it returns a 200 response code. Else, it returns a 403.
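One optional hardening step, not part of the code above, is to compare the OTPs with a constant-time comparison so the check does not leak information through response timing. Python's standard library provides this:

```python
import hmac

def otps_match(provided, stored):
    # hmac.compare_digest compares in constant time, unlike ==
    # str() normalizes either side in case the OTP arrives as a number
    return hmac.compare_digest(str(provided), str(stored))
```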

API Gateway

The Lambda functions are hidden inside the AWS cloud. If we want to invoke them from elsewhere - to use them as an API - we need the API Gateway. It is pretty simple to configure. Open the API Gateway Console and click on Create API. Then select "REST API" and click the Build button.

Next, we provide some of the basic settings:

This takes us to the API Configuration page, where we define the details of the API. We can add Resources and Methods using the Actions dropdown. For this project, we create two resources - generate and validate - and add a POST method to each of them. The POST methods are bound to the corresponding Lambda Functions.

Next, we have to enable CORS so that we can access the API from our development environment. Just click on Enable CORS in the Actions dropdown, and accept all the defaults. With everything in place, we can now "Deploy API" - also from the Actions dropdown. You will have to choose/define the stage; give it any name you like - dev, test, v1... are the popular ones. And finally, your API is deployed!
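Once deployed, each stage gets an invoke URL. As a sketch of how the front-end will call it (the URL below is a hypothetical placeholder - use the one shown on your own Stages page), the request for the generate resource carries the fields the Lambda reads from its event:

```python
import json

# Hypothetical invoke URL; replace with the one from your own Stages page
API_BASE = "https://abc123.execute-api.us-east-1.amazonaws.com/dev"

def generate_request(phone, digits=6):
    # POST body for /generate, matching the keys the Lambda reads from the event
    body = json.dumps({"phone": phone, "message": "Your OTP is: ", "digits": digits})
    return API_BASE + "/generate", body

url, body = generate_request("+15550001111")
```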

Select Stages in the menu on the left to configure the deployed API. Now our API is exposed to the world - anyone can invoke it. We need some precautions here. The most important rule we violated was enabling CORS for all origins; in a real-life project, we should never do this!

Another point we may want to consider is throttling, which limits the number of requests we accept per second. Since this is a test project, we can restrict it to 1, to guard against someone somewhere trying to harass us by bombarding our API. On a real API we need to be more careful, because a very low throttle threshold also makes us susceptible to a denial-of-service attack. But since this is a test project, we can ignore those aspects.

The Website - HTML & JavaScript

Now that we have the backend ready, let's begin with the frontend. In this blog, we will focus only on the functional aspects of the HTML, not on look and feel. We can always use good CSS to beautify it.

The HTML and JavaScript code is pretty simple - we can combine them into the index.html

<!doctype html>
<html lang="en">
<head>
    <meta charset="utf-8">
</head>
<body>
    <input type="number" id="phone"> <input type="button" value="Get OTP" onclick="otpGenerate()">
    <input type="number" id="otp"> <input type="button" value="Validate OTP" onclick="otpValidate()">

    <script src="" integrity="sha256-xNzN2a4ltkB44Mc/Jz3pT4iU1cmeR0FkXs4pru/JxaQ=" crossorigin="anonymous"></script>
    <script>
        ajaxCall = (url, object) => {
            $('html, body').css("cursor", "wait");
            $.ajax({
                type: "POST",
                crossDomain: false,
                url: url,
                contentType: "application/json",
                data: JSON.stringify(object),
                success: function (response) {
                    $('html, body').css("cursor", "auto");
                },
                error: function () {
                    $('html, body').css("cursor", "auto");
                }
            });
        };
        otpGenerate = () => {
            var phone = $("#phone").val();
            ajaxCall("https://API-URL/generate", { "phone": phone, "message": "Your OTP is: ", "digits": 6 });
        };
        otpValidate = () => {
            var phone = $("#phone").val();
            var otp = $("#otp").val();
            ajaxCall("https://API-URL/validate", { "phone": phone, "otp": otp });
        };
    </script>
</body>
</html>


This includes the jQuery library and defines two methods that invoke the APIs we defined above.

Hosting on S3/CloudFront

Finally, we come to hosting our website on AWS. We could use S3 alone to host a static website, but that restricts the connection to HTTP (not HTTPS), adding the ugly "Not Secure" warning in the browser. So I prefer the combination of S3 and CloudFront. To start with, just create a simple S3 bucket and put the above index.html file in it - no special configuration is required on this bucket. If you have read the blog this far, I am sure you know how to create an S3 bucket, so I won't waste time on that.

Next, go to the CloudFront console. Click on Create Distribution, and then choose Web as the delivery method. You will see a long form to fill for the new distribution. Click on the first field, "Origin Domain Name"; you will see the list of all your S3 buckets. Choose the one you just created, and much of the form will be auto-populated.

Most of it can remain at the defaults; we just have to modify a few fields.

Restrict Bucket Access

We want to make sure that external entities do not directly access our S3 bucket - all traffic has to flow through CloudFront. So we enable this feature, and it asks a few more questions. Use this as a reference:

Now, CloudFront will take care of configuring your S3 bucket. Leave everything else at the default values as you go down, and locate the field "Default Root Object". Set this to "index.html".

That is all you need to do. Click on Create Distribution, and you are done. Give it a few minutes, and you should have everything set up. You will see the URL for your distribution in the distribution list, under the Domain Name column. Try accessing that link...

There, you can validate your mobile with OTP. Like it? Let me know.


Remember to clean up all that you created:

  • CloudFront Distribution
  • API Gateway
  • Lambda Functions
  • S3 Bucket
  • IAM Role
  • DynamoDB table