
Serverless REST API "quick and dirty" in 5 minutes

Hi, Habr! Today we will continue talking about the capabilities that Amazon Web Services gives us and how to use them to solve practical application problems.

Using a simple example, we will build, in just a few minutes, our own serverless, auto-scaling REST API, working through one case: getting the list of items for a resource.

Interested? Then read on!

Instead of an introduction


We will not use any databases in this example; instead, our data source will be a plain text file stored in AWS S3.


The architecture of the developed system




The Amazon Web Services components used:

• AWS S3 - storage for the source data file;
• AWS IAM - the role that allows Lambda to read from S3;
• AWS Lambda - the function that parses the data and builds the response;
• AWS API Gateway - the REST access point, protected with an API Key.


Data preparation


A text file with tab-separated fields will be used as the data source for responding to REST GET requests. The content itself is not important for this example, but to make the API useful later on, I exported the current trading table for ruble-denominated bonds from the Quik trading terminal, saved it to the bonds.txt file, and uploaded this file to a specially created AWS S3 bucket.
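
If you prefer to upload the file from code rather than through the S3 console, a minimal boto3 sketch might look like this (the bucket name your-s3-bucket is a placeholder; substitute your own):

    import boto3

    # Upload the source file to the S3 bucket the API will read from.
    # "your-s3-bucket" is a placeholder name - replace it with your bucket.
    s3 = boto3.client('s3')
    s3.upload_file('bonds.txt', 'your-s3-bucket', 'bonds.txt')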

An example of the resulting data is shown in the figure below:



Next, we need to write a function that will read the data from the bonds.txt file, parse it, and return it on request. AWS Lambda is a perfect fit for this. But first, we need to create a new role that will allow the Lambda function to read data from the bucket in AWS S3.

Creating a role for AWS Lambda


  1. In the AWS Management Console, go to the AWS IAM service, open the “Roles” tab and click the “Create role” button;

    Adding a new role

  2. The role we are creating will be used by AWS Lambda to read data from AWS S3. Therefore, in the next step, under “Select type of trusted entity” choose “AWS service”, under “Choose the service that will use this role” choose “Lambda”, and click the “Next: Permissions” button.

    Role for the Lambda service

  3. Now you need to attach the access policies that the new role will use. Since the list of policies is quite long, type “S3” into the policy filter; this narrows the list down to the policies related to the S3 service. Check the box next to the “AmazonS3ReadOnlyAccess” policy and click the “Next: Tags” button.

    Policies for Roles

  4. The “Add tags (optional)” step is optional; if you wish, you can add tags to the role. We will skip it and proceed to the final step, the review. Here you need to set the role name - “ForLambdaS3-ReadOnly” - add a description and click the “Create role” button.

    Role name


That's it: the role is created and we can use it in the rest of the walkthrough.
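
The same role can also be created programmatically. Below is a rough boto3 sketch of the equivalent steps (the trust policy simply allows the Lambda service to assume the role; this is an illustration, not part of the console walkthrough above):

    import json
    import boto3

    iam = boto3.client('iam')

    # Trust policy: let the AWS Lambda service assume this role.
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole"
        }]
    }

    iam.create_role(
        RoleName='ForLambdaS3-ReadOnly',
        AssumeRolePolicyDocument=json.dumps(trust_policy),
        Description='Read-only access to S3 for Lambda functions'
    )

    # Attach the managed read-only S3 policy chosen in the console above.
    iam.attach_role_policy(
        RoleName='ForLambdaS3-ReadOnly',
        PolicyArn='arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess'
    )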

Creating a new function in AWS Lambda


  1. Go to the AWS Lambda service and click on the “Create function” button:

    Function creation


    Fill in all fields as shown in the screenshot below:

    • Name - "getAllBondsList";
    • Runtime - "Python 3.6";
    • Role - "Choose an existing role";
    • Existing role - here we choose the role created above, ForLambdaS3-ReadOnly.

    Name and role selection

  2. All that remains is to write the function code and test it on various test events. It should be noted that the main workhorse of any Lambda function (if you use Python) is the boto3 library:

    import boto3

    # Connect to S3 and fetch the bonds.txt object from the bucket.
    s3 = boto3.resource('s3')
    bucket = s3.Bucket('your-s3-bucket')
    obj = bucket.Object(key='bonds.txt')
    response = obj.get()

    The basic idea of our Python function is as follows:

    • Open the bonds.txt file;
    • Read the column headers;
    • Split the records into pages (10 records per page in our case);
    • Select the requested page;
    • Map the column names onto the records;
    • Return the result as a list of dictionaries.

    Let's not spend too much time on the function code and its technical implementation; everything here is quite simple, and the full code is available on my GitHub. A key fragment and a short illustrative sketch are shown below.

      # Map column names onto each record of the selected page
      for i in range(0, len(lines_proc)):
          d = dict((u''.join(key), u''.join(value))
                   for (key, value) in zip(headers, lines_proc[i].split("\t")))
          response_body.append(d)

      return {
          'statusCode': 200,
          'page': num_page,
          'body': response_body
      }
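
    To tie the bullet points above together, here is a compact, purely illustrative sketch of what such a handler could look like (it is not the exact code from GitHub; the bucket name your-s3-bucket and the page size of 10 records are assumptions):

      import boto3

      PAGE_SIZE = 10  # assumed number of records per page
      s3 = boto3.resource('s3')

      def lambda_handler(event, context):
          # Validate the 'page' parameter; anything non-numeric is rejected.
          try:
              num_page = int(event['page'])
          except (KeyError, TypeError, ValueError):
              return {'statusCode': 415,
                      'body': "Please check the 'page' parameter"}

          # Read the tab-separated file from S3 and split it into lines.
          obj = s3.Bucket('your-s3-bucket').Object(key='bonds.txt')
          lines = obj.get()['Body'].read().decode('utf-8').splitlines()

          headers = lines[0].split('\t')   # the first line holds column names
          records = lines[1:]

          # Select the requested page.
          start = (num_page - 1) * PAGE_SIZE
          lines_proc = records[start:start + PAGE_SIZE]

          # Map column names onto each record and collect the dictionaries.
          response_body = [dict(zip(headers, line.split('\t')))
                           for line in lines_proc]

          return {'statusCode': 200, 'page': num_page, 'body': response_body}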

    Insert the code (or write your own :)) into the “Function code” block and click on the “Save” button in the upper right corner of the screen.

    Insert code

  3. Creating test events. After inserting the code, the function is ready to run and test. Click the “Test” button and create several test events, i.e. invocations of lambda_handler with different parameters (a small sketch reproducing the same events locally follows the list). Namely:

    • Invoking the function with the parameter 'page': '100';
    • Invoking the function with the parameter 'page': '1000000';
    • Invoking the function with the parameter 'page': 'bla-bla-bla';
    • Invoking the function without the 'page' parameter.
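
    The same events can be reproduced locally by calling the handler directly, which is handy before touching the console. A small sketch (the module name lambda_function is the Lambda default and is assumed here):

      # Hypothetical local check that mirrors the console test events.
      from lambda_function import lambda_handler  # assumed module name

      test_events = [
          {'page': '100'},
          {'page': '1000000'},
          {'page': 'bla-bla-bla'},
          {},  # no 'page' parameter at all
      ]

      for event in test_events:
          result = lambda_handler(event, None)
          print(event, '->', result['statusCode'])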

    Test Event Page100


    Let's run the created function with the Page100 test event (page == 100). As you can see from the screenshot below, the function completed successfully, returned status 200 (OK), and produced the set of records that corresponds to the hundredth page of the paginated data.

    Launch test event Page100


    For the sake of completeness, let's launch one more test event, “PageBlaBlaBla”. In this case the function returns code 415 along with a message asking you to check that the passed parameters are correct:

    Test Event PageBlaBlaBla


    Running the PageBlaBlaBla test event



API creation


After all the remaining cases have been tested and we are confident that the Lambda function works as expected, we move on to creating the API itself. We will create an access point in AWS API Gateway for the Lambda function built above and additionally protect it against unwanted calls with an API Key.
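
Once the access point and the API Key exist, calling the finished API from client code might look like the sketch below (the URL and key are placeholders; API Gateway expects the key in the x-api-key header):

    import requests  # third-party HTTP client

    # Placeholder values - substitute the ones from your API Gateway setup.
    API_URL = 'https://your-api-id.execute-api.us-east-1.amazonaws.com/prod/bonds'
    API_KEY = 'your-api-key'

    resp = requests.get(API_URL, params={'page': '100'},
                        headers={'x-api-key': API_KEY})
    print(resp.status_code)
    print(resp.json())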


Conclusions and summary


In this article, we looked at creating a serverless, auto-scaling REST API using Amazon's cloud services. The article turned out to be far from the shortest, but I tried to explain the entire process of creating the API in as much detail as possible and to collect the whole sequence of steps in one place.

I am sure that after repeating the steps described in this article once or twice, you will be able to spin up your own cloud API in 5 minutes or even faster.

Thanks to its relative simplicity, low cost, and power, the AWS API Gateway service gives developers ample opportunities for use both at work and in commercial projects. To reinforce the material in this article, try getting the free annual tier of Amazon Web Services and walking through the steps above to create a REST API yourself.

I am happy to discuss any questions and suggestions. I look forward to your comments on the article and wish you success!

Source: https://habr.com/ru/post/435180/

