Build an Open API with AWS Lambda on OpenAPIHub
REST APIs are one of the core components of operationalisation. Through a REST API you can enable your model to work as a stand-alone software component and provide an easy integration channel for other software adopting your algorithm. It can improve your algorithm's adoption, scalability, reach and even profitability. Turning your brilliant algorithm into a REST API is easy!
What is a REST API?
An API, or Application Programming Interface, is code or a piece of software that allows software programs to communicate with each other.
What is an API and what does it do? Learn from the following 2-min video from the OpenAPIHub Team. Or you can read this article to learn about the basics of APIs – What is an API and what does it do
REST, short for "Representational State Transfer", is a software architectural style that was created to guide the design and development of the architecture for the World Wide Web. REST defines a set of constraints for how the architecture of an Internet-scale distributed hypermedia system, such as the Web, should behave. REST APIs make it easier for software systems and components to communicate with each other, especially when they are separate, decoupled microservices. Communication is done over HTTP using requests to GET, PUT, POST and DELETE data.
Then What is an Open API? Or OpenAPI?
Many people are confused about the distinction between "Open API" and "OpenAPI." They are, in fact, two distinct things.
Open APIs: Open Up API Access
The "Open" in "Open API" is an adjective describing how the API is used. In the old days, most APIs were considered "closed": the term "Closed API" refers to a technology that may only be used by the developer or the company that built it, mostly with proprietary standards and formats. In this sense, an Open API differs by embracing an open standard so that any developer can access it with lower barriers. Developers can obtain access to an open API over the Internet, but this does not mean the API is free of charge.
The value of Open APIs is that they give companies and partners universal access to internal software or systems. To make the most of that advantage, a company can expand its core functionality to a wider range of ecosystems by rethinking its business model around those ecosystems. Developers use an open API to gain access to functionality in a software program that would otherwise be difficult to reach without writing a lot of code and re-inventing the wheel. Today, most Open APIs are built in the REST style, which brings us to the term OpenAPI – there is no space between "Open" and "API" here.
OpenAPIs: A Technical Specification for Documenting APIs
OpenAPI is a specification for documenting and serving REST APIs. It is the current standard for API documentation and the successor to Swagger.
OpenAPI has been developed by leading experts in the field of API development to help developers better understand how to consume APIs, as well as to attract them to try out your new APIs. OpenAPI is supported by robust tools and platforms such as OpenAPIHub, which offers API documentation with an integrated test client, and by a variety of open-source solutions that let you generate up-to-date documentation for your APIs. Using OpenAPI helps ensure that the documentation you create covers your API fully and lists all parameters, methods and responses correctly.
The OpenAPI Initiative is the organization dedicated to the promotion and development of the OpenAPI Specification. An API created in accordance with the OpenAPI Specification does not have to be an Open API; it can also be a private (or closed) API.
And How about AWS Lambda? How does it work?
AWS Lambda is a serverless compute service provided by Amazon Web Services (AWS) that enables you to run code without provisioning or managing servers. AWS Lambda executes your code in response to events, such as AWS API calls or changes in data, and automatically manages the underlying compute resources for you. AWS Lambda can be used to power the backend of your web applications, process images and videos, and analyze data in real time. It is a serverless solution for building event-driven, highly scalable, fault-tolerant and cost-effective applications.
AWS Lambda offers several benefits when used to power the backend of your web applications, process images and videos, and analyze data in real time. These benefits include:
- Execution in response to events: AWS Lambda executes your code in response to events, such as AWS API calls or changes in data. This enables you to build event-driven, highly scalable, fault-tolerant, and cost-effective applications.
- No need to provision or manage servers: When using AWS Lambda, you do not need to provision or manage servers. AWS Lambda automatically manages the underlying compute resources for you. This enables you to focus on building your applications, rather than managing infrastructure.
- Cost-effective: AWS Lambda is a cost-effective solution that can be used to build highly scalable and fault tolerant applications. AWS Lambda charges you only for the compute time you consume – there is no charge for idle time. This enables you to save money on your AWS bill, as you are only charged for the resources you use.
With the power of AWS Lambda, it becomes much easier to create an Open API without worrying about underlying infrastructure maintenance.
Sample usage of AWS Lambda
Here is a typical usage example, extracted from the AWS website:
Developers can use Amazon Simple Storage Service (Amazon S3) to trigger AWS Lambda data processing in real time after an upload, or connect to an existing Amazon EFS file system to enable massively parallel shared access for large-scale file processing.
The Serverless Architecture with AWS Lambda
The Serverless Architecture is a way of building applications that do not require you to manage servers. In this type of architecture, Lambda functions are used to run code in response to events. Typically, it covers the following architectural elements in the stack:
- Computing Services (AWS Lambda in this case)
- Database Services (such as DynamoDB or RDS etc.)
- API or HTTP gateway Services (AWS API Gateway or OpenAPIHub etc.)
Pricing of AWS Lambda
AWS Lambda counts a request each time it starts executing in response to an event notification trigger, such as from Amazon Simple Notification Service (SNS) or Amazon EventBridge, or an invoke call, such as from Amazon API Gateway, or via the AWS SDK, including test invokes from the AWS Console. The AWS Lambda free tier includes one million free requests per month and 400,000 GB-seconds of compute time per month.
Duration is calculated from the time your code begins executing until it returns or otherwise terminates, rounded up to the nearest 1 ms. The price depends on the amount of memory you allocate to your function. In the AWS Lambda resource model, you choose the amount of memory you want for your function, and are allocated proportional CPU power and other resources. An increase in memory size triggers an equivalent increase in CPU available to your function.
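For example, a function configured with 512 MB (0.5 GB) of memory that runs for 200 ms consumes 0.5 GB × 0.2 s = 0.1 GB-seconds per invocation, so one million such invocations use 100,000 GB-seconds – within both the one-million-request and 400,000 GB-second limits of the free tier described above.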
For more detail, please check the latest official AWS document here – AWS Lambda Pricing
We've heard about OpenAPIHub – what is it?
OpenAPIHub is an AWS Software Partner with domain expertise in API platform technology. OpenAPIHub provides an all-in-one API platform that supports your API needs and has been certified as "AWS Qualified Software" since 2022.
API providers can integrate their APIs on OpenAPIHub and build a scalable business, while API consumers sign up for free and discover the APIs that work best for their projects and businesses. Through OpenAPIHub, both providers and consumers can bootstrap their API journey. OpenAPIHub provides direct integration with AWS Lambda so that you can develop and monetise your APIs in minutes. Please check the platform guide "Integrate APIs with OpenAPIHub with AWS Lambda" for more.
Check the following 1-min What is OpenAPIHub video for more and try OpenAPIHub for Free!
Hands-on tutorial with AWS Lambda & OpenAPIHub
The Algorithm
For this example we will use a simple algorithm – checking today's date from worldClock (https://github.com/guidesmiths/world-clock). It takes a timezone as input and returns the date. Maybe not the most exciting algorithm, but it is merely to show the concept; obviously you can replace the sample with any algorithm of your own. Let's write the algorithm as a JavaScript function.
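The original snippet is not reproduced in this copy of the post, so here is a minimal sketch of such a function. It uses Node's built-in Intl API rather than assuming the exact interface of the world-clock package, and the function name checkToday is illustrative.

```javascript
// checkToday.js – sample algorithm: return today's date for a given timezone.
// Uses the built-in Intl API; the en-CA locale yields a YYYY-MM-DD string.
function checkToday(timezone) {
  return new Intl.DateTimeFormat('en-CA', {
    timeZone: timezone,
    year: 'numeric',
    month: '2-digit',
    day: '2-digit',
  }).format(new Date());
}

module.exports = { checkToday };
```

For example, checkToday('Asia/Hong_Kong') returns something like "2024-01-31".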
Setup API Provider account on OpenAPIHub
Before turning your algorithm into an API and bootstrapping your API journey, prepare your OpenAPIHub account and start your journey as an API provider for FREE.
1) Visit the OpenAPIHub Website – https://www.openapihub.com/
2) Register a FREE Developer Account by clicking "Start for Free"
3) Once you have finished your Free Developer Profile Setup, you can access your Developer Admin Portal here
4) Access "Be a Provider" and create your API Provider profile
5) Create your API Provider Key by going to the "Admin API Credential" under Portal Setting
Now, you are set on the OpenAPIHub side and ready to connect with your AWS Lambda services. If you have any difficulties, you can still check our detailed guide here – "Become a Provider and Generate API Portals"
Download OpenAPIHub API Creation Tool Kit
Download the toolkit for free and turn your algorithm into a REST API.
- https://www.npmjs.com/package/@openapihub/oah-provider-cli
The toolkit provides you with the essential files for creating a REST API on AWS. Your algorithm will be wrapped as an AWS Lambda function and exposed via AWS API Gateway.
The toolkit currently supports algorithms in the form of a JS project – the example used here is a Node.js project.
Also note that an extra JSON file (event.json) is needed for local testing only; it is not required once the function is deployed.
Files in the tool kit:
- <application folder>/index.js – lambda handler and Algorithm logic
- <application folder>/package.json – application dependency file (package.json can be generated with the command "npm init"; optionally run "npm install" to generate package-lock.json as well)
- <application folder>/event.json – lambda event file for local testing
- template.yaml – file to configure the Lambda function and API Gateway
Setting up AWS/SAM CLI
This is a one-time action, required only if you have not set up the AWS/SAM CLI yet. It assumes you already have an AWS account and a profile with the rights to create Lambda functions and API Gateway resources.
On macOS:
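One common installation route is Homebrew (this assumes Homebrew is already installed; see the AWS documentation linked below for other options):

```bash
# Install the AWS SAM CLI via Homebrew
brew tap aws/tap
brew install aws-sam-cli

# Verify the installation
sam --version
```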
For Windows / Linux, please learn more here: https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html
Why is AWS/SAM CLI needed?
The AWS Serverless Application Model (SAM) is an open-source framework for building serverless applications. It provides shorthand syntax to express functions, APIs, databases, and event source mappings. With just a few lines per resource, you can define the application you want and model it using YAML. During deployment, SAM transforms and expands the SAM syntax into AWS CloudFormation syntax, enabling you to build serverless applications faster.
REST API Creation
Preparation
1. Update index.js to integrate with Lambda. There are three sections to be changed – "import module", "define query input" and "application logic". The template also covers header, path and body as input, and HTTP methods other than GET; that code is there for advanced settings if required. In this example a query string parameter and the HTTP method GET are used (see the handler sketch after this list).
2. Update package.json, which can be generated with the command "npm init". Optionally run "npm install" to generate package-lock.json as well.
3. Update event.json for local testing. It represents the HTTP request and passes queryStringParameters into the Lambda event; it can pass a header and body to the local test of the Lambda function as well (a sample event follows the handler sketch below).
4. Update the endpoint block in template.yaml for your Lambda. The Type, Handler and Runtime values should not need to be changed; adjust the other values to your situation. Make sure Runtime is correct for the Lambda, CodeUri points to the application folder, and the Path is defined according to your swagger file. Append another endpoint block if the Lambda shares the same API Gateway.
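A minimal sketch of how index.js and event.json might look for the world-clock example follows. The checkToday import, the "timezone" query parameter and the UTC default are illustrative assumptions, not the toolkit's exact template.

```javascript
// index.js – Lambda handler wrapping the sample algorithm (sketch).
const { checkToday } = require('./checkToday'); // "import module" section

exports.handler = async (event) => {
  // "define query input": read the timezone from the query string parameters.
  const timezone = (event.queryStringParameters || {}).timezone || 'UTC';

  // "application logic": run the algorithm and return an API Gateway-style response.
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ timezone, today: checkToday(timezone) }),
  };
};
```

And a matching event.json that passes queryStringParameters into the Lambda event for local testing:

```json
{
  "httpMethod": "GET",
  "queryStringParameters": { "timezone": "Asia/Hong_Kong" }
}
```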
Below is an example of template.yaml –
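The original template is not reproduced in this copy of the post, so the following is a minimal sketch of what it might look like for this example. The logical resource name, CodeUri folder, runtime version and Path are illustrative assumptions; the WorldClockAPIEndpoint event name matches the deploy prompt shown later.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: World clock sample API

Resources:
  WorldClockFunction:                  # logical name is an assumption
    Type: AWS::Serverless::Function    # do not change
    Properties:
      CodeUri: world-clock/            # points to the application folder
      Handler: index.handler           # file name . exported handler
      Runtime: nodejs18.x              # make sure this matches your Node.js runtime
      Events:
        WorldClockAPIEndpoint:         # endpoint block exposed via API Gateway
          Type: Api
          Properties:
            Path: /today               # define according to your swagger file
            Method: get
```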
Creating and testing lambda function
Run the sam build command, in the same directory as template.yaml, to create the Lambda package for deployment.
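For example, from the folder that contains template.yaml:

```bash
sam build
```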
The build result can be found in .aws-sam/build.
After building, you can do local testing with the sam local invoke command, using the JSON event file created previously –
For example, in this case it would be –
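Assuming the logical function name and event file path from the sketches above:

```bash
sam local invoke WorldClockFunction --event world-clock/event.json
```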
If there are any issues, fix the code and run "sam build" again before re-testing.
Deploying the algorithm as a Lambda function and exposing it via AWS API Gateway
1. First, make sure you have an AWS profile with a role that has the rights to create Lambda functions and API Gateway resources
2. Now we can deploy it, also on the command line:
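The guided prompts below are produced by SAM's guided deployment, started with:

```bash
sam deploy --guided
```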
Setting default arguments for 'sam deploy'
=========================================
Stack Name [sam-app]: trialapiendpoint
AWS Region [us-west-2]: ap-east-1
#Shows you resources changes to be deployed and require a 'Y' to initiate deploy
Confirm changes before deploy [y/N]: y
#SAM needs permission to be able to create roles to connect to the resources in your template
Allow SAM CLI IAM role creation [Y/n]: Y
#Preserves the state of previously provisioned resources when an operation fails
Disable rollback [y/N]: N
WorldClockAPIEndpoint may not have authorization defined, Is this okay? [y/N]: y
Save arguments to configuration file [Y/n]: Y
SAM configuration file [samconfig.toml]: samconfig.toml
SAM configuration environment [default]: beta
Enter "y" if the change looks fine.
It should be deployed afterwards. Check the following on AWS to ensure everything has been successfully deployed –
Lambda on AWS Console
API Gateway
Deployment has been done successfully. You can now test your endpoint with Postman or cURL to make sure it works.
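A quick cURL check might look like the following – the invoke URL is a placeholder, so use the endpoint shown in your API Gateway console; the Prod stage and /today path follow the sketches above:

```bash
curl "https://<api-id>.execute-api.ap-east-1.amazonaws.com/Prod/today?timezone=Asia/Hong_Kong"
```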
Congratulations! You have created an API for your algorithm.
Publish Your Algorithm API on OpenAPIHub
Onboarding to OpenAPIHub can help you manage and promote your algorithm API. The OpenAPIHub API Creation Toolkit includes the tools to onboard your algorithm API on OpenAPIHub.
The OpenAPIHub CLI is an npm package:
- https://www.npmjs.com/package/@openapihub/oah-provider-cli
OpenAPIHub Toolkit for API onboarding
- openapihub/openapihub.yml – OpenAPIHub config file
- openapihub/swagger.json – Your API swagger file. You can use your own swagger file or let the OpenAPIHub toolkit download the API definition from AWS API Gateway
- openapihub/readme.md – Your API readme file
- oah_api_onboard.js – JS script for OpenAPIHub onboarding
Publish your Algorithm API via OpenAPIHub CLI Tool
1. Install via npm (see the command sketch after this list)
2. Run the command
3. Enter your username and password after the browser opens
4. Get your OpenAPIHub Portal Name on https://provider-portal.openapihub.com
5. Take examples/api-creation.json as an example and update your API information
6. Prepare your swagger file
7. Run the command to onboard your API on OpenAPIHub
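The exact CLI commands for steps 1, 2 and 7 are not reproduced in this copy of the post. The install step below is the standard npm one (a global install is assumed), and running the bundled oah_api_onboard.js script for step 7 is an assumption – check the toolkit's documentation for the authoritative commands:

```bash
# Step 1: install the OpenAPIHub provider CLI from npm (global install assumed)
npm install -g @openapihub/oah-provider-cli

# Step 7 (assumption): run the onboarding script included in the toolkit
node oah_api_onboard.js
```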
It takes a few minutes to complete the process. You will receive an email after OpenAPIHub has validated and processed your swagger file.
Congratulations! You have onboarded your Algorithm API to OpenAPIHub.
You can check your API portal via the link returned from the script. Furthermore, you can customise your provider portal via https://provider-portal.openapihub.com and enjoy the API marketplace, monetisation features and more.
Start using OpenAPIHub for FREE now! OpenAPIHub is a SaaS API platform that helps you bootstrap API business collaboration in minutes. Check the following 1-min What is OpenAPIHub video for more and try OpenAPIHub for Free!
Ready to Start Your API Journey?
Join OpenAPIHub and connect to the fast growing API Hub Community to grow your API Business today!
Source: https://blog.openapihub.com/en-us/build-an-open-api-with-aws-lambda-on-openapihub/