This Hackathon project is an AWS app consisting of:
- A data ingestion pipeline which allows adding movie data to an ElasticSearch index via:
  - An AWS Lambda function, exposed via a function URL.
  - The Lambda function sends the JSON payload to a Kinesis Data Stream.
  - A Kinesis Firehose Delivery Stream forwards the data to an ElasticSearch domain.
- A frontend / website which:
  - Provides a simple search interface to search for movies in the database.
  - The HTML page uses a vanilla JS script to query data through a second Lambda function.
  - This Lambda function performs a fuzzy query on the movie index in the ElasticSearch cluster.
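As a rough illustration of the ingestion path, here is a minimal sketch of what the ingestion Lambda handler could look like. It assumes a Python runtime, boto3, and a stream named `movie-events`; the runtime, stream name, and payload fields are assumptions for illustration, not taken from this repo.

```python
import json

import boto3

kinesis = boto3.client("kinesis")
STREAM_NAME = "movie-events"  # assumed name; the actual stream is defined by this stack


def handler(event, context):
    # A function URL invocation delivers the HTTP body as a string in event["body"].
    movie = json.loads(event["body"])

    # Forward the raw JSON payload to the Kinesis Data Stream; Firehose picks it up
    # from there and delivers it to the ElasticSearch domain.
    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(movie).encode("utf-8"),
        PartitionKey=movie.get("title", "unknown"),
    )

    return {"statusCode": 200, "body": json.dumps({"status": "queued"})}
```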
To run the sample locally with LocalStack:

- Clone this repo and `cd` into its working directory
- Install the following tools:
- Start LocalStack in the foreground so you can watch the logs:

  ```
  docker compose up
  ```

- Open another terminal window and `cd` into the same working directory
- Create the resources and trigger the invocation of the Lambda:

  ```
  ./run.sh
  ```
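`run.sh` already triggers an initial invocation; to index further movies you can POST additional JSON documents to the ingestion Lambda's function URL yourself. A minimal sketch, assuming Python 3; the URL below is only a placeholder for the actual function URL of the ingestion Lambda, and the payload fields are illustrative:

```python
import json
import urllib.request

# Placeholder: replace with the actual function URL of the ingestion Lambda.
FUNCTION_URL = "http://<url-id>.lambda-url.eu-west-1.localhost.localstack.cloud:4566/"

# Example movie document; the exact field names depend on the index mapping used by this app.
movie = {"title": "The Matrix", "year": 1999, "genre": "Science Fiction"}

request = urllib.request.Request(
    FUNCTION_URL,
    data=json.dumps(movie).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print(response.status, response.read().decode())
```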
- This sample does not yet run on AWS

Nice to have, but can be tackled later:

- Firehose -> ElasticSearch
  - Records are not properly delivered to ElasticSearch yet
- Search Lambda -> ElasticSearch
  - The Lambda needs to sign its HTTP requests to ElasticSearch (see the signing sketch at the end of this section)
- Simplify the S3 website URL in LocalStack
  - We need to use http://movie-search.s3.amazonaws.com:4566/index.html instead of the generated output: http://movie-search.s3-website-eu-west-1.amazonaws.com/
  - It works with http://movie-search.s3-website.localhost.localstack.cloud/
  - According to Ben, using s3-website.localhost.localstack.cloud is the way to go, as it safely distinguishes between buckets and websites; the links should be adapted in the README or in the generated output.
- HTTPS?
  - Because the function URLs have no proper certificate, we can only use the HTTP version: http://movie-search.s3-website.localhost.localstack.cloud:4566/
  - According to Ben, this is not supported by AWS, and is therefore not an issue.
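For the "Search Lambda -> ElasticSearch" item above, one possible approach is to sign the requests with botocore's SigV4 signer. The sketch below is only an illustration under assumptions: a Python runtime, the `requests` library, an `ES_ENDPOINT` environment variable, an index named `movies`, a `title` field, and the `eu-west-1` region are all assumed rather than taken from this repo. It also shows the kind of fuzzy query the search Lambda performs.

```python
import json
import os

import boto3
import requests
from botocore.auth import SigV4Auth
from botocore.awsrequest import AWSRequest

# Assumed names: the actual endpoint, index, field, and region may differ in this repo.
ES_ENDPOINT = os.environ["ES_ENDPOINT"]  # e.g. https://<domain>.eu-west-1.es.amazonaws.com
SEARCH_URL = f"{ES_ENDPOINT}/movies/_search"


def search_movies(term):
    # Fuzzy match on the movie title.
    query = {"query": {"match": {"title": {"query": term, "fuzziness": "AUTO"}}}}
    body = json.dumps(query)

    # Sign the request with SigV4 so the ElasticSearch domain accepts it.
    credentials = boto3.Session().get_credentials()
    aws_request = AWSRequest(
        method="POST",
        url=SEARCH_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    SigV4Auth(credentials, "es", "eu-west-1").add_auth(aws_request)

    response = requests.post(SEARCH_URL, data=body, headers=dict(aws_request.headers.items()))
    response.raise_for_status()
    return response.json()["hits"]["hits"]
```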