# ALB-Logs-to-Elasticsearch

Send ALB logs from an S3 bucket to Elasticsearch using AWS Lambda. This project is based on awslabs/amazon-elasticsearch-lambda-samples and blmr/aws-elb-logs-to-elasticsearch: sample code for an AWS Lambda function that fetches AWS ELB log files from S3, parses them, and adds them to an Amazon Elasticsearch Service domain.

## Deployment Package Creation

1. On your development machine, download and install Node.js.

2. Go to the root folder of the repository and install the node dependencies by running:

   ```sh
   npm install
   ```

   Verify that the dependencies are installed within the `node_modules` subdirectory.

3. Update the MaxMind GeoIP database if you're going to use it:

   ```sh
   cd node_modules/geoip-lite && npm run-script updatedb license_key=$MAXMIND_LICENSE_KEY
   ```

4. Create a zip file packaging `index.js` and the `node_modules` directory, as shown below.
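   For example, from the repository root (the archive name here is an arbitrary choice):

   ```sh
   zip -r alb-logs-to-es.zip index.js node_modules
   ```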

The zip file thus created is the Lambda Deployment Package.
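It can be uploaded when you create the Lambda function, or afterwards via the AWS CLI; a sketch, with a hypothetical function name:

```sh
aws lambda update-function-code \
  --function-name alb-logs-to-es \
  --zip-file fileb://alb-logs-to-es.zip
```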

## AWS Configuration

Set up the Lambda function and the S3 bucket. For more details, refer to the Lambda-S3 Walkthrough in the AWS documentation.

Please keep in mind the following notes and configuration overrides:

- The S3 bucket must be created in the same region as the Lambda function, so that it can push events to Lambda.

- When registering the S3 bucket as the data source in Lambda, add a filter for files with the `.log.gz` suffix, so that Lambda picks up only ALB log files.

- You need to set the following Lambda environment variables (a sketch of how they are consumed follows this list):

  - `ES_DOCTYPE`: the value used for the `type` field in Elasticsearch
  - `ES_ENDPOINT`: the FQDN of your AWS Elasticsearch Service domain
  - `ES_INDEX_PREFIX`: the prefix for your indices, which will be suffixed with the date
  - `ES_REGION`: the AWS region of your Elasticsearch instance, e.g. `us-west-1`
  - `ES_TIMESTAMP_FIELD_NAME`: the field name of event timestamps; defaults to `timestamp`
  - `ES_EXTRA_FIELDS`: a JSON object with static fields appended to each record, e.g. `{"environment":"foo", "deployment":"bar"}`
  - `ES_BULKSIZE`: the number of log lines to bulk-index into Elasticsearch at once; try 200
  - `GEOIP_LOOKUP_ENABLED`: set to `true` if you want to use GeoIP lookup
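As a minimal sketch (not the repository's actual code), this is roughly how a Node.js Lambda reads such variables; the fallback values shown are illustrative assumptions:

```javascript
// Sketch: reading the configuration from Lambda environment variables.
// Names match the list above; fallback values are illustrative only.
const config = {
    endpoint: process.env.ES_ENDPOINT,                        // FQDN of the ES domain
    region: process.env.ES_REGION,                            // e.g. 'us-west-1'
    indexPrefix: process.env.ES_INDEX_PREFIX,                 // suffixed with the date
    doctype: process.env.ES_DOCTYPE,
    timestampField: process.env.ES_TIMESTAMP_FIELD_NAME || 'timestamp',
    extraFields: JSON.parse(process.env.ES_EXTRA_FIELDS || '{}'),
    bulkSize: parseInt(process.env.ES_BULKSIZE || '200', 10), // log lines per bulk request
    geoipLookupEnabled: process.env.GEOIP_LOOKUP_ENABLED === 'true'
};
```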
The following authorizations are required:

1. Lambda permits S3 to push event notifications to it.
2. S3 permits Lambda to fetch the created objects from the given bucket.
3. ES permits Lambda to add documents to the given domain.
4. The Lambda handler is set to `index.handler`.
5. Don't forget the ES domain parameters in `index.js`.

The Lambda console provides a simple way to create an IAM role with policies for (1). For (2), when creating the IAM role, choose the "S3 execution role" option; this loads the role with permissions to read from the S3 bucket. For (3), add the following statement to the Elasticsearch domain's access policy (inside its `Statement` array) to permit ES operations by the role:

```json
{
    "Sid": "AllowLambdaAccess",
    "Effect": "Allow",
    "Principal": {
        "AWS": "arn:aws:iam::123456789012:role/lambda_s3_exec_role"
    },
    "Action": "es:*",
    "Resource": "arn:aws:es:eu-west-1:123456789012:domain/elastic-search-domain/*"
}
```

For (5), set the Elasticsearch domain parameters in `index.js`:

```javascript
var esDomain = {
    // FQDN of the Amazon Elasticsearch Service endpoint
    endpoint: 'elastic-search-domain-123456.us-east-1.es.amazonaws.com',
    // AWS region of the ES domain
    region: 'us-east-1',
    // the index name is suffixed with the date via indexTimestamp (see sketch below)
    index: 'alb-access-logs-' + indexTimestamp,
    doctype: 'alb-access-logs'
};
```
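The snippet above references `indexTimestamp`. A minimal sketch of how such a date suffix can be derived (an assumption for illustration, not necessarily the repository's exact code):

```javascript
// Sketch: build a YYYY.MM.DD suffix for daily indices,
// e.g. 'alb-access-logs-2024.01.31'.
var now = new Date();
var indexTimestamp = [
    now.getUTCFullYear(),
    ('0' + (now.getUTCMonth() + 1)).slice(-2),
    ('0' + now.getUTCDate()).slice(-2)
].join('.');
```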

### Event source

Add an event source for your Lambda function:

- Event source type: S3
- Bucket: `s3-elb-access-logs`
- Event type: Object Created (All)
- Suffix: `.log.gz`
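Equivalently, the notification can be configured from the AWS CLI. A sketch, assuming the bucket above plus a hypothetical function name and account ID:

```sh
aws s3api put-bucket-notification-configuration \
  --bucket s3-elb-access-logs \
  --notification-configuration '{
    "LambdaFunctionConfigurations": [{
      "LambdaFunctionArn": "arn:aws:lambda:eu-west-1:123456789012:function:alb-logs-to-es",
      "Events": ["s3:ObjectCreated:*"],
      "Filter": {"Key": {"FilterRules": [{"Name": "suffix", "Value": ".log.gz"}]}}
    }]
  }'
```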

## License

ASL (Amazon Software License): https://aws.amazon.com/asl/
