Tool for analyzing ELB logs. It automates retrieving, in JSON format, each client IP's user agent, total request count, the URLs requested along with their counts, and the HTTP methods used.
Downloads S3 bucket objects created within a specified time window.
-
Using Pip
python3 -m pip install elb-log-analyzer
-
Create an IAM policy with the configuration below:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "S3ListSpecificDirectory",
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::alb-log-bucket-name"
        },
        {
            "Sid": "S3GetSpecificDirectory",
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::alb-log-bucket-name/AWSLogs/XXXXXXXXXXXX/elasticloadbalancing/aws-region/*"
        }
    ]
}
Note: the above policy allows the user to list all contents of the bucket, but to download objects only from
s3://alb-log-bucket-name/AWSLogs/XXXXXXXXXXXX/elasticloadbalancing/aws-region/*
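Once saved to a file, the policy can be sanity-checked and then attached with the AWS CLI. A minimal sketch; the policy.json file name and the elb-log-reader policy name are placeholders, and the actual create-policy call is left commented out because it needs IAM permissions:

```shell
# Save the policy to a file and check that it is valid JSON.
cat > policy.json <<'EOF'
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "S3ListSpecificDirectory",
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::alb-log-bucket-name"
        },
        {
            "Sid": "S3GetSpecificDirectory",
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::alb-log-bucket-name/AWSLogs/XXXXXXXXXXXX/elasticloadbalancing/aws-region/*"
        }
    ]
}
EOF
python3 -m json.tool policy.json > /dev/null && echo "policy.json is valid JSON"
# Attach it (requires IAM permissions; policy name is a placeholder):
# aws iam create-policy --policy-name elb-log-reader --policy-document file://policy.json
```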
-
Create AWS access keys
-
Use the AWS CLI to configure access keys for boto3
aws configure
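For non-interactive environments (CI jobs, cron), boto3 will also pick credentials up from environment variables instead of the `aws configure` prompts; the values below are placeholders:

```shell
# Equivalent to `aws configure` for scripts; replace the placeholder values.
export AWS_ACCESS_KEY_ID='AKIAXXXXXXXXXXXXXXXX'
export AWS_SECRET_ACCESS_KEY='REPLACE_WITH_SECRET_KEY'
export AWS_DEFAULT_REGION='aws-region'
echo "AWS credentials loaded for region $AWS_DEFAULT_REGION"
```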
-
Print Help Menu.
python3 -m elb_log_analyzer.s3_log -h
-
Download all log files generated in the last 10 hours.
python3 -m elb_log_analyzer.s3_log -b elb-log-bucket -p 'alb-log-bucket-name/AWSLogs/XXXXXXXXXXXX/elasticloadbalancing/aws-region/' -H 10
-
Download all log files generated in the last 40 minutes.
python3 -m elb_log_analyzer.s3_log -b elb-log-bucket -p 'alb-log-bucket-name/AWSLogs/XXXXXXXXXXXX/elasticloadbalancing/aws-region/' -m 40
-
Download all log files generated in the last 20 seconds.
python3 -m elb_log_analyzer.s3_log -b elb-log-bucket -p 'alb-log-bucket-name/AWSLogs/XXXXXXXXXXXX/elasticloadbalancing/aws-region/' -s 20
-
Download all log files generated in the last 10 hours, 40 minutes, and 20 seconds, and store them in a directory.
python3 -m elb_log_analyzer.s3_log -b elb-log-bucket -p 'alb-log-bucket-name/AWSLogs/XXXXXXXXXXXX/elasticloadbalancing/aws-region/' --hours 10 --minutes 40 --seconds 20 -o './logs/downloads'
Analyzes downloaded log files.
-
Print Help Menu
python3 -m elb_log_analyzer -h
-
Print JSON data to the console
python3 -m elb_log_analyzer -i [INPUT_LOG_FILE_PATH]
-
Store JSON data in a file
python3 -m elb_log_analyzer -i [INPUT_LOG_FILE_PATH] -o [OUTPUT_FILE_PATH]
Note: INPUT_LOG_FILE_PATH can be a single log file or a directory containing log files with the .log extension.
-
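As an illustration of the directory case, only files with the .log extension matter to the analyzer; the sample_logs directory and file names below are made up:

```shell
# Build a throwaway logs directory with mixed file types.
mkdir -p sample_logs
touch sample_logs/app1.log sample_logs/app2.log sample_logs/notes.txt
# Only the .log files would be picked up when sample_logs is passed as input.
find sample_logs -name '*.log' | sort
```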
Get IP details from AbuseIPDB
python3 -m elb_log_analyzer -i [LOG_FILE_PATH] -t [REQUESTS_THRESHOLD_VALUE] -k [IP_ABUSE_DB_API_KEY] -o [OUTPUT_FILE_PATH]
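The threshold and API-key options query AbuseIPDB for reputation data on flagged IPs. The same lookup can be sketched manually against AbuseIPDB's v2 check endpoint; the key and IP below are placeholders, and the request itself is left commented out because it needs a real key and network access:

```shell
# Placeholder values; 203.0.113.7 is a documentation-range IP.
API_KEY='UPDATE_HERE'
IP='203.0.113.7'
echo "would check $IP against AbuseIPDB"
# curl -sG https://api.abuseipdb.com/api/v2/check \
#   --data-urlencode "ipAddress=$IP" -d maxAgeInDays=90 \
#   -H "Key: $API_KEY" -H "Accept: application/json"
```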
Send an alert to a Slack channel with abusive IP details.
-
Send alert from analyzed file
python -m elb_log_analyzer.alerts -w [SLACK_WEBHOOK] -f [ANALYZED_LOG_FILE_LOCATION]
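Under the hood, a Slack incoming webhook just receives a JSON payload over HTTP POST. A sketch that builds and validates such a payload locally; the message text is made up, and the actual send is commented out because it needs a real webhook URL:

```shell
# Build a Slack-style payload and check that it is valid JSON.
PAYLOAD='{"text": "ELB alert: 203.0.113.7 exceeded the request threshold"}'
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload ok"
# Send it (requires a real webhook URL):
# curl -s -X POST -H 'Content-Type: application/json' -d "$PAYLOAD" "$SLACK_WEBHOOK"
```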
Dashboard to visualize data.
-
Install requirements
python3 -m pip install -r dashboard/requirements.txt
-
Start App
streamlit run dashboard/app.py
-
Enter Log File/Directory Path
-
Using Poetry
python3 -m poetry publish --build --username [PYPI_USERNAME] --password [PYPI_PASSWORD]
-
Download log files
python3 -m elb_log_analyzer.s3_log -b elb-log-bucket -p 'alb-log-bucket-name/AWSLogs/XXXXXXXXXXXX/elasticloadbalancing/aws-region/' -H [HOURS] -o logs
-
Analyze Log Files
python3 -m elb_log_analyzer -i logs -o log.json -t [REQUEST_THRESHOLD] -k [IP_ABUSE_API_KEY]
-
Send an alert to Slack with client IPs whose total request count exceeds the threshold
python -m elb_log_analyzer.alerts -w [SLACK_WEBHOOK] -f [ANALYZED_LOG_FILE_LOCATION]
-
Visualize Analyzed Logs using Dashboard
streamlit run dashboard/app.py
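The download, analyze, and alert steps above can be chained into one script. A sketch with the bracketed values turned into shell variables; it is only written out and syntax-checked here, since actually running it needs AWS credentials and real parameter values:

```shell
cat > analyze_elb.sh <<'EOF'
#!/usr/bin/env bash
set -euo pipefail
# 1. Download log files from the last $HOURS hours.
python3 -m elb_log_analyzer.s3_log -b elb-log-bucket \
  -p 'alb-log-bucket-name/AWSLogs/XXXXXXXXXXXX/elasticloadbalancing/aws-region/' \
  -H "$HOURS" -o logs
# 2. Analyze them into a JSON report.
python3 -m elb_log_analyzer -i logs -o log.json \
  -t "$REQUEST_THRESHOLD" -k "$IP_ABUSE_API_KEY"
# 3. Alert Slack about IPs above the threshold.
python3 -m elb_log_analyzer.alerts -w "$SLACK_WEBHOOK" -f log.json
EOF
bash -n analyze_elb.sh && echo "script syntax OK"
```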
-
Pull image
docker pull dmdhrumilmistry/elb-log-analyzer
-
Create an .env file:
# bucket configuration
BUCKET_NAME='elb-logs-bucket-name'
BUCKET_PREFIX='AWSLogs/XXXXXXXX/elasticloadbalancing/eu-west-2/'

# SECRETS conf
REQUESTS_THRESHOLD=400
IP_ABUSE_DB_API_KEY='UPDATE_HERE'
SLACK_WEBHOOK='UPDATE_HERE'

# consts
DATE_SUFFIX="$(date '+%Y/%m/%d')"
LOG_ANALYSIS_INTERVAL=5
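The .env file can be sanity-checked locally by sourcing it before handing it to Docker; a sketch that writes a minimal copy (values taken from the example above, secrets omitted) and confirms the variables load:

```shell
# Write a minimal .env and load it into the current shell.
cat > .env <<'EOF'
BUCKET_NAME='elb-logs-bucket-name'
BUCKET_PREFIX='AWSLogs/XXXXXXXX/elasticloadbalancing/eu-west-2/'
REQUESTS_THRESHOLD=400
LOG_ANALYSIS_INTERVAL=5
EOF
set -a        # export every variable the next `source` defines
source .env
set +a
echo "bucket: $BUCKET_NAME, interval: ${LOG_ANALYSIS_INTERVAL} min"
```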
-
Start Container
docker run --env-file .env --rm dmdhrumilmistry/elb-log-analyzer