🚀 ELK Stack Implementation

Complete Step-by-Step Visual Guide

📊 Final Architecture

📄 Filebeat (Log Shipper)
    → ⚙️ Logstash (Log Processor, port 5044)
    → 🔍 Elasticsearch (Search Engine, port 9200)
    → 📈 Kibana (Visualization, port 5601)
🔧 Phase 1: Prerequisites & Setup

Prepare your Ubuntu 22.04 system with the dependencies and repositories needed for the ELK Stack installation.

Step 1: System Update & Java Installation

Update the package index and install the Java 17 runtime (note that Elasticsearch and Logstash 8.x also bundle their own JDK, so this is a safety net rather than a hard requirement):

sudo apt update
sudo apt install openjdk-17-jre -y

✅ Java 17 installed successfully!
Step 2: Add the Elastic Repository

Import the Elastic GPG key and add the official Elastic repository:

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elasticsearch-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/elasticsearch-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list

Note: This adds the official Elastic 8.x repository to your system.
📦 Phase 2: Core Components

Install and configure Elasticsearch and Kibana, the core components that handle data storage, search, and visualization.

Step 3: Install Elasticsearch

Refresh the package index so apt picks up the new repository, then install and start the Elasticsearch service:

sudo apt update
sudo apt install elasticsearch -y
sudo systemctl start elasticsearch
sudo systemctl enable elasticsearch

✅ Elasticsearch is running on port 9200!
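Before moving on, it helps to know roughly what a healthy node answers on port 9200. The snippet below fakes that reply with illustrative values (the node name and version number are made up) and pulls out the tagline, just to show the shape of the JSON you should expect from the real curl check later:

```shell
# Illustrative copy of the JSON a healthy node returns on port 9200;
# the node name and version number here are invented.
cat > /tmp/es_root.json <<'EOF'
{
  "name" : "elk-node-1",
  "cluster_name" : "elasticsearch",
  "version" : { "number" : "8.13.0" },
  "tagline" : "You Know, for Search"
}
EOF

# Quick smoke test: a real reply always carries this tagline.
grep -o '"tagline" : "[^"]*"' /tmp/es_root.json
```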
Step 4: Install & Configure Kibana

Install Kibana and configure network access:

sudo apt install kibana -y
sudo nano /etc/kibana/kibana.yml

Set the following in kibana.yml:

server.port: 5601
server.host: "0.0.0.0"

(Binding to 0.0.0.0 exposes Kibana on every interface; restrict access with a firewall or reverse proxy in production.)

sudo systemctl start kibana
sudo systemctl enable kibana

✅ Kibana accessible at http://YOUR_IP:5601
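For reference, the edited section of kibana.yml ends up looking like this; the elasticsearch.hosts line is usually present (commented out) in the stock file and is shown here only for context:

```yaml
# /etc/kibana/kibana.yml
server.port: 5601                                # default Kibana port
server.host: "0.0.0.0"                           # listen on all interfaces
elasticsearch.hosts: ["http://localhost:9200"]   # where Kibana finds Elasticsearch
```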
🔄 Phase 3: Data Pipeline

Set up Filebeat and Logstash to create a complete data pipeline for log collection, processing, and forwarding.

Step 5: Install & Configure Filebeat

Install Filebeat and enable its system module:

sudo apt install filebeat -y
sudo filebeat modules enable system
sudo nano /etc/filebeat/filebeat.yml

In filebeat.yml, comment out the Elasticsearch output and configure the Logstash output instead:

# output.elasticsearch:
#   hosts: ["localhost:9200"]

output.logstash:
  hosts: ["localhost:5044"]

sudo systemctl start filebeat
sudo systemctl enable filebeat

(Filebeat will log connection errors and keep retrying until Logstash comes up in the next step.)
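Filebeat refuses to start when more than one output is enabled, so the edit above is worth sanity-checking. A minimal sketch, staging the output section in /tmp (the path and the grep check are illustrative; on the real host this content lives inside /etc/filebeat/filebeat.yml):

```shell
# Stage the intended output section in a scratch file.
cat > /tmp/filebeat-output.yml <<'EOF'
# output.elasticsearch:
#   hosts: ["localhost:9200"]

output.logstash:
  hosts: ["localhost:5044"]
EOF

# Filebeat allows only one active output, so verify the
# Elasticsearch block really is commented out.
if grep -q '^output.elasticsearch:' /tmp/filebeat-output.yml; then
  echo "ERROR: elasticsearch output still enabled"
else
  echo "OK: only the logstash output is active"
fi
```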
Step 6: Install & Configure Logstash

Install Logstash and create a processing pipeline:

sudo apt install logstash -y
sudo nano /etc/logstash/conf.d/02-beats-input.conf

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

sudo systemctl start logstash
sudo systemctl enable logstash

✅ Data pipeline is active!
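On the server itself, the authoritative check is Logstash's own config test, e.g. /usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/02-beats-input.conf. As a cheap stand-in that runs anywhere, the sketch below writes the pipeline to a temp file (path illustrative) and verifies the braces balance:

```shell
# Copy of the pipeline from the step above, written to a scratch path.
cat > /tmp/02-beats-input.conf <<'EOF'
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
EOF

# Cheap structural check: every { must have a matching }.
# (This also counts the %{...} sprintf references in the index name.)
opens=$(grep -o '{' /tmp/02-beats-input.conf | wc -l)
closes=$(grep -o '}' /tmp/02-beats-input.conf | wc -l)
if [ "$opens" -eq "$closes" ]; then
  echo "braces balanced"
else
  echo "brace mismatch: $opens opens vs $closes closes"
fi
```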
📊 Phase 4: Visualization & Alerting

Configure Kibana for data exploration, create visualizations, and set up automated alerting for security monitoring.

Step 7: Create an Index Pattern

Set up an index pattern (called a Data View in recent Kibana 8.x releases) so Kibana can query your log data:

Steps in the Kibana UI:
1. Navigate to Stack Management → Index Patterns
2. Click "Create index pattern"
3. Enter the pattern: filebeat-*
4. Select the time filter field: @timestamp

✅ Index pattern created successfully!
Step 8: Set Up Security Alerting

Configure an automated alert for SSH brute-force attempts:

SSH Failed Login Alert:
1. Go to Stack Management → Rules and Connectors
2. Create a rule of type: Log threshold
3. Condition: document count within the last 5 minutes
4. Filter: system.auth.ssh.event : "Failed"
5. Threshold: above 5
6. Configure a notification action (email, Slack, etc.)

✅ Security monitoring active!
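The rule above essentially automates a count you could do by hand. This sketch replays the same logic against a tiny fabricated auth.log sample (hostnames and IPs are invented) so you can see exactly what the threshold compares; on a real host, Filebeat's system module ships /var/log/auth.log into the pipeline for you:

```shell
# Fabricated auth.log excerpt for illustration only.
cat > /tmp/auth_sample.log <<'EOF'
Jan 10 10:01:02 web1 sshd[123]: Failed password for root from 203.0.113.9 port 50022 ssh2
Jan 10 10:01:05 web1 sshd[124]: Accepted password for alice from 198.51.100.4 port 50188 ssh2
Jan 10 10:01:09 web1 sshd[125]: Failed password for invalid user bob from 203.0.113.9 port 50031 ssh2
EOF

# Same idea as the Kibana rule: count failures, compare to threshold 5.
failures=$(grep -c 'Failed password' /tmp/auth_sample.log)
echo "failed logins: $failures"
if [ "$failures" -gt 5 ]; then
  echo "ALERT: possible SSH brute force"
else
  echo "below threshold (5), no alert"
fi
```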

🔍 System Verification

Verify your ELK Stack installation with these checks:

🔍 Elasticsearch

curl -X GET "localhost:9200/"

Should return JSON with the cluster name and version. (Note: Elasticsearch 8.x enables TLS and authentication by default on a fresh install; if the plain-HTTP curl is rejected, use https:// with the elastic user's credentials, or disable security for lab use.)

📊 Kibana

Access: http://YOUR_IP:5601

Should show Kibana login page

📄 Filebeat

sudo systemctl status filebeat

Should show "active (running)"

⚙️ Logstash

sudo systemctl status logstash

Should show "active (running)"