# Network Packet Capture & Elasticsearch Indexer

A Node.js-based network packet capture tool that captures packets from network interfaces and indexes them into Elasticsearch for analysis and monitoring.

## Features

- 🔍 **Multi-interface capture**: Capture from one or multiple network interfaces simultaneously
- 🎯 **Flexible filtering**: Filter by protocol (TCP/UDP/ICMP), ports, and port ranges
- 🔒 **Promiscuous mode support**: Optionally capture all packets on the network segment
- 📊 **Elasticsearch integration**: Automatic indexing with optimized mapping
- 💾 **Failover cache**: In-memory cache for packets when Elasticsearch is unavailable
- 📝 **Content extraction**: Captures and indexes readable (ASCII) packet content
- 🚀 **Smart content handling**: Automatically skips large binary content while preserving packet metadata
- 📈 **Real-time statistics**: Track capture performance and statistics
- ⚙️ **Highly configurable**: Environment variables and config file support

## Prerequisites

- Node.js >= 14.0.0
- Elasticsearch 7.x or 8.x
- Root/Administrator privileges (required for packet capture)
- Linux: libpcap-dev (`apt-get install libpcap-dev`)
- macOS: Xcode Command Line Tools

### Installing System Dependencies

**Ubuntu/Debian:**
```bash
sudo apt-get update
sudo apt-get install libpcap-dev build-essential
```

**CentOS/RHEL:**
```bash
sudo yum install libpcap-devel gcc-c++ make
```

**macOS:**
```bash
xcode-select --install
```

## Installation

1. Clone or navigate to the project directory:
```bash
cd /path/to/netpcap
```

2. Install Node.js dependencies:
```bash
npm install
```

3. Copy the example environment file and configure:
```bash
cp .env.example .env
# Edit .env with your configuration
```

## Configuration

Configuration can be done via environment variables or by editing the `config.js` file directly.

### Elasticsearch Configuration

```bash
ES_NODE=http://localhost:9200
ES_USERNAME=elastic
ES_PASSWORD=your_password
ES_INDEX=network-packets
```
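
For reference, the variables above can be read in `config.js` roughly as follows. This is a minimal sketch with illustrative field names and fallbacks, not the tool's actual schema:

```javascript
// Sketch: read Elasticsearch settings from the environment, falling back
// to defaults when a variable is unset. Field names are illustrative.
const config = {
  elasticsearch: {
    node: process.env.ES_NODE || 'http://localhost:9200',
    username: process.env.ES_USERNAME || '',
    password: process.env.ES_PASSWORD || '',
    index: process.env.ES_INDEX || 'network-packets',
  },
};

module.exports = config;
```
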

### Capture Settings

**Interfaces:**
```bash
# Capture from specific interfaces
CAPTURE_INTERFACES=eth0,wlan0

# Leave empty to capture from all available interfaces
CAPTURE_INTERFACES=
```

**Promiscuous Mode:**
```bash
# Enable to capture all packets on the network segment
PROMISCUOUS_MODE=true
```

### Filtering

**Protocol Filtering:**
```bash
# Only capture specific protocols
FILTER_PROTOCOLS=tcp,udp

# Capture all protocols (leave empty)
FILTER_PROTOCOLS=
```

**Port Filtering:**
```bash
# Exclude specific ports (e.g., SSH, HTTP, HTTPS)
EXCLUDE_PORTS=22,80,443

# Exclude port ranges
EXCLUDE_PORT_RANGES=[[8000,9000],[3000,3100]]

# Only capture specific ports (takes precedence)
INCLUDE_PORTS=3306,5432
```
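
Port settings like these are typically compiled down to a single BPF expression before capture starts. A hedged sketch of that translation (the helper name and exact expression layout are assumptions, not the tool's actual code):

```javascript
// Illustrative helper: turn include/exclude port settings into a BPF
// filter string. INCLUDE_PORTS takes precedence, mirroring the
// "takes precedence" behavior described above.
function buildBpfFilter({ includePorts = [], excludePorts = [], excludeRanges = [] }) {
  if (includePorts.length > 0) {
    // Only capture the listed ports
    return includePorts.map((p) => `port ${p}`).join(' or ');
  }
  const clauses = [];
  for (const p of excludePorts) clauses.push(`not port ${p}`);
  // `portrange` is standard BPF syntax for an inclusive port interval
  for (const [lo, hi] of excludeRanges) clauses.push(`not portrange ${lo}-${hi}`);
  return clauses.join(' and ');
}

// Example: EXCLUDE_PORTS=22,80 with EXCLUDE_PORT_RANGES=[[8000,9000]]
const filter = buildBpfFilter({ excludePorts: [22, 80], excludeRanges: [[8000, 9000]] });
// filter === 'not port 22 and not port 80 and not portrange 8000-9000'
```
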

**Custom BPF Filter:**
```bash
# Use custom Berkeley Packet Filter syntax
CAPTURE_FILTER="tcp and not port 22"
```

### Content Indexing

```bash
# Maximum content size to index (1 MB default)
MAX_CONTENT_SIZE=1048576

# Enable/disable content indexing
INDEX_READABLE_CONTENT=true
```
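
The "skip large binary content, keep metadata" behavior can be sketched like this. Function names and the exact readability test are hypothetical; the real implementation may differ:

```javascript
// Illustrative content-handling decision: keep readable ASCII payloads up
// to MAX_CONTENT_SIZE; for anything larger or binary, index metadata only.
const MAX_CONTENT_SIZE = Number(process.env.MAX_CONTENT_SIZE) || 1048576;

function isReadableAscii(buf) {
  // Readable = every byte is printable ASCII or common whitespace (\t \n \r)
  for (const b of buf) {
    if ((b < 0x20 || b > 0x7e) && b !== 0x09 && b !== 0x0a && b !== 0x0d) {
      return false;
    }
  }
  return buf.length > 0;
}

function extractContent(payload) {
  if (payload.length > MAX_CONTENT_SIZE || !isReadableAscii(payload)) {
    // Too large or binary: preserve packet metadata only
    return { content_length: payload.length, content_type: 'binary' };
  }
  return {
    content: payload.toString('ascii'),
    content_length: payload.length,
    content_type: 'text',
  };
}
```
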

### Cache System (Elasticsearch Failover)

The application includes an in-memory cache system to handle Elasticsearch outages:

```bash
# Maximum documents to cache in memory (default: 10000)
CACHE_MAX_SIZE=10000

# ES availability check interval in milliseconds (default: 5000)
CACHE_CHECK_INTERVAL=5000
```

**How it works:**
- When Elasticsearch is unavailable, packets are stored in the in-memory cache
- The system periodically checks ES availability (every 5 seconds by default)
- When ES comes back online, cached documents are automatically flushed
- If the cache reaches its maximum size, the oldest documents are removed (FIFO)
- On graceful shutdown (SIGINT/SIGTERM), the system attempts to flush all cached documents

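The flow above can be modeled roughly as follows. This is a simplified sketch; the class and method names, and how the flush is wired to the `CACHE_CHECK_INTERVAL` timer, are assumptions rather than the tool's actual implementation:

```javascript
// Simplified model of the failover cache: FIFO eviction when full,
// flush hands everything to an indexing function (e.g. an ES bulk call)
// once Elasticsearch is reachable again.
class FailoverCache {
  constructor(maxSize = Number(process.env.CACHE_MAX_SIZE) || 10000) {
    this.maxSize = maxSize;
    this.docs = [];
  }

  add(doc) {
    if (this.docs.length >= this.maxSize) {
      this.docs.shift(); // drop the oldest document (FIFO)
    }
    this.docs.push(doc);
  }

  // Empty the cache into indexFn; returns how many documents were flushed.
  async flush(indexFn) {
    const pending = this.docs;
    this.docs = [];
    await indexFn(pending);
    return pending.length;
  }
}
```

In the real tool the flush would be triggered by the periodic availability check rather than called directly.
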
## Usage

### Basic Usage

Run with default configuration:
```bash
sudo npm start
```

Or directly:
```bash
sudo node index.js
```

### Capture from Specific Interface

```bash
sudo CAPTURE_INTERFACES=eth0 node index.js
```

### Capture Only HTTP/HTTPS Traffic

```bash
sudo INCLUDE_PORTS=80,443 FILTER_PROTOCOLS=tcp node index.js
```

### Exclude SSH and High Ports

```bash
sudo EXCLUDE_PORTS=22 EXCLUDE_PORT_RANGES=[[8000,65535]] node index.js
```

### Enable Promiscuous Mode

```bash
sudo PROMISCUOUS_MODE=true node index.js
```

### Debug Mode

```bash
sudo LOG_LEVEL=debug node index.js
```

## Elasticsearch Index Structure

The tool creates an index with the following document structure:

```json
{
  "@timestamp": "2026-02-11T10:30:00.000Z",
  "interface": {
    "name": "eth0",
    "ip": "192.168.1.100",
    "mac": "aa:bb:cc:dd:ee:ff"
  },
  "ethernet": {
    "src": "aa:bb:cc:dd:ee:ff",
    "dst": "11:22:33:44:55:66",
    "type": 2048
  },
  "ip": {
    "version": 4,
    "src": "192.168.1.100",
    "dst": "8.8.8.8",
    "protocol": 6,
    "ttl": 64,
    "length": 60
  },
  "tcp": {
    "src_port": 54321,
    "dst_port": 443,
    "flags": {
      "syn": true,
      "ack": false,
      "fin": false,
      "rst": false,
      "psh": false
    },
    "seq": 123456789,
    "ack_seq": 0,
    "window": 65535
  },
  "content": "GET / HTTP/1.1\r\nHost: example.com\r\n",
  "content_length": 1024,
  "content_type": "text"
}
```

## Querying Captured Data

### Example Elasticsearch Queries

**Find all packets from a specific IP:**
```bash
curl -X GET "localhost:9200/network-packets/_search?pretty" -H 'Content-Type: application/json' -d'
{
  "query": {
    "term": {
      "ip.src": "192.168.1.100"
    }
  }
}
'
```

**Find all SYN packets (connection attempts):**
```bash
curl -X GET "localhost:9200/network-packets/_search?pretty" -H 'Content-Type: application/json' -d'
{
  "query": {
    "bool": {
      "must": [
        { "term": { "tcp.flags.syn": true } },
        { "term": { "tcp.flags.ack": false } }
      ]
    }
  }
}
'
```

**Find packets with readable content:**
```bash
curl -X GET "localhost:9200/network-packets/_search?pretty" -H 'Content-Type: application/json' -d'
{
  "query": {
    "exists": {
      "field": "content"
    }
  }
}
'
```

## Performance Considerations

- **Promiscuous mode** can generate high packet volumes on busy networks
- **Content indexing** increases storage requirements significantly
- Use **port filters** to reduce captured packet volume
- Adjust `MAX_CONTENT_SIZE` based on your storage capacity
- Monitor Elasticsearch cluster health when capturing high-volume traffic
- The **cache system** protects against data loss during ES outages but consumes memory
- Adjust `CACHE_MAX_SIZE` based on available RAM (each cached packet takes roughly 1-5 KB of memory)

## Troubleshooting

### Permission Denied Errors

Packet capture requires root privileges:
```bash
sudo node index.js
```

### Interface Not Found

List available interfaces:
```bash
ip link show   # Linux
ifconfig       # macOS/Unix
```

### Elasticsearch Connection Failed

Verify Elasticsearch is running:
```bash
curl -X GET "localhost:9200"
```

### No Packets Being Captured

1. Check if the interface is up and receiving traffic
2. Verify the filter configuration isn't too restrictive
3. Try running without filters first
4. Check system firewall settings

## Security Considerations

⚠️ **Important Security Notes:**

- This tool captures network traffic, which may contain sensitive information
- Store Elasticsearch credentials securely
- Restrict access to the Elasticsearch index
- Be aware of privacy and legal implications when capturing network traffic
- Use encryption for Elasticsearch connections in production
- Comply with applicable laws and regulations

## License

MIT

## Author

ale

## Contributing

Contributions are welcome! Please feel free to submit issues or pull requests.