# Deployment Guide
This guide covers deploying the Hasher application to production.
## Prerequisites
- Node.js 18.x or higher
- Elasticsearch 8.x cluster
- Domain name (optional, for custom domain)
- SSL certificate (recommended for production)
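A quick way to confirm the prerequisites are in place (the Elasticsearch URL below is a placeholder for your own node):
```bash
# Check the Node.js version (should report v18.x or newer)
node --version

# Check that the Elasticsearch cluster responds (replace the URL with your node)
curl http://localhost:9200/_cluster/health?pretty
```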
## Deployment Options
### Option 1: Vercel (Recommended for Next.js)
Vercel provides seamless deployment for Next.js applications.
#### Steps:
1. **Install Vercel CLI**:
```bash
npm install -g vercel
```
2. **Login to Vercel**:
```bash
vercel login
```
3. **Deploy**:
```bash
vercel
```
4. **Set Environment Variables**:
- Go to your project settings on Vercel
- Add environment variable: `ELASTICSEARCH_NODE=http://your-elasticsearch-host:9200`
- Redeploy: `vercel --prod`
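Alternatively, the variable can be added from the CLI before redeploying; a minimal sketch using `vercel env add`:
```bash
# Add the variable to the production environment (the CLI prompts for the value)
vercel env add ELASTICSEARCH_NODE production
```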
#### Important Notes:
- Ensure Elasticsearch is accessible from Vercel's servers
- Consider using Elastic Cloud or a publicly accessible Elasticsearch instance
- Use environment variables for sensitive configuration
---
### Option 2: Docker
Deploy using Docker containers.
#### Create Dockerfile:
```dockerfile
# Create this file: Dockerfile
FROM node:18-alpine AS base
# Install dependencies only when needed
FROM base AS deps
RUN apk add --no-cache libc6-compat
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci
# Rebuild the source code only when needed
FROM base AS builder
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .
ENV NEXT_TELEMETRY_DISABLED=1
RUN npm run build
# Production image, copy all the files and run next
FROM base AS runner
WORKDIR /app
ENV NODE_ENV=production
ENV NEXT_TELEMETRY_DISABLED=1
RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs
COPY --from=builder /app/public ./public
COPY --from=builder --chown=nextjs:nodejs /app/.next/standalone ./
COPY --from=builder --chown=nextjs:nodejs /app/.next/static ./.next/static
USER nextjs
EXPOSE 3000
ENV PORT=3000
ENV HOSTNAME="0.0.0.0"
CMD ["node", "server.js"]
```
#### Update next.config.ts:
```typescript
import type { NextConfig } from 'next';

const nextConfig: NextConfig = {
  output: 'standalone',
};

export default nextConfig;
```
#### Build and Run:
```bash
# Build the Docker image
docker build -t hasher:latest .
# Run the container
docker run -d \
  -p 3000:3000 \
  -e ELASTICSEARCH_NODE=http://elasticsearch:9200 \
  --name hasher \
  hasher:latest
```
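Note that the hostname `elasticsearch` in `ELASTICSEARCH_NODE` only resolves if both containers share a user-defined Docker network (the Docker Compose setup below handles this automatically). A minimal sketch for plain `docker run`, with an arbitrary network name:
```bash
# Create a shared network so the app can reach Elasticsearch by name
docker network create hasher-net

# Start Elasticsearch on that network (single node, security disabled for simplicity)
docker run -d --name elasticsearch --network hasher-net \
  -e discovery.type=single-node \
  -e xpack.security.enabled=false \
  docker.elastic.co/elasticsearch/elasticsearch:8.11.0

# Start the app on the same network
docker run -d --name hasher --network hasher-net \
  -p 3000:3000 \
  -e ELASTICSEARCH_NODE=http://elasticsearch:9200 \
  hasher:latest
```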
#### Docker Compose:
Create `docker-compose.yml`:
```yaml
version: '3.8'
services:
  app:
    build: .
    ports:
      - "3000:3000"
    environment:
      - ELASTICSEARCH_NODE=http://elasticsearch:9200
    depends_on:
      - elasticsearch
    restart: unless-stopped

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.11.0
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ports:
      - "9200:9200"
    volumes:
      - elasticsearch-data:/usr/share/elasticsearch/data
    restart: unless-stopped

volumes:
  elasticsearch-data:
```
Run with:
```bash
docker-compose up -d
```
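To verify the stack is up:
```bash
# Check container status and application logs
docker-compose ps
docker-compose logs app

# The app should respond once Elasticsearch is healthy
curl http://localhost:3000
```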
---
### Option 3: Traditional VPS (Ubuntu/Debian)
Deploy to a traditional server.
#### 1. Install Node.js:
```bash
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
sudo apt-get install -y nodejs
```
#### 2. Install PM2 (Process Manager):
```bash
sudo npm install -g pm2
```
#### 3. Clone and Build:
```bash
cd /var/www
git clone <your-repo-url> hasher
cd hasher
npm install
npm run build
```
#### 4. Configure Environment:
```bash
cat > .env.local << EOF
ELASTICSEARCH_NODE=http://localhost:9200
NODE_ENV=production
EOF
```
#### 5. Start with PM2:
```bash
pm2 start npm --name "hasher" -- start
pm2 save
pm2 startup
```
#### 6. Configure Nginx (Optional):
Create `/etc/nginx/sites-available/hasher`:
```nginx
server {
    listen 80;
    server_name your-domain.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
```
Enable the site:
```bash
sudo ln -s /etc/nginx/sites-available/hasher /etc/nginx/sites-enabled/
sudo nginx -t
sudo systemctl reload nginx
```
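To add HTTPS (recommended in the prerequisites), one common approach is Let's Encrypt via Certbot; a sketch assuming the nginx setup above and a DNS record pointing at the server:
```bash
# Install Certbot with the nginx plugin and request a certificate
sudo apt-get install -y certbot python3-certbot-nginx
sudo certbot --nginx -d your-domain.com
```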
---
## Elasticsearch Setup
### Option 1: Elastic Cloud (Managed)
1. Sign up at [Elastic Cloud](https://cloud.elastic.co/)
2. Create a deployment
3. Note the endpoint URL
4. Update `ELASTICSEARCH_NODE` environment variable
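For example (hypothetical endpoint; Elastic Cloud deployments also require credentials, so the application's Elasticsearch client must be configured with them):
```bash
# Example .env.local / hosting-provider environment value (placeholder endpoint)
ELASTICSEARCH_NODE=https://my-deployment.es.us-east-1.aws.found.io:9243
```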
### Option 2: Self-Hosted
```bash
# Ubuntu/Debian
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elasticsearch-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/elasticsearch-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list
sudo apt-get update
sudo apt-get install elasticsearch
# Configure
sudo nano /etc/elasticsearch/elasticsearch.yml
# Set: network.host: 0.0.0.0
# For a single node, also set: discovery.type: single-node
# Start
sudo systemctl start elasticsearch
sudo systemctl enable elasticsearch
```
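After starting, confirm the node responds (note that the 8.x packages enable security by default; see Security Considerations below):
```bash
curl http://localhost:9200/_cluster/health?pretty
```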
---
## Security Considerations
### 1. Elasticsearch Security
- Enable authentication on Elasticsearch
- Use HTTPS for Elasticsearch connection
- Restrict network access with firewall rules
- Update credentials regularly
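A rough sketch for a package install (paths assume the Debian/RPM layout): reset the built-in `elastic` password and verify authenticated HTTPS access; the application's Elasticsearch client must then be configured with the same credentials.
```bash
# Reset the password for the built-in elastic user
sudo /usr/share/elasticsearch/bin/elasticsearch-reset-password -u elastic

# Verify authenticated access (-k skips verification of the self-signed certificate; use a trusted CA in production)
curl -k -u elastic https://localhost:9200/_cluster/health?pretty
```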
### 2. Application Security
- Use environment variables for secrets
- Enable HTTPS (SSL/TLS)
- Implement rate limiting
- Add CORS restrictions
- Monitor logs for suspicious activity
### 3. Network Security
```bash
# Example UFW firewall rules
sudo ufw allow OpenSSH   # keep SSH access open before enabling the firewall
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp
sudo ufw allow from YOUR_IP to any port 9200 # Elasticsearch
sudo ufw enable
```
---
## Monitoring
### Application Monitoring
```bash
# PM2 monitoring
pm2 monit
# View logs
pm2 logs hasher
```
### Elasticsearch Monitoring
```bash
# Health check
curl http://localhost:9200/_cluster/health?pretty
# Index stats
curl http://localhost:9200/hasher/_stats?pretty
```
---
## Backup and Recovery
### Elasticsearch Snapshots
```bash
# Configure snapshot repository
curl -X PUT "localhost:9200/_snapshot/hasher_backup" -H 'Content-Type: application/json' -d'
{
  "type": "fs",
  "settings": {
    "location": "/mnt/backups/elasticsearch"
  }
}'
# Create snapshot
curl -X PUT "localhost:9200/_snapshot/hasher_backup/snapshot_1?wait_for_completion=true"
# Restore snapshot
curl -X POST "localhost:9200/_snapshot/hasher_backup/snapshot_1/_restore"
```
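The `fs` repository type requires the backup location to be whitelisted via `path.repo` in `elasticsearch.yml` (and the directory must be writable by the `elasticsearch` user) before the repository can be registered:
```bash
# Whitelist the snapshot location, then restart Elasticsearch
echo 'path.repo: ["/mnt/backups/elasticsearch"]' | sudo tee -a /etc/elasticsearch/elasticsearch.yml
sudo systemctl restart elasticsearch
```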
---
## Scaling
### Horizontal Scaling
1. Deploy multiple Next.js instances
2. Use a load balancer (nginx, HAProxy)
3. Share the same Elasticsearch cluster
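A rough sketch for running a second instance on the same VPS behind a load balancer (assumes the PM2 setup above; `next start` honors the `PORT` environment variable):
```bash
# Start a second app instance on another port and persist it
cd /var/www/hasher
PORT=3001 pm2 start npm --name "hasher-2" -- start
pm2 save
```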
### Elasticsearch Scaling
1. Add more nodes to the cluster
2. Increase the shard count for new indices (the index is currently created with 10 shards; an existing index must be reindexed to change it)
3. Use replicas for read scaling (see the example below)
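For example, the replica count can be changed on a live index (`hasher` is the index name used in the monitoring examples above):
```bash
curl -X PUT "localhost:9200/hasher/_settings" -H 'Content-Type: application/json' -d'
{
  "index": { "number_of_replicas": 1 }
}'
```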
---
## Troubleshooting
### Check Application Status
```bash
pm2 status
pm2 logs hasher --lines 100
```
### Check Elasticsearch
```bash
curl http://localhost:9200/_cluster/health
curl http://localhost:9200/hasher/_count
```
### Common Issues
**Issue**: Cannot connect to Elasticsearch
- Check firewall rules
- Verify Elasticsearch is running
- Check `ELASTICSEARCH_NODE` environment variable
**Issue**: Out of memory
- Increase Node.js memory: `NODE_OPTIONS=--max-old-space-size=4096`
- Increase Elasticsearch heap size
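The Elasticsearch heap can be raised with a `jvm.options.d` override (paths assume the Debian/RPM package layout; keep the heap at or below roughly half of available RAM):
```bash
# Set a 2 GB heap, then restart
printf '%s\n' '-Xms2g' '-Xmx2g' | sudo tee /etc/elasticsearch/jvm.options.d/heap.options
sudo systemctl restart elasticsearch
```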
**Issue**: Slow searches
- Add more Elasticsearch nodes
- Optimize queries
- Increase replica count
---
## Performance Optimization
1. **Enable Next.js Static Optimization**
2. **Use CDN for static assets**
3. **Enable Elasticsearch caching**
4. **Configure appropriate JVM heap for Elasticsearch**
5. **Use SSD storage for Elasticsearch**
---
## Support
For deployment issues, check:
- [Next.js Deployment Docs](https://nextjs.org/docs/deployment)
- [Elasticsearch Setup Guide](https://www.elastic.co/guide/en/elasticsearch/reference/current/setup.html)
- Project GitHub Issues