hasher/DEPLOYMENT.md
2025-12-04 00:58:40 +01:00

Deployment Guide

This guide covers deploying the Hasher application to production.

Prerequisites

  • Node.js 18.x or higher
  • Elasticsearch 8.x cluster
  • Domain name (optional, for custom domain)
  • SSL certificate (recommended for production)

Deployment Options

Option 1: Vercel (Recommended)

Vercel provides seamless deployment for Next.js applications.

Steps:

  1. Install Vercel CLI:

    npm install -g vercel
    
  2. Login to Vercel:

    vercel login
    
  3. Deploy:

    vercel
    
  4. Set Environment Variables:

    • Go to your project settings on Vercel
    • Add environment variable: ELASTICSEARCH_NODE=http://your-elasticsearch-host:9200
    • Redeploy: vercel --prod

Important Notes:

  • Ensure Elasticsearch is accessible from Vercel's servers
  • Consider using Elastic Cloud or a publicly accessible Elasticsearch instance
  • Use environment variables for sensitive configuration
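A misconfigured ELASTISEARCH variable usually only surfaces after a deploy, so it can be worth sanity-checking it locally first. A minimal sketch (the check_es_node helper is hypothetical, not part of the app):

```shell
#!/bin/sh
# Hypothetical pre-deploy check: confirm ELASTICSEARCH_NODE is set
# and looks like an http(s) URL before pushing to Vercel.
check_es_node() {
  case "$1" in
    http://*|https://*) echo "ok" ;;
    "")                 echo "missing" ;;
    *)                  echo "invalid" ;;
  esac
}

check_es_node "${ELASTICSEARCH_NODE:-}"
```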

Option 2: Docker

Deploy using Docker containers.

Create Dockerfile:

# Create this file: Dockerfile
FROM node:18-alpine AS base

# Install dependencies only when needed
FROM base AS deps
RUN apk add --no-cache libc6-compat
WORKDIR /app

COPY package.json package-lock.json ./
RUN npm ci

# Rebuild the source code only when needed
FROM base AS builder
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .

ENV NEXT_TELEMETRY_DISABLED=1
RUN npm run build

# Production image, copy all the files and run next
FROM base AS runner
WORKDIR /app

ENV NODE_ENV=production
ENV NEXT_TELEMETRY_DISABLED=1

RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs

COPY --from=builder /app/public ./public
COPY --from=builder --chown=nextjs:nodejs /app/.next/standalone ./
COPY --from=builder --chown=nextjs:nodejs /app/.next/static ./.next/static

USER nextjs

EXPOSE 3000

ENV PORT=3000
ENV HOSTNAME="0.0.0.0"

CMD ["node", "server.js"]
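The COPY . . step in the builder stage pulls the entire build context into the image; a .dockerignore keeps local artifacts and secrets out of it. A minimal sketch:

```
# .dockerignore
node_modules
.next
.git
.env*.local
npm-debug.log*
Dockerfile
docker-compose.yml
```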

Update next.config.ts:

import type { NextConfig } from 'next';

const nextConfig: NextConfig = {
  output: 'standalone',
};

export default nextConfig;

Build and Run:

# Build the Docker image
docker build -t hasher:latest .

# Run the container
docker run -d \
  -p 3000:3000 \
  -e ELASTICSEARCH_NODE=http://elasticsearch:9200 \
  --name hasher \
  hasher:latest

Docker Compose:

Create docker-compose.yml:

version: '3.8'

services:
  app:
    build: .
    ports:
      - "3000:3000"
    environment:
      - ELASTICSEARCH_NODE=http://elasticsearch:9200
    depends_on:
      - elasticsearch
    restart: unless-stopped

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.11.0
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ports:
      - "9200:9200"
    volumes:
      - elasticsearch-data:/usr/share/elasticsearch/data
    restart: unless-stopped

volumes:
  elasticsearch-data:
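One caveat: depends_on as written only waits for the Elasticsearch container to start, not for the cluster to be ready, so the app can come up before Elasticsearch accepts connections. A healthcheck plus a readiness condition closes that gap (sketch; the interval and retry values are arbitrary):

```yaml
# Merge into docker-compose.yml
services:
  elasticsearch:
    healthcheck:
      test: ["CMD-SHELL", "curl -sf http://localhost:9200/_cluster/health || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 12

  app:
    depends_on:
      elasticsearch:
        condition: service_healthy
```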

Run with:

docker-compose up -d

Option 3: Traditional VPS (Ubuntu/Debian)

Deploy to a traditional server.

1. Install Node.js:

curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
sudo apt-get install -y nodejs

2. Install PM2 (Process Manager):

sudo npm install -g pm2

3. Clone and Build:

cd /var/www
git clone <your-repo-url> hasher
cd hasher
npm install
npm run build

4. Configure Environment:

cat > .env.local << EOF
ELASTICSEARCH_NODE=http://localhost:9200
NODE_ENV=production
EOF

5. Start with PM2:

pm2 start npm --name "hasher" -- start
pm2 save
pm2 startup
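Instead of wrapping npm, PM2 can also run the Next.js server from an ecosystem file, which makes instance count and environment explicit. A sketch (file name and all values are illustrative):

```javascript
// ecosystem.config.js (sketch)
module.exports = {
  apps: [
    {
      name: 'hasher',
      script: 'node_modules/next/dist/bin/next',
      args: 'start',
      cwd: '/var/www/hasher',
      instances: 2,          // or 'max' for one per CPU core
      exec_mode: 'cluster',
      env: {
        NODE_ENV: 'production',
        ELASTICSEARCH_NODE: 'http://localhost:9200',
      },
    },
  ],
};
```

Start it with pm2 start ecosystem.config.js, then pm2 save as above.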

6. Configure Nginx (Optional):

Create /etc/nginx/sites-available/hasher:

server {
    listen 80;
    server_name your-domain.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}

Enable the site:

sudo ln -s /etc/nginx/sites-available/hasher /etc/nginx/sites-enabled/
sudo nginx -t
sudo systemctl reload nginx
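The prerequisites recommend SSL. One route is Certbot's nginx plugin (sudo certbot --nginx -d your-domain.com after installing certbot and python3-certbot-nginx), which rewrites the server block automatically; done by hand, the HTTPS block looks roughly like this (certificate paths are placeholders for wherever your certificate lives):

```nginx
server {
    listen 443 ssl;
    server_name your-domain.com;

    ssl_certificate     /etc/letsencrypt/live/your-domain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/your-domain.com/privkey.pem;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
```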

Elasticsearch Setup

Option 1: Elastic Cloud (Managed)

  1. Sign up at Elastic Cloud
  2. Create a deployment
  3. Note the endpoint URL
  4. Update ELASTICSEARCH_NODE environment variable

Option 2: Self-Hosted

# Ubuntu/Debian (apt-key is deprecated; use a dedicated keyring instead)
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elasticsearch-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/elasticsearch-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list
sudo apt-get update
sudo apt-get install elasticsearch

# Configure (only needed for remote access; binding to 0.0.0.0
# exposes Elasticsearch to the network, so firewall port 9200)
sudo nano /etc/elasticsearch/elasticsearch.yml
# Set: network.host: 0.0.0.0

# Start
sudo systemctl start elasticsearch
sudo systemctl enable elasticsearch

Security Considerations

1. Elasticsearch Security

  • Enable authentication on Elasticsearch
  • Use HTTPS for Elasticsearch connection
  • Restrict network access with firewall rules
  • Update credentials regularly
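Concretely, with security enabled (the default in self-managed 8.x installs), the elasticsearch.yml side looks roughly like this, and ELASTICSEARCH_NODE then needs https plus credentials (the password comes from elasticsearch-reset-password -u elastic; all values here are placeholders):

```yaml
# /etc/elasticsearch/elasticsearch.yml (sketch; 8.x generates these on install)
xpack.security.enabled: true
xpack.security.http.ssl:
  enabled: true
  keystore.path: certs/http.p12
```

The app's connection string then takes the form https://elastic:<password>@your-es-host:9200, or the credentials can be passed separately in the client configuration.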

2. Application Security

  • Use environment variables for secrets
  • Enable HTTPS (SSL/TLS)
  • Implement rate limiting
  • Add CORS restrictions
  • Monitor logs for suspicious activity

3. Network Security

# Example UFW firewall rules
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp
sudo ufw allow from YOUR_IP to any port 9200  # Elasticsearch
sudo ufw enable

Monitoring

Application Monitoring

# PM2 monitoring
pm2 monit

# View logs
pm2 logs hasher

Elasticsearch Monitoring

# Health check
curl http://localhost:9200/_cluster/health?pretty

# Index stats
curl http://localhost:9200/hasher/_stats?pretty

Backup and Recovery

Elasticsearch Snapshots

# Configure snapshot repository
curl -X PUT "localhost:9200/_snapshot/hasher_backup" -H 'Content-Type: application/json' -d'
{
  "type": "fs",
  "settings": {
    "location": "/mnt/backups/elasticsearch"
  }
}'

# Create snapshot
curl -X PUT "localhost:9200/_snapshot/hasher_backup/snapshot_1?wait_for_completion=true"

# Restore snapshot
curl -X POST "localhost:9200/_snapshot/hasher_backup/snapshot_1/_restore"
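One prerequisite for the fs repository above: Elasticsearch refuses to register it unless the location is whitelisted via path.repo, so add that to elasticsearch.yml and restart the node first:

```yaml
# /etc/elasticsearch/elasticsearch.yml
path.repo: ["/mnt/backups/elasticsearch"]
```

Also note that a restore fails if the hasher index is open; close or delete it before running the _restore call.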

Scaling

Horizontal Scaling

  1. Deploy multiple Next.js instances
  2. Use a load balancer (nginx, HAProxy)
  3. Share the same Elasticsearch cluster
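With Nginx already in place, step 2 is a small change to the proxy config: point an upstream at the instances instead of a single port. A sketch, assuming two instances on ports 3000 and 3001:

```nginx
upstream hasher_backend {
    least_conn;
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
}

server {
    listen 80;
    server_name your-domain.com;

    location / {
        proxy_pass http://hasher_backend;
        proxy_set_header Host $host;
    }
}
```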

Elasticsearch Scaling

  1. Add more nodes to the cluster
  2. Increase shard count (already set to 10; note that changing the primary shard count of an existing index requires reindexing)
  3. Use replicas for read scaling

Troubleshooting

Check Application Status

pm2 status
pm2 logs hasher --lines 100

Check Elasticsearch

curl http://localhost:9200/_cluster/health
curl http://localhost:9200/hasher/_count

Common Issues

Issue: Cannot connect to Elasticsearch

  • Check firewall rules
  • Verify Elasticsearch is running
  • Check ELASTICSEARCH_NODE environment variable
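A quick way to narrow this down is to probe the host and port from the URL directly (sketch; the es_hostport helper is made up, and nc is assumed available for the probe):

```shell
# Strip the scheme and any path from an Elasticsearch node URL,
# leaving host:port for a TCP probe.
es_hostport() {
  printf '%s\n' "$1" | sed -E 's#^https?://##; s#/.*$##'
}

# Probe the port, e.g.:
#   nc -z $(es_hostport "$ELASTICSEARCH_NODE" | tr ':' ' ')
es_hostport "http://elasticsearch:9200/some/path"
```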

Issue: Out of memory

  • Increase Node.js memory: NODE_OPTIONS=--max-old-space-size=4096
  • Increase Elasticsearch heap size

Issue: Slow searches

  • Add more Elasticsearch nodes
  • Optimize queries
  • Increase replica count

Performance Optimization

  1. Enable Next.js Static Optimization
  2. Use CDN for static assets
  3. Enable Elasticsearch caching
  4. Configure appropriate JVM heap for Elasticsearch
  5. Use SSD storage for Elasticsearch

Support

For deployment issues, check: