# Hasher 🔐

A modern, high-performance hash search and generation tool powered by Redis and Next.js. Search for hash values to find their plaintext origins, or generate hashes from any text input.
## ✨ Features
- 🔍 Hash Lookup: Search for MD5, SHA1, SHA256, and SHA512 hashes
- 🔑 Hash Generation: Generate multiple hash types from plaintext
- 💾 Auto-Indexing: Automatically stores searched plaintext and hashes
- 📊 Redis Backend: Fast in-memory storage with persistence
- 🚀 Bulk Indexing: Import wordlists via command-line script
- 🎨 Modern UI: Beautiful, responsive interface with real-time feedback
- 📋 Copy to Clipboard: One-click copying of any hash value
## 🏗️ Architecture

```
┌─────────────┐
│   Next.js   │  ← Modern React UI
│  Frontend   │
└──────┬──────┘
       │
       ↓
┌─────────────┐
│     API     │  ← REST endpoints
│   Routes    │
└──────┬──────┘
       │
       ↓
┌─────────────┐
│    Redis    │  ← In-memory storage
│             │    with persistence
└─────────────┘
```
## 🚀 Quick Start

### Prerequisites

- Node.js 18.x or higher
- Redis 7.x or higher
- npm or yarn

### Installation

1. Clone the repository

   ```bash
   git clone <repository-url>
   cd hasher
   ```

2. Install dependencies

   ```bash
   npm install
   ```

3. Configure Redis (optional)

   By default, the app connects to `localhost:6379`. To change this:

   ```bash
   export REDIS_HOST=localhost
   export REDIS_PORT=6379
   export REDIS_PASSWORD=your_password  # Optional
   export REDIS_DB=0                    # Optional, defaults to 0
   ```

4. Start Redis

   ```bash
   redis-server
   ```

5. Run the development server

   ```bash
   npm run dev
   ```

6. Open your browser

   Navigate to http://localhost:3000
## 📖 Usage

### Web Interface

1. Search for a hash
   - Enter any MD5, SHA1, SHA256, or SHA512 hash
   - Click search or press Enter
   - View the plaintext result if found in the database

2. Generate hashes
   - Enter any plaintext string
   - Get instant hash values for all supported algorithms
   - Hashes are automatically saved for future lookups
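Hash generation for all four supported algorithms can be done with Node's built-in `crypto` module. A minimal sketch of what the app computes (`generateHashes` is an illustrative name, not necessarily the app's actual helper):

```typescript
import { createHash } from "node:crypto";

// The four algorithms the app supports.
const ALGORITHMS = ["md5", "sha1", "sha256", "sha512"] as const;
type Algorithm = (typeof ALGORITHMS)[number];

// Compute all supported digests (lowercase hex) for a plaintext input.
function generateHashes(plaintext: string): Record<Algorithm, string> {
  const out = {} as Record<Algorithm, string>;
  for (const alg of ALGORITHMS) {
    out[alg] = createHash(alg).update(plaintext, "utf8").digest("hex");
  }
  return out;
}
```

For example, `generateHashes("password").md5` yields `5f4dcc3b5aa765d61d8327deb882cf99`, matching the lookup examples below.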
### Bulk Indexing Script

Index large wordlists or dictionaries:

```bash
# Basic usage
npm run index-file wordlist.txt

# With custom batch size
npm run index-file wordlist.txt -- --batch-size 500

# Resume from last position
npm run index-file wordlist.txt -- --resume

# Show help
npm run index-file -- --help
```

Input file format: one word/phrase per line.

```
password
admin
123456
qwerty
```
Script features:
- ✅ Bulk indexing with configurable batch size
- ✅ Progress indicator with percentage
- ✅ Error handling and reporting
- ✅ Performance metrics (docs/sec)
- ✅ State persistence for resume capability
- ✅ Duplicate detection
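The `--batch-size` option controls how many words are written per Redis round trip. The batching step itself can be sketched as a simple chunking helper (`chunk` is illustrative, not the script's actual function):

```typescript
// Split a wordlist into fixed-size batches so each batch can be
// written to Redis in a single pipeline round trip.
function chunk<T>(items: T[], batchSize: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}
```

A larger batch size reduces round trips at the cost of bigger pipeline payloads; the default suits most wordlists.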
### Remove Duplicates Script

Find and remove duplicate hash entries:

```bash
# Dry run (preview only)
npm run remove-duplicates -- --dry-run --field md5

# Execute removal
npm run remove-duplicates -- --execute --field sha256

# With custom batch size
npm run remove-duplicates -- --execute --field md5 --batch-size 100
```
## 🔌 API Reference

### Search Endpoint

`POST /api/search`

Search for a hash, or generate hashes from plaintext.

Request body:

```json
{
  "query": "5f4dcc3b5aa765d61d8327deb882cf99"
}
```

Response (hash found):

```json
{
  "found": true,
  "hashType": "md5",
  "hash": "5f4dcc3b5aa765d61d8327deb882cf99",
  "results": [{
    "plaintext": "password",
    "hashes": {
      "md5": "5f4dcc3b5aa765d61d8327deb882cf99",
      "sha1": "5baa61e4c9b93f3f0682250b6cf8331b7ee68fd8",
      "sha256": "5e884898da28047151d0e56f8dc6292773603d0d6aabbdd62a11ef721d1542d8",
      "sha512": "b109f3bbbc244eb82441917ed06d618b9008dd09b3befd1b5e07394c706a8bb980b1d7785e5976ec049b46df5f1326af5a2ea6d103fd07c95385ffab0cacbc86"
    }
  }]
}
```

Response (plaintext input):

```json
{
  "found": true,
  "isPlaintext": true,
  "plaintext": "password",
  "wasGenerated": false,
  "hashes": {
    "md5": "5f4dcc3b5aa765d61d8327deb882cf99",
    "sha1": "5baa61e4c9b93f3f0682250b6cf8331b7ee68fd8",
    "sha256": "5e884898da28047151d0e56f8dc6292773603d0d6aabbdd62a11ef721d1542d8",
    "sha512": "b109f3bbbc244eb82441917ed06d618b9008dd09b3befd1b5e07394c706a8bb980b1d7785e5976ec049b46df5f1326af5a2ea6d103fd07c95385ffab0cacbc86"
  }
}
```
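From client code, the search endpoint can be called with a plain `fetch`. A minimal sketch, assuming the default dev server address; the `SearchResponse` type is illustrative (derived from the examples above), not exported by the app:

```typescript
// Partial shape of the /api/search response, per the examples above.
interface SearchResponse {
  found: boolean;
  hashType?: string;
  isPlaintext?: boolean;
  plaintext?: string;
  hashes?: Record<string, string>;
}

// POST a hash or plaintext query to the search endpoint.
async function search(
  query: string,
  baseUrl = "http://localhost:3000"
): Promise<SearchResponse> {
  const res = await fetch(`${baseUrl}/api/search`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  if (!res.ok) throw new Error(`Search failed: ${res.status}`);
  return res.json() as Promise<SearchResponse>;
}
```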
### Health Check Endpoint

`GET /api/health`

Check the Redis connection and database statistics.

Response:

```json
{
  "status": "ok",
  "redis": {
    "version": "7.2.4",
    "connected": true,
    "memoryUsed": "1.5M",
    "uptime": 3600
  },
  "database": {
    "totalKeys": 1542,
    "documentCount": 386,
    "totalSize": 524288
  }
}
```
## 🗄️ Redis Data Structure

### Key Structures

The application uses the following Redis key patterns:

- Hash documents: `hash:plaintext:{plaintext}`

  ```json
  {
    "plaintext": "password",
    "md5": "5f4dcc3b5aa765d61d8327deb882cf99",
    "sha1": "5baa61e4c9b93f3f0682250b6cf8331b7ee68fd8",
    "sha256": "5e884898da28047151d0e56f8dc6292773603d0d6aabbdd62a11ef721d1542d8",
    "sha512": "b109f3bbbc244eb82441917ed06d618b9008dd09b3befd1b5e07394c706a8bb980b1d7785e5976ec049b46df5f1326af5a2ea6d103fd07c95385ffab0cacbc86",
    "created_at": "2024-01-01T00:00:00.000Z"
  }
  ```

- Hash indexes: `hash:index:{algorithm}:{hash}`
  - Points to the plaintext value
  - One index per hash algorithm (md5, sha1, sha256, sha512)

- Statistics: `hash:stats` (Redis Hash)
  - `count`: total number of documents
  - `size`: total data size in bytes
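The key patterns above can be built with small helpers; the function names here are illustrative, not the app's actual exports (the digest lowercasing is an assumed normalization, consistent with the detection patterns below):

```typescript
// Build the Redis key that stores the full hash document for a plaintext.
function documentKey(plaintext: string): string {
  return `hash:plaintext:${plaintext}`;
}

// Build the per-algorithm index key mapping a digest back to its plaintext.
function indexKey(
  algorithm: "md5" | "sha1" | "sha256" | "sha512",
  digest: string
): string {
  return `hash:index:${algorithm}:${digest.toLowerCase()}`;
}
```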
### Data Flow

```
Plaintext → Generate Hashes → Store Document
                   ↓
    Create 4 Indexes (one per algorithm)
                   ↓
           Update Statistics
```
## 📁 Project Structure

```
hasher/
├── app/
│   ├── api/
│   │   ├── search/
│   │   │   └── route.ts         # Search endpoint
│   │   └── health/
│   │       └── route.ts         # Health check endpoint
│   ├── layout.tsx               # Root layout
│   ├── page.tsx                 # Main UI component
│   └── globals.css              # Global styles
├── lib/
│   ├── redis.ts                 # Redis client & operations
│   └── hash.ts                  # Hash utilities
├── scripts/
│   ├── index-file.ts            # Bulk indexing script
│   └── remove-duplicates.ts     # Duplicate removal script
├── package.json
├── tsconfig.json
├── next.config.ts
└── README.md
```
## 🛠️ Development

### Build for Production

```bash
npm run build
npm run start
```

### Environment Variables

Create a `.env.local` file:

```bash
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=your_password  # Optional
REDIS_DB=0                    # Optional
```

### Linting

```bash
npm run lint
```
## 🔒 Supported Hash Algorithms

| Algorithm | Length (hex) | Detection Pattern  |
|-----------|--------------|--------------------|
| MD5       | 32           | `^[a-f0-9]{32}$`   |
| SHA1      | 40           | `^[a-f0-9]{40}$`   |
| SHA256    | 64           | `^[a-f0-9]{64}$`   |
| SHA512    | 128          | `^[a-f0-9]{128}$`  |
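Detection follows directly from the table: each supported algorithm produces a hex digest of a distinct length. A sketch of that logic (`detectHashType` is an illustrative name, not necessarily the app's actual helper):

```typescript
type HashType = "md5" | "sha1" | "sha256" | "sha512";

// Detection patterns, one per algorithm, matching the table above.
const PATTERNS: Array<[HashType, RegExp]> = [
  ["md5", /^[a-f0-9]{32}$/],
  ["sha1", /^[a-f0-9]{40}$/],
  ["sha256", /^[a-f0-9]{64}$/],
  ["sha512", /^[a-f0-9]{128}$/],
];

// Return the matching algorithm, or null if the input is not a recognized digest
// (in which case the app treats it as plaintext).
function detectHashType(input: string): HashType | null {
  const candidate = input.trim().toLowerCase();
  for (const [type, pattern] of PATTERNS) {
    if (pattern.test(candidate)) return type;
  }
  return null;
}
```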
## 🚀 Performance
- Bulk Indexing: ~5000-15000 docs/sec (depending on hardware)
- Search Latency: <5ms (typical)
- Memory Efficient: In-memory storage with optional persistence
- Atomic Operations: Pipeline-based batch operations
## 🔧 Redis Configuration

For optimal performance, consider these Redis settings:

```conf
# redis.conf
maxmemory 2gb
maxmemory-policy allkeys-lru
save 900 1
save 300 10
save 60 10000
```

Note that `allkeys-lru` lets Redis evict indexed entries once `maxmemory` is reached; use `noeviction` if you need every indexed document retained.
## 🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
## 📝 License
This project is open source and available under the MIT License.
## 🙏 Acknowledgments
- Built with Next.js
- Powered by Redis
- Icons by Lucide
- Styled with Tailwind CSS
## 📧 Support
For issues, questions, or contributions, please open an issue on GitHub.
Made with ❤️ for the security and development community