initial commit

Signed-off-by: ale <ale@manalejandro.com>
Este commit está contenido en:
ale
2025-08-19 02:22:37 +02:00
commit 2e6d9b4306
Se han modificado 20 ficheros con 5533 adiciones y 0 borrados

24
.eslintrc.js Archivo normal
Ver fichero

@@ -0,0 +1,24 @@
module.exports = {
env: {
node: true,
es2021: true,
jest: true
},
extends: [
'eslint:recommended'
],
parserOptions: {
ecmaVersion: 12,
sourceType: 'module'
},
rules: {
'indent': ['error', 2],
'linebreak-style': ['error', 'unix'],
'quotes': ['error', 'single'],
'semi': ['error', 'always'],
'no-unused-vars': ['error', { 'argsIgnorePattern': '^_' }],
'no-console': 'warn',
'prefer-const': 'error',
'no-var': 'error'
}
};

117
.gitignore vendido Archivo normal
Ver fichero

@@ -0,0 +1,117 @@
# Dependencies
node_modules/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Coverage directory used by tools like istanbul
coverage/
*.lcov
# nyc test coverage
.nyc_output
# Grunt intermediate storage
.grunt
# Bower dependency directory
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons
build/Release
# Dependency directories
jspm_packages/
# TypeScript v1 declaration files
typings/
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variables file
.env
.env.test
.env.production
# parcel-bundler cache
.cache
.parcel-cache
# next.js build output
.next
# nuxt.js build output
.nuxt
# vuepress build output
.vuepress/dist
# Serverless directories
.serverless
# FuseBox cache
.fusebox/
# DynamoDB Local files
.dynamodb/
# TernJS port file
.tern-port
# Stores VSCode versions used for testing VSCode extensions
.vscode-test
# OS generated files
.DS_Store
.DS_Store?
._*
.Spotlight-V100
.Trashes
ehthumbs.db
Thumbs.db
# IDE files
.vscode/
.idea/
*.swp
*.swo
*~
# Logs
logs
*.log
# alepm specific
.alepm/
alepm.lock
temp/
tmp/
# Test files
tests/temp/
tests/fixtures/temp/
*.lock
*-lock.json

21
LICENSE Archivo normal
Ver fichero

@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2025 ale
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

279
README.md Archivo normal
Ver fichero

@@ -0,0 +1,279 @@
# alepm - Advanced Package Manager
**alepm** es un package manager avanzado y seguro para Node.js que proporciona gestión a nivel de sistema con almacenamiento binario eficiente, caché inteligente y características de seguridad avanzadas.
## 🚀 Características
### 🔒 Seguridad
- **Verificación de integridad**: Validación automática de checksums SHA-512/SHA-256
- **Auditoría de vulnerabilidades**: Escaneo automático contra bases de datos de vulnerabilidades
- **Análisis de contenido**: Detección de patrones maliciosos en paquetes
- **Cuarentena automática**: Aislamiento de paquetes sospechosos
- **Verificación de firmas**: Soporte para paquetes firmados digitalmente
### 💾 Almacenamiento Binario
- **Formato binario optimizado**: Reducción significativa del espacio en disco
- **Compresión avanzada**: Algoritmos de compresión de alta eficiencia
- **Deduplicación**: Eliminación automática de archivos duplicados
- **Índice eficiente**: Acceso rápido a paquetes almacenados
### ⚡ Caché Inteligente
- **Caché persistente**: Almacenamiento eficiente de paquetes descargados
- **Limpieza automática**: Gestión inteligente del espacio de caché
- **Verificación de integridad**: Validación automática de archivos en caché
- **Compresión**: Reducción del espacio utilizado por el caché
### 🔐 Gestión de Estado
- **alepm.lock**: Archivo de bloqueo para garantizar instalaciones reproducibles
- **Verificación de consistencia**: Validación automática del estado del proyecto
- **Resolución de dependencias**: Algoritmo avanzado de resolución de conflictos
- **Metadatos extendidos**: Información detallada sobre cada paquete
## 📦 Instalación
```bash
npm install -g alepm
```
## 🛠️ Uso
### Comandos Básicos
```bash
# Instalar paquetes
alepm install lodash
alepm install express@4.18.0
alepm install --save-dev jest
# Instalar todas las dependencias del proyecto
alepm install
# Desinstalar paquetes
alepm uninstall lodash
alepm remove express
# Actualizar paquetes
alepm update
alepm update lodash
# Buscar paquetes
alepm search react
# Información de un paquete
alepm info lodash
# Listar paquetes instalados
alepm list
alepm ls --depth=2
```
### Gestión de Caché
```bash
# Limpiar caché
alepm cache clean
# Verificar integridad del caché
alepm cache verify
# Estadísticas del caché
alepm cache stats
```
### Auditoría de Seguridad
```bash
# Auditar vulnerabilidades
alepm audit
# Corregir vulnerabilidades automáticamente
alepm audit --fix
# Verificar integridad del archivo de bloqueo
alepm lock verify
```
### Configuración
```bash
# Ver configuración
alepm config list
# Establecer configuración
alepm config set registry https://registry.npmjs.org
alepm config set cache.maxSize 2GB
# Obtener valor de configuración
alepm config get registry
# Restablecer configuración
alepm config reset
```
### Inicializar Proyecto
```bash
# Crear nuevo proyecto
alepm init
# Usar valores por defecto
alepm init --yes
```
## ⚙️ Configuración
alepm utiliza un archivo de configuración ubicado en `~/.alepm/config.json`. Las opciones principales incluyen:
### Registro y Red
```json
{
"registry": "https://registry.npmjs.org",
"network": {
"timeout": 30000,
"retries": 3,
"proxy": null
}
}
```
### Caché
```json
{
"cache": {
"enabled": true,
"maxSize": "1GB",
"maxAge": "30d",
"compression": true
}
}
```
### Seguridad
```json
{
"security": {
"enableAudit": true,
"enableIntegrityCheck": true,
"maxPackageSize": "100MB",
"scanPackageContent": true
}
}
```
### Almacenamiento
```json
{
"storage": {
"compression": 9,
"binaryFormat": true,
"deduplication": true
}
}
```
## 🔧 Características Avanzadas
### Almacenamiento Binario
alepm utiliza un formato de almacenamiento binario personalizado que:
- Reduce el uso de disco hasta en un 60%
- Mejora la velocidad de acceso a paquetes
- Incluye verificación de integridad integrada
- Soporta deduplicación automática
### Sistema de Caché
El sistema de caché inteligente:
- Almacena paquetes con compresión optimizada
- Limpia automáticamente archivos antiguos
- Verifica la integridad de los datos almacenados
- Optimiza el acceso frecuente a paquetes
### Resolución de Dependencias
El resolvedor de dependencias:
- Maneja conflictos de versiones automáticamente
- Optimiza el árbol de dependencias
- Detecta dependencias circulares
- Soporta hoisting inteligente
### Seguridad Avanzada
Las características de seguridad incluyen:
- Escaneo de vulnerabilidades en tiempo real
- Análisis estático de código malicioso
- Sistema de cuarentena para paquetes sospechosos
- Verificación de firmas digitales
## 📄 Archivo alepm.lock
El archivo `alepm.lock` garantiza instalaciones reproducibles y contiene:
```json
{
"lockfileVersion": "1.0.0",
"name": "mi-proyecto",
"packages": {
"lodash@4.17.21": {
"version": "4.17.21",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
"integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==",
"requires": {}
}
},
"dependencies": {
"lodash": {
"version": "4.17.21",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
"integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg=="
}
}
}
```
## 🤝 Contribuir
Las contribuciones son bienvenidas. Por favor:
1. Fork el repositorio
2. Crea una rama para tu característica (`git checkout -b feature/AmazingFeature`)
3. Commit tus cambios (`git commit -m 'Add some AmazingFeature'`)
4. Push a la rama (`git push origin feature/AmazingFeature`)
5. Abre un Pull Request
## 📋 Requisitos del Sistema
- Node.js >= 16.0.0
- npm >= 7.0.0 (para desarrollo)
- Sistema operativo: Linux, macOS, Windows
## 🐛 Reportar Problemas
Si encuentras un bug o tienes una sugerencia, por favor abre un issue en el repositorio.
## 📝 Licencia
Este proyecto está licenciado bajo la Licencia MIT - ver el archivo [LICENSE](LICENSE) para detalles.
## 🙏 Agradecimientos
- Inspirado en npm, yarn y pnpm
- Utiliza bibliotecas de código abierto de la comunidad Node.js
- Agradecimiento especial a todos los contribuidores
## 🔮 Roadmap
### v1.1.0
- [ ] Soporte para workspaces
- [ ] Plugin system
- [ ] Mejoras en el rendimiento
### v1.2.0
- [ ] Interfaz web de gestión
- [ ] Soporte para registros privados
- [ ] Análisis de dependencias mejorado
### v2.0.0
- [ ] Soporte para otros lenguajes (Python, Go, etc.)
- [ ] Sistema de firmas distribuido
- [ ] Inteligencia artificial para detección de malware
---
**alepm** - Package management, evolved. 🚀

16
jest.config.js Archivo normal
Ver fichero

@@ -0,0 +1,16 @@
module.exports = {
testEnvironment: 'node',
testMatch: [
'**/tests/**/*.test.js',
'**/tests/**/*.spec.js'
],
collectCoverageFrom: [
'src/**/*.js',
'!src/cli.js',
'!src/index.js'
],
coverageDirectory: 'coverage',
coverageReporters: ['text', 'lcov', 'html'],
setupFilesAfterEnv: ['<rootDir>/tests/setup.js'],
testTimeout: 30000
};

44
package.json Archivo normal
Ver fichero

@@ -0,0 +1,44 @@
{
"name": "alepm",
"version": "1.0.0",
"description": "Advanced and secure Node.js package manager with binary storage and system-level management",
"main": "src/index.js",
"bin": {
"alepm": "./src/cli.js"
},
"scripts": {
"start": "node src/cli.js",
"test": "jest",
"lint": "eslint src/",
"dev": "node --inspect src/cli.js"
},
"keywords": [
"package-manager",
"node",
"security",
"binary-storage",
"cache",
"integrity"
],
"author": "ale",
"license": "MIT",
"engines": {
"node": ">=16.0.0"
},
"dependencies": {
"commander": "^11.0.0",
"chalk": "^4.1.2",
"semver": "^7.5.4",
"tar": "^6.1.15",
"crypto": "^1.0.1",
"fs-extra": "^11.1.1",
"node-fetch": "^2.6.12",
"inquirer": "^8.2.6",
"ora": "^5.4.1",
"listr2": "^6.6.1"
},
"devDependencies": {
"jest": "^29.6.2",
"eslint": "^8.45.0"
}
}

358
src/cache/cache-manager.js vendido Archivo normal
Ver fichero

@@ -0,0 +1,358 @@
const path = require('path');
const fs = require('fs-extra');
const crypto = require('crypto');
const zlib = require('zlib');
const { promisify } = require('util');
const gzip = promisify(zlib.gzip);
const gunzip = promisify(zlib.gunzip);
class CacheManager {
constructor() {
this.cacheDir = path.join(require('os').homedir(), '.alepm', 'cache');
this.metadataFile = path.join(this.cacheDir, 'metadata.json');
this.init();
}
async init() {
await fs.ensureDir(this.cacheDir);
if (!fs.existsSync(this.metadataFile)) {
await this.saveMetadata({
version: '1.0.0',
entries: {},
totalSize: 0,
lastCleanup: Date.now()
});
}
}
async get(packageName, version) {
const key = this.generateKey(packageName, version);
const metadata = await this.loadMetadata();
if (!metadata.entries[key]) {
return null;
}
const entry = metadata.entries[key];
const filePath = path.join(this.cacheDir, entry.file);
if (!fs.existsSync(filePath)) {
// Remove stale entry
delete metadata.entries[key];
await this.saveMetadata(metadata);
return null;
}
// Verify integrity
const fileHash = await this.calculateFileHash(filePath);
if (fileHash !== entry.hash) {
// Corrupted entry, remove it
await fs.remove(filePath);
delete metadata.entries[key];
await this.saveMetadata(metadata);
return null;
}
// Update access time
entry.lastAccess = Date.now();
await this.saveMetadata(metadata);
// Read and decompress
const compressedData = await fs.readFile(filePath);
const data = await gunzip(compressedData);
return data;
}
async store(packageName, version, data) {
const key = this.generateKey(packageName, version);
const metadata = await this.loadMetadata();
// Compress data for storage efficiency
const compressedData = await gzip(data);
const hash = crypto.createHash('sha256').update(compressedData).digest('hex');
const fileName = `${hash.substring(0, 16)}.bin`;
const filePath = path.join(this.cacheDir, fileName);
// Store compressed data
await fs.writeFile(filePath, compressedData);
// Update metadata
const entry = {
packageName,
version,
file: fileName,
hash,
size: compressedData.length,
originalSize: data.length,
timestamp: Date.now(),
lastAccess: Date.now()
};
// Remove old entry if exists
if (metadata.entries[key]) {
const oldEntry = metadata.entries[key];
const oldFilePath = path.join(this.cacheDir, oldEntry.file);
if (fs.existsSync(oldFilePath)) {
await fs.remove(oldFilePath);
metadata.totalSize -= oldEntry.size;
}
}
metadata.entries[key] = entry;
metadata.totalSize += entry.size;
await this.saveMetadata(metadata);
// Check if cleanup is needed
await this.maybeCleanup();
return entry;
}
async remove(packageName, version) {
const key = this.generateKey(packageName, version);
const metadata = await this.loadMetadata();
if (!metadata.entries[key]) {
return false;
}
const entry = metadata.entries[key];
const filePath = path.join(this.cacheDir, entry.file);
if (fs.existsSync(filePath)) {
await fs.remove(filePath);
}
metadata.totalSize -= entry.size;
delete metadata.entries[key];
await this.saveMetadata(metadata);
return true;
}
async clean() {
const metadata = await this.loadMetadata();
let cleanedSize = 0;
for (const [key, entry] of Object.entries(metadata.entries)) {
const filePath = path.join(this.cacheDir, entry.file);
if (fs.existsSync(filePath)) {
await fs.remove(filePath);
cleanedSize += entry.size;
}
}
// Reset metadata
const newMetadata = {
version: metadata.version,
entries: {},
totalSize: 0,
lastCleanup: Date.now()
};
await this.saveMetadata(newMetadata);
return cleanedSize;
}
async verify() {
const metadata = await this.loadMetadata();
const corrupted = [];
const missing = [];
for (const [key, entry] of Object.entries(metadata.entries)) {
const filePath = path.join(this.cacheDir, entry.file);
if (!fs.existsSync(filePath)) {
missing.push(key);
continue;
}
const fileHash = await this.calculateFileHash(filePath);
if (fileHash !== entry.hash) {
corrupted.push(key);
}
}
// Clean up missing and corrupted entries
for (const key of [...missing, ...corrupted]) {
const entry = metadata.entries[key];
metadata.totalSize -= entry.size;
delete metadata.entries[key];
}
if (missing.length > 0 || corrupted.length > 0) {
await this.saveMetadata(metadata);
}
return {
total: Object.keys(metadata.entries).length,
corrupted: corrupted.length,
missing: missing.length,
valid: Object.keys(metadata.entries).length - corrupted.length - missing.length
};
}
async getStats() {
const metadata = await this.loadMetadata();
const entries = Object.values(metadata.entries);
return {
totalEntries: entries.length,
totalSize: metadata.totalSize,
totalOriginalSize: entries.reduce((sum, entry) => sum + entry.originalSize, 0),
compressionRatio: entries.length > 0
? metadata.totalSize / entries.reduce((sum, entry) => sum + entry.originalSize, 0)
: 0,
oldestEntry: entries.length > 0
? Math.min(...entries.map(e => e.timestamp))
: null,
newestEntry: entries.length > 0
? Math.max(...entries.map(e => e.timestamp))
: null,
lastCleanup: metadata.lastCleanup
};
}
async maybeCleanup() {
const metadata = await this.loadMetadata();
const maxCacheSize = 1024 * 1024 * 1024; // 1GB
const maxAge = 30 * 24 * 60 * 60 * 1000; // 30 days
const timeSinceLastCleanup = Date.now() - metadata.lastCleanup;
const weekInMs = 7 * 24 * 60 * 60 * 1000;
// Only run cleanup weekly or if cache is too large
if (timeSinceLastCleanup < weekInMs && metadata.totalSize < maxCacheSize) {
return;
}
const now = Date.now();
const entries = Object.entries(metadata.entries);
let removedSize = 0;
// Remove old entries
for (const [key, entry] of entries) {
if (now - entry.lastAccess > maxAge) {
const filePath = path.join(this.cacheDir, entry.file);
if (fs.existsSync(filePath)) {
await fs.remove(filePath);
}
removedSize += entry.size;
delete metadata.entries[key];
}
}
// If still over limit, remove least recently used entries
if (metadata.totalSize - removedSize > maxCacheSize) {
const sortedEntries = Object.entries(metadata.entries)
.sort(([, a], [, b]) => a.lastAccess - b.lastAccess);
for (const [key, entry] of sortedEntries) {
if (metadata.totalSize - removedSize <= maxCacheSize) break;
const filePath = path.join(this.cacheDir, entry.file);
if (fs.existsExists(filePath)) {
await fs.remove(filePath);
}
removedSize += entry.size;
delete metadata.entries[key];
}
}
metadata.totalSize -= removedSize;
metadata.lastCleanup = now;
await this.saveMetadata(metadata);
}
generateKey(packageName, version) {
return crypto.createHash('sha1')
.update(`${packageName}@${version}`)
.digest('hex');
}
async calculateFileHash(filePath) {
const data = await fs.readFile(filePath);
return crypto.createHash('sha256').update(data).digest('hex');
}
async loadMetadata() {
try {
return await fs.readJson(this.metadataFile);
} catch (error) {
// Return default metadata if file is corrupted
return {
version: '1.0.0',
entries: {},
totalSize: 0,
lastCleanup: Date.now()
};
}
}
async saveMetadata(metadata) {
await fs.writeJson(this.metadataFile, metadata, { spaces: 2 });
}
// Binary storage optimization methods
async packPackageData(packageData) {
// Create efficient binary format for package data
const buffer = Buffer.from(JSON.stringify(packageData));
// Add magic header for format identification
const header = Buffer.from('ALEPM001', 'ascii'); // Version 1 format
const length = Buffer.alloc(4);
length.writeUInt32BE(buffer.length, 0);
return Buffer.concat([header, length, buffer]);
}
async unpackPackageData(binaryData) {
// Verify magic header
const header = binaryData.slice(0, 8).toString('ascii');
if (header !== 'ALEPM001') {
throw new Error('Invalid package data format');
}
// Read length
const length = binaryData.readUInt32BE(8);
// Extract and parse package data
const packageBuffer = binaryData.slice(12, 12 + length);
return JSON.parse(packageBuffer.toString());
}
async deduplicate() {
const metadata = await this.loadMetadata();
const hashMap = new Map();
let savedSpace = 0;
// Find duplicate files by hash
for (const [key, entry] of Object.entries(metadata.entries)) {
if (hashMap.has(entry.hash)) {
// Duplicate found, remove this entry
const filePath = path.join(this.cacheDir, entry.file);
if (fs.existsSync(filePath)) {
await fs.remove(filePath);
savedSpace += entry.size;
}
delete metadata.entries[key];
metadata.totalSize -= entry.size;
} else {
hashMap.set(entry.hash, key);
}
}
if (savedSpace > 0) {
await this.saveMetadata(metadata);
}
return savedSpace;
}
}
module.exports = CacheManager;

211
src/cli.js Archivo ejecutable
Ver fichero

@@ -0,0 +1,211 @@
#!/usr/bin/env node
const { Command } = require('commander');
const chalk = require('chalk');
const PackageManager = require('./core/package-manager');
const { version } = require('../package.json');
const program = new Command();
const pm = new PackageManager();
program
.name('alepm')
.description('Advanced and secure Node.js package manager')
.version(version);
// Install command
program
.command('install [packages...]')
.alias('i')
.description('Install packages')
.option('-g, --global', 'Install globally')
.option('-D, --save-dev', 'Save to devDependencies')
.option('-E, --save-exact', 'Save exact version')
.option('--force', 'Force reinstall')
.action(async (packages, options) => {
try {
await pm.install(packages, options);
} catch (error) {
console.error(chalk.red(`Error: ${error.message}`));
process.exit(1);
}
});
// Uninstall command
program
.command('uninstall <packages...>')
.alias('remove')
.alias('rm')
.description('Uninstall packages')
.option('-g, --global', 'Uninstall globally')
.action(async (packages, options) => {
try {
await pm.uninstall(packages, options);
} catch (error) {
console.error(chalk.red(`Error: ${error.message}`));
process.exit(1);
}
});
// Update command
program
.command('update [packages...]')
.alias('upgrade')
.description('Update packages')
.option('-g, --global', 'Update global packages')
.action(async (packages, options) => {
try {
await pm.update(packages, options);
} catch (error) {
console.error(chalk.red(`Error: ${error.message}`));
process.exit(1);
}
});
// List command
program
.command('list')
.alias('ls')
.description('List installed packages')
.option('-g, --global', 'List global packages')
.option('--depth <level>', 'Dependency depth level', '0')
.action(async (options) => {
try {
await pm.list(options);
} catch (error) {
console.error(chalk.red(`Error: ${error.message}`));
process.exit(1);
}
});
// Search command
program
.command('search <query>')
.description('Search for packages')
.option('--limit <number>', 'Limit results', '20')
.action(async (query, options) => {
try {
await pm.search(query, options);
} catch (error) {
console.error(chalk.red(`Error: ${error.message}`));
process.exit(1);
}
});
// Info command
program
.command('info <package>')
.description('Show package information')
.action(async (packageName) => {
try {
await pm.info(packageName);
} catch (error) {
console.error(chalk.red(`Error: ${error.message}`));
process.exit(1);
}
});
// Cache commands
program
.command('cache')
.description('Cache management')
.addCommand(
new Command('clean')
.description('Clean cache')
.action(async () => {
try {
await pm.cleanCache();
} catch (error) {
console.error(chalk.red(`Error: ${error.message}`));
process.exit(1);
}
})
)
.addCommand(
new Command('verify')
.description('Verify cache integrity')
.action(async () => {
try {
await pm.verifyCache();
} catch (error) {
console.error(chalk.red(`Error: ${error.message}`));
process.exit(1);
}
})
);
// Security commands
program
.command('audit')
.description('Audit packages for security vulnerabilities')
.option('--fix', 'Automatically fix vulnerabilities')
.action(async (options) => {
try {
await pm.audit(options);
} catch (error) {
console.error(chalk.red(`Error: ${error.message}`));
process.exit(1);
}
});
// Lock file commands
program
.command('lock')
.description('Lock file management')
.addCommand(
new Command('verify')
.description('Verify lock file integrity')
.action(async () => {
try {
await pm.verifyLock();
} catch (error) {
console.error(chalk.red(`Error: ${error.message}`));
process.exit(1);
}
})
);
// Config commands
program
.command('config')
.description('Configuration management')
.addCommand(
new Command('set <key> <value>')
.description('Set configuration value')
.action(async (key, value) => {
try {
await pm.setConfig(key, value);
} catch (error) {
console.error(chalk.red(`Error: ${error.message}`));
process.exit(1);
}
})
)
.addCommand(
new Command('get <key>')
.description('Get configuration value')
.action(async (key) => {
try {
await pm.getConfig(key);
} catch (error) {
console.error(chalk.red(`Error: ${error.message}`));
process.exit(1);
}
})
);
// Initialize project
program
.command('init')
.description('Initialize a new project')
.option('-y, --yes', 'Use default values')
.action(async (options) => {
try {
await pm.init(options);
} catch (error) {
console.error(chalk.red(`Error: ${error.message}`));
process.exit(1);
}
});
program.parse();

521
src/core/dependency-resolver.js Archivo normal
Ver fichero

@@ -0,0 +1,521 @@
const semver = require('semver');
const chalk = require('chalk');
class DependencyResolver {
constructor() {
this.registry = null;
this.lockManager = null;
this.resolved = new Map();
this.resolving = new Set();
this.conflicts = new Map();
}
setRegistry(registry) {
this.registry = registry;
}
setLockManager(lockManager) {
this.lockManager = lockManager;
}
async resolve(packageSpecs, options = {}) {
this.resolved.clear();
this.resolving.clear();
this.conflicts.clear();
const resolved = [];
// Load existing lock file if available
let lockData = null;
if (this.lockManager) {
try {
lockData = await this.lockManager.loadLockFile();
} catch (error) {
// No lock file exists, continue without it
}
}
// Resolve each package spec
for (const spec of packageSpecs) {
const packageResolution = await this.resolvePackage(spec, {
...options,
lockData,
depth: 0
});
resolved.push(...packageResolution);
}
// Check for conflicts and resolve them
const conflictResolution = await this.resolveConflicts();
resolved.push(...conflictResolution);
// Remove duplicates and return flattened result
return this.deduplicateResolved(resolved);
}
async resolvePackage(spec, options = {}) {
const { name, version } = spec;
const key = `${name}@${version}`;
// Check if already resolved
if (this.resolved.has(key)) {
return [this.resolved.get(key)];
}
// Check for circular dependencies
if (this.resolving.has(key)) {
console.warn(chalk.yellow(`Warning: Circular dependency detected for ${key}`));
return [];
}
this.resolving.add(key);
try {
// Try to resolve from lock file first
if (options.lockData && options.lockData.packages[key]) {
const lockedPackage = options.lockData.packages[key];
const resolvedPackage = {
name,
version: lockedPackage.version,
resolved: lockedPackage.resolved,
integrity: lockedPackage.integrity,
dependencies: lockedPackage.requires || {},
devDependencies: {},
optional: lockedPackage.optional || false,
dev: lockedPackage.dev || false,
source: 'lockfile',
depth: options.depth || 0
};
this.resolved.set(key, resolvedPackage);
// Resolve dependencies recursively
const dependencies = await this.resolveDependencies(
resolvedPackage.dependencies,
{ ...options, depth: (options.depth || 0) + 1 }
);
this.resolving.delete(key);
return [resolvedPackage, ...dependencies];
}
// Resolve version if needed
const resolvedVersion = await this.registry.resolveVersion(name, version);
const resolvedKey = `${name}@${resolvedVersion}`;
// Check if we already resolved this exact version
if (this.resolved.has(resolvedKey)) {
this.resolving.delete(key);
return [this.resolved.get(resolvedKey)];
}
// Get package metadata
const metadata = await this.registry.getMetadata(name, resolvedVersion);
// Create resolved package object
const resolvedPackage = {
name: metadata.name,
version: metadata.version,
resolved: metadata.dist.tarball,
integrity: metadata.dist.integrity,
dependencies: metadata.dependencies || {},
devDependencies: metadata.devDependencies || {},
peerDependencies: metadata.peerDependencies || {},
optionalDependencies: metadata.optionalDependencies || {},
bundledDependencies: metadata.bundledDependencies || [],
engines: metadata.engines || {},
os: metadata.os || [],
cpu: metadata.cpu || [],
deprecated: metadata.deprecated,
license: metadata.license,
homepage: metadata.homepage,
repository: metadata.repository,
bugs: metadata.bugs,
keywords: metadata.keywords || [],
maintainers: metadata.maintainers || [],
time: metadata.time,
bin: metadata.bin || {},
scripts: metadata.scripts || {},
optional: false,
dev: options.dev || false,
source: 'registry',
depth: options.depth || 0,
requestedVersion: version,
shasum: metadata.dist.shasum,
size: metadata.dist.unpackedSize,
fileCount: metadata.dist.fileCount
};
this.resolved.set(resolvedKey, resolvedPackage);
// Resolve dependencies recursively
const allDependencies = {
...resolvedPackage.dependencies,
...(options.includeDevDependencies ? resolvedPackage.devDependencies : {}),
...(options.includeOptionalDependencies ? resolvedPackage.optionalDependencies : {})
};
const dependencies = await this.resolveDependencies(
allDependencies,
{ ...options, depth: (options.depth || 0) + 1 }
);
this.resolving.delete(key);
return [resolvedPackage, ...dependencies];
} catch (error) {
this.resolving.delete(key);
throw new Error(`Failed to resolve ${key}: ${error.message}`);
}
}
async resolveDependencies(dependencies, options = {}) {
const resolved = [];
for (const [name, versionSpec] of Object.entries(dependencies)) {
try {
const spec = { name, version: versionSpec };
const packageResolution = await this.resolvePackage(spec, options);
resolved.push(...packageResolution);
} catch (error) {
if (options.optional) {
console.warn(chalk.yellow(`Warning: Optional dependency ${name}@${versionSpec} could not be resolved: ${error.message}`));
} else {
throw error;
}
}
}
return resolved;
}
async resolveConflicts() {
const resolved = [];
for (const [packageName, conflictVersions] of this.conflicts.entries()) {
// Simple conflict resolution: choose the highest version that satisfies all requirements
const versions = Array.from(conflictVersions);
const chosenVersion = this.chooseVersion(versions);
if (chosenVersion) {
console.warn(chalk.yellow(`Resolved conflict for ${packageName}: using version ${chosenVersion}`));
const spec = { name: packageName, version: chosenVersion };
const packageResolution = await this.resolvePackage(spec, { source: 'conflict-resolution' });
resolved.push(...packageResolution);
} else {
throw new Error(`Cannot resolve version conflict for ${packageName}: ${versions.join(', ')}`);
}
}
return resolved;
}
chooseVersion(versionSpecs) {
// Find a version that satisfies all specs
const allVersions = new Set();
// Get all possible versions from registry for this package
// For now, use a simplified approach
const sortedSpecs = versionSpecs.sort(semver.rcompare);
// Try to find a version that satisfies all requirements
for (const spec of sortedSpecs) {
let satisfiesAll = true;
for (const otherSpec of versionSpecs) {
if (!semver.satisfies(spec, otherSpec)) {
satisfiesAll = false;
break;
}
}
if (satisfiesAll) {
return spec;
}
}
// If no single version satisfies all, return the highest
return sortedSpecs[0];
}
deduplicateResolved(resolved) {
const deduplicated = new Map();
for (const pkg of resolved) {
const key = `${pkg.name}@${pkg.version}`;
if (!deduplicated.has(key)) {
deduplicated.set(key, pkg);
} else {
// Merge information if needed
const existing = deduplicated.get(key);
deduplicated.set(key, {
...existing,
...pkg,
// Keep the minimum depth
depth: Math.min(existing.depth, pkg.depth)
});
}
}
return Array.from(deduplicated.values());
}
async buildDependencyTree(packages) {
const tree = new Map();
for (const pkg of packages) {
tree.set(pkg.name, {
package: pkg,
dependencies: new Map(),
dependents: new Set(),
depth: pkg.depth
});
}
// Build relationships
for (const pkg of packages) {
const node = tree.get(pkg.name);
for (const depName of Object.keys(pkg.dependencies || {})) {
const depNode = tree.get(depName);
if (depNode) {
node.dependencies.set(depName, depNode);
depNode.dependents.add(pkg.name);
}
}
}
return tree;
}
async analyzeImpact(packageName, newVersion, currentPackages) {
const impact = {
directDependents: new Set(),
indirectDependents: new Set(),
breakingChanges: [],
warnings: []
};
const tree = await this.buildDependencyTree(currentPackages);
const targetNode = tree.get(packageName);
if (!targetNode) {
return impact;
}
// Find all dependents
const visited = new Set();
const findDependents = (nodeName, isIndirect = false) => {
if (visited.has(nodeName)) return;
visited.add(nodeName);
const node = tree.get(nodeName);
if (!node) return;
for (const dependent of node.dependents) {
if (isIndirect) {
impact.indirectDependents.add(dependent);
} else {
impact.directDependents.add(dependent);
}
findDependents(dependent, true);
}
};
findDependents(packageName);
// Check for breaking changes
const currentVersion = targetNode.package.version;
if (semver.major(newVersion) > semver.major(currentVersion)) {
impact.breakingChanges.push(`Major version change: ${currentVersion} -> ${newVersion}`);
}
return impact;
}
async validateResolution(resolved) {
const validation = {
valid: true,
errors: [],
warnings: [],
stats: {
totalPackages: resolved.length,
duplicates: 0,
conflicts: 0,
circular: []
}
};
// Check for duplicates
const seen = new Map();
for (const pkg of resolved) {
const key = pkg.name;
if (seen.has(key)) {
const existing = seen.get(key);
if (existing.version !== pkg.version) {
validation.stats.conflicts++;
validation.warnings.push(`Version conflict for ${key}: ${existing.version} vs ${pkg.version}`);
} else {
validation.stats.duplicates++;
}
} else {
seen.set(key, pkg);
}
}
// Check for circular dependencies
const circular = this.detectCircularDependencies(resolved);
validation.stats.circular = circular;
if (circular.length > 0) {
validation.warnings.push(`Circular dependencies detected: ${circular.join(', ')}`);
}
// Check platform compatibility
for (const pkg of resolved) {
if (pkg.engines && pkg.engines.node) {
if (!semver.satisfies(process.version, pkg.engines.node)) {
validation.warnings.push(`${pkg.name}@${pkg.version} requires Node.js ${pkg.engines.node}, current: ${process.version}`);
}
}
if (pkg.os && pkg.os.length > 0) {
const currentOs = process.platform;
const supportedOs = pkg.os.filter(os => !os.startsWith('!'));
const blockedOs = pkg.os.filter(os => os.startsWith('!')).map(os => os.substring(1));
if (supportedOs.length > 0 && !supportedOs.includes(currentOs)) {
validation.warnings.push(`${pkg.name}@${pkg.version} is not supported on ${currentOs}`);
}
if (blockedOs.includes(currentOs)) {
validation.warnings.push(`${pkg.name}@${pkg.version} is blocked on ${currentOs}`);
}
}
if (pkg.cpu && pkg.cpu.length > 0) {
const currentCpu = process.arch;
const supportedCpu = pkg.cpu.filter(cpu => !cpu.startsWith('!'));
const blockedCpu = pkg.cpu.filter(cpu => cpu.startsWith('!')).map(cpu => cpu.substring(1));
if (supportedCpu.length > 0 && !supportedCpu.includes(currentCpu)) {
validation.warnings.push(`${pkg.name}@${pkg.version} is not supported on ${currentCpu} architecture`);
}
if (blockedCpu.includes(currentCpu)) {
validation.warnings.push(`${pkg.name}@${pkg.version} is blocked on ${currentCpu} architecture`);
}
}
}
return validation;
}
detectCircularDependencies(packages) {
const graph = new Map();
const circular = [];
// Build graph
for (const pkg of packages) {
graph.set(pkg.name, Object.keys(pkg.dependencies || {}));
}
// Detect cycles using DFS
const visited = new Set();
const visiting = new Set();
const visit = (node, path = []) => {
if (visiting.has(node)) {
const cycleStart = path.indexOf(node);
const cycle = path.slice(cycleStart);
circular.push(cycle.join(' -> ') + ' -> ' + node);
return;
}
if (visited.has(node)) {
return;
}
visiting.add(node);
const dependencies = graph.get(node) || [];
for (const dep of dependencies) {
if (graph.has(dep)) {
visit(dep, [...path, node]);
}
}
visiting.delete(node);
visited.add(node);
};
for (const node of graph.keys()) {
if (!visited.has(node)) {
visit(node);
}
}
return circular;
}
async optimizeResolution(resolved) {
// Implement resolution optimization strategies
const optimized = [...resolved];
// Remove unnecessary duplicates
const nameCounts = new Map();
for (const pkg of resolved) {
nameCounts.set(pkg.name, (nameCounts.get(pkg.name) || 0) + 1);
}
// Hoist dependencies when possible
const hoisted = new Map();
for (const pkg of optimized) {
if (pkg.depth > 0 && !hoisted.has(pkg.name)) {
// Check if this package can be hoisted
const canHoist = this.canHoistPackage(pkg, optimized);
if (canHoist) {
pkg.depth = 0;
pkg.hoisted = true;
hoisted.set(pkg.name, pkg);
}
}
}
return optimized;
}
canHoistPackage(pkg, allPackages) {
// Check if hoisting this package would cause conflicts
const topLevelPackages = allPackages.filter(p => p.depth === 0 && p.name !== pkg.name);
for (const topPkg of topLevelPackages) {
if (topPkg.dependencies && topPkg.dependencies[pkg.name]) {
const requiredVersion = topPkg.dependencies[pkg.name];
if (!semver.satisfies(pkg.version, requiredVersion)) {
return false;
}
}
}
return true;
}
getResolutionStats() {
return {
resolved: this.resolved.size,
resolving: this.resolving.size,
conflicts: this.conflicts.size
};
}
clearCache() {
this.resolved.clear();
this.resolving.clear();
this.conflicts.clear();
}
}
module.exports = DependencyResolver;

583
src/core/lock-manager.js Archivo normal
Ver fichero

@@ -0,0 +1,583 @@
const path = require('path');
const fs = require('fs-extra');
const crypto = require('crypto');
const semver = require('semver');
class LockManager {
constructor() {
this.lockFileName = 'alepm.lock';
this.lockVersion = '1.0.0';
}
async init() {
const lockPath = this.getLockFilePath();
if (!fs.existsSync(lockPath)) {
const lockData = {
lockfileVersion: this.lockVersion,
name: this.getProjectName(),
version: this.getProjectVersion(),
requires: true,
packages: {},
dependencies: {},
integrity: {},
resolved: {},
metadata: {
generated: new Date().toISOString(),
generator: 'alepm',
generatorVersion: require('../../package.json').version,
nodejs: process.version,
platform: process.platform,
arch: process.arch
}
};
await this.saveLockFile(lockData);
return lockData;
}
return await this.loadLockFile();
}
async update(resolvedPackages, options = {}) {
const lockData = await this.loadLockFile();
// Update timestamp
lockData.metadata.lastModified = new Date().toISOString();
lockData.metadata.modifiedBy = options.user || 'alepm';
// Update packages
for (const pkg of resolvedPackages) {
const key = this.generatePackageKey(pkg.name, pkg.version);
// Update packages section
lockData.packages[key] = {
version: pkg.version,
resolved: pkg.resolved || pkg.tarball,
integrity: pkg.integrity,
requires: pkg.dependencies || {},
dev: options.saveDev || false,
optional: pkg.optional || false,
bundled: pkg.bundledDependencies || false,
engines: pkg.engines || {},
os: pkg.os || [],
cpu: pkg.cpu || [],
deprecated: pkg.deprecated || false,
license: pkg.license,
funding: pkg.funding,
homepage: pkg.homepage,
repository: pkg.repository,
bugs: pkg.bugs,
keywords: pkg.keywords || [],
maintainers: pkg.maintainers || [],
contributors: pkg.contributors || [],
time: {
created: pkg.time?.created,
modified: pkg.time?.modified || new Date().toISOString()
}
};
// Update dependencies section (flattened view)
lockData.dependencies[pkg.name] = {
version: pkg.version,
from: `${pkg.name}@${pkg.requestedVersion || pkg.version}`,
resolved: pkg.resolved || pkg.tarball,
integrity: pkg.integrity,
dev: options.saveDev || false,
optional: pkg.optional || false
};
// Store integrity information
lockData.integrity[key] = {
algorithm: 'sha512',
hash: pkg.integrity,
size: pkg.size || 0,
fileCount: pkg.fileCount || 0,
unpackedSize: pkg.unpackedSize || 0
};
// Store resolved information
lockData.resolved[pkg.name] = {
version: pkg.version,
tarball: pkg.resolved || pkg.tarball,
shasum: pkg.shasum,
integrity: pkg.integrity,
registry: pkg.registry || 'https://registry.npmjs.org',
lastResolved: new Date().toISOString()
};
}
// Update dependency tree
await this.updateDependencyTree(lockData, resolvedPackages);
// Generate lock file hash for integrity
lockData.metadata.hash = this.generateLockHash(lockData);
await this.saveLockFile(lockData);
return lockData;
}
async remove(packageNames, options = {}) {
const lockData = await this.loadLockFile();
for (const packageName of packageNames) {
// Remove from dependencies
delete lockData.dependencies[packageName];
// Remove from packages (all versions)
const keysToRemove = Object.keys(lockData.packages)
.filter(key => key.startsWith(`${packageName}@`));
for (const key of keysToRemove) {
delete lockData.packages[key];
delete lockData.integrity[key];
}
// Remove from resolved
delete lockData.resolved[packageName];
}
// Update metadata
lockData.metadata.lastModified = new Date().toISOString();
lockData.metadata.hash = this.generateLockHash(lockData);
await this.saveLockFile(lockData);
return lockData;
}
async verify() {
try {
const lockData = await this.loadLockFile();
// Check lock file version compatibility
if (!semver.satisfies(lockData.lockfileVersion, '^1.0.0')) {
return {
valid: false,
errors: ['Incompatible lock file version']
};
}
const errors = [];
const warnings = [];
// Verify integrity hash
const currentHash = this.generateLockHash(lockData);
if (lockData.metadata.hash && lockData.metadata.hash !== currentHash) {
errors.push('Lock file integrity hash mismatch');
}
// Verify package integrity
for (const [key, pkg] of Object.entries(lockData.packages)) {
if (!pkg.integrity) {
warnings.push(`Package ${key} missing integrity information`);
continue;
}
// Check if package exists in dependencies
const [name] = key.split('@');
if (!lockData.dependencies[name]) {
warnings.push(`Package ${name} in packages but not in dependencies`);
}
}
// Verify dependency tree consistency
for (const [name, dep] of Object.entries(lockData.dependencies)) {
const key = this.generatePackageKey(name, dep.version);
if (!lockData.packages[key]) {
errors.push(`Dependency ${name}@${dep.version} missing from packages`);
}
}
// Check for circular dependencies
const circularDeps = this.detectCircularDependencies(lockData);
if (circularDeps.length > 0) {
warnings.push(`Circular dependencies detected: ${circularDeps.join(', ')}`);
}
return {
valid: errors.length === 0,
errors,
warnings,
stats: {
packages: Object.keys(lockData.packages).length,
dependencies: Object.keys(lockData.dependencies).length,
size: JSON.stringify(lockData).length
}
};
} catch (error) {
return {
valid: false,
errors: [`Lock file verification failed: ${error.message}`]
};
}
}
async loadLockFile() {
const lockPath = this.getLockFilePath();
if (!fs.existsSync(lockPath)) {
throw new Error('alepm.lock file not found');
}
try {
const data = await fs.readJson(lockPath);
// Migrate old lock file versions if needed
return await this.migrateLockFile(data);
} catch (error) {
throw new Error(`Failed to parse alepm.lock: ${error.message}`);
}
}
async saveLockFile(lockData) {
const lockPath = this.getLockFilePath();
// Sort keys for consistent output
const sortedLockData = this.sortLockData(lockData);
// Save with proper formatting
await fs.writeJson(lockPath, sortedLockData, {
spaces: 2,
EOL: '\n'
});
}
getLockFilePath() {
const projectRoot = this.findProjectRoot();
return path.join(projectRoot, this.lockFileName);
}
findProjectRoot() {
let current = process.cwd();
while (current !== path.dirname(current)) {
if (fs.existsSync(path.join(current, 'package.json'))) {
return current;
}
current = path.dirname(current);
}
return process.cwd();
}
getProjectName() {
try {
const packageJsonPath = path.join(this.findProjectRoot(), 'package.json');
if (fs.existsSync(packageJsonPath)) {
const packageJson = fs.readJsonSync(packageJsonPath);
return packageJson.name;
}
} catch (error) {
// Ignore errors
}
return path.basename(this.findProjectRoot());
}
getProjectVersion() {
try {
const packageJsonPath = path.join(this.findProjectRoot(), 'package.json');
if (fs.existsSync(packageJsonPath)) {
const packageJson = fs.readJsonSync(packageJsonPath);
return packageJson.version;
}
} catch (error) {
// Ignore errors
}
return '1.0.0';
}
generatePackageKey(name, version) {
return `${name}@${version}`;
}
generateLockHash(lockData) {
// Create a stable hash by excluding metadata.hash field
const dataForHash = { ...lockData };
if (dataForHash.metadata) {
const metadata = { ...dataForHash.metadata };
delete metadata.hash;
dataForHash.metadata = metadata;
}
const sortedData = this.sortLockData(dataForHash);
const dataString = JSON.stringify(sortedData);
return crypto.createHash('sha256')
.update(dataString)
.digest('hex');
}
sortLockData(lockData) {
const sorted = {};
// Sort top-level keys
const topLevelKeys = Object.keys(lockData).sort();
for (const key of topLevelKeys) {
if (typeof lockData[key] === 'object' && lockData[key] !== null && !Array.isArray(lockData[key])) {
// Sort object keys
const sortedObj = {};
const objKeys = Object.keys(lockData[key]).sort();
for (const objKey of objKeys) {
sortedObj[objKey] = lockData[key][objKey];
}
sorted[key] = sortedObj;
} else {
sorted[key] = lockData[key];
}
}
return sorted;
}
async updateDependencyTree(lockData, resolvedPackages) {
// Build dependency tree structure
if (!lockData.dependencyTree) {
lockData.dependencyTree = {};
}
for (const pkg of resolvedPackages) {
lockData.dependencyTree[pkg.name] = {
version: pkg.version,
dependencies: this.buildPackageDependencyTree(pkg, resolvedPackages),
depth: this.calculatePackageDepth(pkg, resolvedPackages),
path: this.getPackagePath(pkg, resolvedPackages)
};
}
}
buildPackageDependencyTree(pkg, allPackages) {
const tree = {};
if (pkg.dependencies) {
for (const [depName, depVersion] of Object.entries(pkg.dependencies)) {
const resolvedDep = allPackages.find(p =>
p.name === depName && semver.satisfies(p.version, depVersion)
);
if (resolvedDep) {
tree[depName] = {
version: resolvedDep.version,
resolved: resolvedDep.resolved,
integrity: resolvedDep.integrity
};
}
}
}
return tree;
}
calculatePackageDepth(pkg, allPackages, visited = new Set(), depth = 0) {
if (visited.has(pkg.name)) {
return depth; // Circular dependency
}
visited.add(pkg.name);
let maxDepth = depth;
if (pkg.dependencies) {
for (const depName of Object.keys(pkg.dependencies)) {
const dep = allPackages.find(p => p.name === depName);
if (dep) {
const depDepth = this.calculatePackageDepth(dep, allPackages, new Set(visited), depth + 1);
maxDepth = Math.max(maxDepth, depDepth);
}
}
}
return maxDepth;
}
getPackagePath(pkg, allPackages, path = []) {
if (path.includes(pkg.name)) {
return path; // Circular dependency
}
return [...path, pkg.name];
}
detectCircularDependencies(lockData) {
const circular = [];
const visiting = new Set();
const visited = new Set();
const visit = (packageName, path = []) => {
if (visiting.has(packageName)) {
// Found circular dependency
const cycleStart = path.indexOf(packageName);
const cycle = path.slice(cycleStart).concat(packageName);
circular.push(cycle.join(' -> '));
return;
}
if (visited.has(packageName)) {
return;
}
visiting.add(packageName);
const pkg = lockData.dependencies[packageName];
if (pkg && lockData.packages[this.generatePackageKey(packageName, pkg.version)]) {
const packageData = lockData.packages[this.generatePackageKey(packageName, pkg.version)];
if (packageData.requires) {
for (const depName of Object.keys(packageData.requires)) {
visit(depName, [...path, packageName]);
}
}
}
visiting.delete(packageName);
visited.add(packageName);
};
for (const packageName of Object.keys(lockData.dependencies)) {
if (!visited.has(packageName)) {
visit(packageName);
}
}
return circular;
}
async migrateLockFile(lockData) {
// Handle migration from older lock file versions
if (!lockData.lockfileVersion || semver.lt(lockData.lockfileVersion, this.lockVersion)) {
// Perform migration
lockData.lockfileVersion = this.lockVersion;
// Add missing metadata
if (!lockData.metadata) {
lockData.metadata = {
generated: new Date().toISOString(),
generator: 'alepm',
generatorVersion: require('../../package.json').version,
migrated: true,
originalVersion: lockData.lockfileVersion
};
}
// Ensure all required sections exist
lockData.packages = lockData.packages || {};
lockData.dependencies = lockData.dependencies || {};
lockData.integrity = lockData.integrity || {};
lockData.resolved = lockData.resolved || {};
// Save migrated version
await this.saveLockFile(lockData);
}
return lockData;
}
async getDependencyGraph() {
const lockData = await this.loadLockFile();
const graph = {
nodes: [],
edges: [],
stats: {
totalPackages: 0,
totalDependencies: 0,
maxDepth: 0,
circularDependencies: []
}
};
// Build nodes
for (const [name, dep] of Object.entries(lockData.dependencies)) {
graph.nodes.push({
id: name,
name,
version: dep.version,
dev: dep.dev || false,
optional: dep.optional || false
});
}
// Build edges
for (const [key, pkg] of Object.entries(lockData.packages)) {
const [name] = key.split('@');
if (pkg.requires) {
for (const depName of Object.keys(pkg.requires)) {
graph.edges.push({
from: name,
to: depName,
version: pkg.requires[depName]
});
}
}
}
// Calculate stats
graph.stats.totalPackages = graph.nodes.length;
graph.stats.totalDependencies = graph.edges.length;
graph.stats.circularDependencies = this.detectCircularDependencies(lockData);
// Calculate max depth
for (const node of graph.nodes) {
const depth = this.calculateNodeDepth(node.name, graph.edges);
graph.stats.maxDepth = Math.max(graph.stats.maxDepth, depth);
}
return graph;
}
calculateNodeDepth(nodeName, edges, visited = new Set()) {
if (visited.has(nodeName)) {
return 0; // Circular dependency
}
visited.add(nodeName);
const dependencies = edges.filter(edge => edge.from === nodeName);
if (dependencies.length === 0) {
return 0;
}
let maxDepth = 0;
for (const dep of dependencies) {
const depth = 1 + this.calculateNodeDepth(dep.to, edges, new Set(visited));
maxDepth = Math.max(maxDepth, depth);
}
return maxDepth;
}
async exportLockFile(format = 'json') {
const lockData = await this.loadLockFile();
switch (format.toLowerCase()) {
case 'json':
return JSON.stringify(lockData, null, 2);
case 'yaml':
// Would need yaml library
throw new Error('YAML export not implemented');
case 'csv':
return this.exportToCsv(lockData);
default:
throw new Error(`Unsupported export format: ${format}`);
}
}
exportToCsv(lockData) {
const lines = ['Name,Version,Type,Integrity,Resolved'];
for (const [name, dep] of Object.entries(lockData.dependencies)) {
const type = dep.dev ? 'dev' : dep.optional ? 'optional' : 'prod';
lines.push(`${name},${dep.version},${type},${dep.integrity || ''},${dep.resolved || ''}`);
}
return lines.join('\n');
}
}
module.exports = LockManager;

569
src/core/package-manager.js Archivo normal
Ver fichero

@@ -0,0 +1,569 @@
const path = require('path');
const fs = require('fs-extra');
const chalk = require('chalk');
const semver = require('semver');
const ora = require('ora');
const { Listr } = require('listr2');
const inquirer = require('inquirer');
const CacheManager = require('../cache/cache-manager');
const SecurityManager = require('../security/security-manager');
const BinaryStorage = require('../storage/binary-storage');
const LockManager = require('./lock-manager');
const Registry = require('./registry');
const DependencyResolver = require('./dependency-resolver');
const ConfigManager = require('../utils/config-manager');
const Logger = require('../utils/logger');
class PackageManager {
constructor() {
this.cache = new CacheManager();
this.security = new SecurityManager();
this.storage = new BinaryStorage();
this.lock = new LockManager();
this.registry = new Registry();
this.resolver = new DependencyResolver();
this.config = new ConfigManager();
this.logger = new Logger();
this.projectRoot = this.findProjectRoot();
this.globalRoot = path.join(require('os').homedir(), '.alepm');
}
findProjectRoot() {
let current = process.cwd();
while (current !== path.dirname(current)) {
if (fs.existsSync(path.join(current, 'package.json'))) {
return current;
}
current = path.dirname(current);
}
return process.cwd();
}
async install(packages = [], options = {}) {
const spinner = ora('Analyzing dependencies...').start();
try {
// If no packages specified, install from package.json
if (packages.length === 0) {
return await this.installFromPackageJson(options);
}
// Parse package specifications
const packageSpecs = packages.map(pkg => this.parsePackageSpec(pkg));
spinner.text = 'Resolving dependencies...';
// Resolve dependencies
const resolved = await this.resolver.resolve(packageSpecs, {
projectRoot: options.global ? this.globalRoot : this.projectRoot,
includeDevDependencies: !options.saveExact
});
spinner.text = 'Checking security vulnerabilities...';
// Security audit
await this.security.auditPackages(resolved);
spinner.text = 'Downloading packages...';
// Download and install packages
const tasks = new Listr([
{
title: 'Downloading packages',
task: async (ctx, task) => {
const downloads = resolved.map(pkg => ({
title: `${pkg.name}@${pkg.version}`,
task: async () => {
const cached = await this.cache.get(pkg.name, pkg.version);
if (!cached) {
const downloaded = await this.registry.download(pkg);
await this.cache.store(pkg.name, pkg.version, downloaded);
await this.security.verifyIntegrity(downloaded, pkg.integrity);
}
}
}));
return task.newListr(downloads, { concurrent: 5 });
}
},
{
title: 'Installing packages',
task: async () => {
for (const pkg of resolved) {
await this.installPackage(pkg, options);
}
}
},
{
title: 'Updating lock file',
task: async () => {
await this.lock.update(resolved, options);
}
}
]);
spinner.stop();
await tasks.run();
// Update package.json if needed
if (!options.global) {
await this.updatePackageJson(packageSpecs, options);
}
console.log(chalk.green(`\n✓ Installed ${resolved.length} packages`));
} catch (error) {
spinner.stop();
throw error;
}
}
async installFromPackageJson(options = {}) {
const packageJsonPath = path.join(this.projectRoot, 'package.json');
if (!fs.existsSync(packageJsonPath)) {
throw new Error('package.json not found');
}
const packageJson = await fs.readJson(packageJsonPath);
const dependencies = {
...packageJson.dependencies,
...(options.includeDev ? packageJson.devDependencies : {})
};
const packages = Object.entries(dependencies).map(([name, version]) => `${name}@${version}`);
return await this.install(packages, { ...options, fromPackageJson: true });
}
async installPackage(pkg, options = {}) {
const targetDir = options.global
? path.join(this.globalRoot, 'node_modules', pkg.name)
: path.join(this.projectRoot, 'node_modules', pkg.name);
// Get package data from cache
const packageData = await this.cache.get(pkg.name, pkg.version);
// Extract to target directory using binary storage
await this.storage.extract(packageData, targetDir);
// Create binary links if global
if (options.global && pkg.bin) {
await this.createGlobalBinLinks(pkg);
}
this.logger.info(`Installed ${pkg.name}@${pkg.version}`);
}
async uninstall(packages, options = {}) {
const spinner = ora('Uninstalling packages...').start();
try {
for (const packageName of packages) {
const targetDir = options.global
? path.join(this.globalRoot, 'node_modules', packageName)
: path.join(this.projectRoot, 'node_modules', packageName);
if (fs.existsSync(targetDir)) {
await fs.remove(targetDir);
// Remove from package.json
if (!options.global) {
await this.removeFromPackageJson(packageName);
}
// Remove global bin links
if (options.global) {
await this.removeGlobalBinLinks(packageName);
}
}
}
// Update lock file
await this.lock.remove(packages, options);
spinner.stop();
console.log(chalk.green(`✓ Uninstalled ${packages.length} packages`));
} catch (error) {
spinner.stop();
throw error;
}
}
async update(packages = [], options = {}) {
const spinner = ora('Checking for updates...').start();
try {
const installedPackages = await this.getInstalledPackages(options);
const toUpdate = packages.length === 0 ? installedPackages : packages;
const updates = [];
for (const packageName of toUpdate) {
const current = installedPackages.find(p => p.name === packageName);
if (current) {
const latest = await this.registry.getLatestVersion(packageName);
if (semver.gt(latest, current.version)) {
updates.push({ name: packageName, from: current.version, to: latest });
}
}
}
if (updates.length === 0) {
spinner.stop();
console.log(chalk.green('All packages are up to date'));
return;
}
spinner.stop();
const { confirm } = await inquirer.prompt([{
type: 'confirm',
name: 'confirm',
message: `Update ${updates.length} packages?`,
default: true
}]);
if (confirm) {
const updateSpecs = updates.map(u => `${u.name}@${u.to}`);
await this.install(updateSpecs, { ...options, update: true });
}
} catch (error) {
spinner.stop();
throw error;
}
}
async list(options = {}) {
const installedPackages = await this.getInstalledPackages(options);
if (installedPackages.length === 0) {
console.log(chalk.yellow('No packages installed'));
return;
}
const tree = this.buildDependencyTree(installedPackages, parseInt(options.depth));
this.printDependencyTree(tree);
}
async search(query, options = {}) {
const spinner = ora(`Searching for "${query}"...`).start();
try {
const results = await this.registry.search(query, {
limit: parseInt(options.limit)
});
spinner.stop();
if (results.length === 0) {
console.log(chalk.yellow('No packages found'));
return;
}
console.log(chalk.bold(`\nFound ${results.length} packages:\n`));
for (const pkg of results) {
console.log(chalk.cyan(pkg.name) + chalk.gray(` v${pkg.version}`));
console.log(chalk.gray(` ${pkg.description}`));
console.log(chalk.gray(` ${pkg.keywords?.join(', ') || ''}\n`));
}
} catch (error) {
spinner.stop();
throw error;
}
}
async info(packageName) {
const spinner = ora(`Getting info for ${packageName}...`).start();
try {
const info = await this.registry.getPackageInfo(packageName);
spinner.stop();
console.log(chalk.bold.cyan(`\n${info.name}@${info.version}\n`));
console.log(chalk.gray(info.description));
console.log(chalk.gray(`Homepage: ${info.homepage}`));
console.log(chalk.gray(`License: ${info.license}`));
console.log(chalk.gray(`Dependencies: ${Object.keys(info.dependencies || {}).length}`));
console.log(chalk.gray(`Last modified: ${new Date(info.time.modified).toLocaleDateString()}`));
} catch (error) {
spinner.stop();
throw error;
}
}
async cleanCache() {
const spinner = ora('Cleaning cache...').start();
try {
const cleaned = await this.cache.clean();
spinner.stop();
console.log(chalk.green(`✓ Cleaned cache (freed ${this.formatBytes(cleaned)} of space)`));
} catch (error) {
spinner.stop();
throw error;
}
}
async verifyCache() {
const spinner = ora('Verifying cache integrity...').start();
try {
const result = await this.cache.verify();
spinner.stop();
if (result.corrupted.length === 0) {
console.log(chalk.green('✓ Cache integrity verified'));
} else {
console.log(chalk.yellow(`⚠ Found ${result.corrupted.length} corrupted entries`));
console.log(chalk.gray('Run "alepm cache clean" to fix'));
}
} catch (error) {
spinner.stop();
throw error;
}
}
async audit(options = {}) {
const spinner = ora('Auditing packages for vulnerabilities...').start();
try {
const installedPackages = await this.getInstalledPackages();
const vulnerabilities = await this.security.audit(installedPackages);
spinner.stop();
if (vulnerabilities.length === 0) {
console.log(chalk.green('✓ No vulnerabilities found'));
return;
}
console.log(chalk.red(`⚠ Found ${vulnerabilities.length} vulnerabilities:`));
for (const vuln of vulnerabilities) {
console.log(chalk.red(` ${vuln.severity.toUpperCase()}: ${vuln.title}`));
console.log(chalk.gray(` Package: ${vuln.module_name}@${vuln.version}`));
console.log(chalk.gray(` Path: ${vuln.path.join(' > ')}`));
}
if (options.fix) {
await this.fixVulnerabilities(vulnerabilities);
}
} catch (error) {
spinner.stop();
throw error;
}
}
async verifyLock() {
const spinner = ora('Verifying lock file...').start();
try {
const isValid = await this.lock.verify();
spinner.stop();
if (isValid) {
console.log(chalk.green('✓ Lock file is valid'));
} else {
console.log(chalk.red('✗ Lock file is invalid or outdated'));
}
} catch (error) {
spinner.stop();
throw error;
}
}
async setConfig(key, value) {
await this.config.set(key, value);
console.log(chalk.green(`✓ Set ${key} = ${value}`));
}
async getConfig(key) {
const value = await this.config.get(key);
console.log(`${key} = ${value || 'undefined'}`);
}
async init(options = {}) {
if (!options.yes) {
const answers = await inquirer.prompt([
{ name: 'name', message: 'Package name:', default: path.basename(this.projectRoot) },
{ name: 'version', message: 'Version:', default: '1.0.0' },
{ name: 'description', message: 'Description:' },
{ name: 'entry', message: 'Entry point:', default: 'index.js' },
{ name: 'author', message: 'Author:' },
{ name: 'license', message: 'License:', default: 'MIT' }
]);
const packageJson = {
name: answers.name,
version: answers.version,
description: answers.description,
main: answers.entry,
scripts: {
test: 'echo "Error: no test specified" && exit 1'
},
author: answers.author,
license: answers.license
};
await fs.writeJson(path.join(this.projectRoot, 'package.json'), packageJson, { spaces: 2 });
}
// Initialize lock file
await this.lock.init();
console.log(chalk.green('✓ Initialized project'));
}
// Helper methods
parsePackageSpec(spec) {
const match = spec.match(/^(@?[^@]+)(?:@(.+))?$/);
return {
name: match[1],
version: match[2] || 'latest'
};
}
async getInstalledPackages(options = {}) {
const nodeModulesPath = options.global
? path.join(this.globalRoot, 'node_modules')
: path.join(this.projectRoot, 'node_modules');
if (!fs.existsSync(nodeModulesPath)) {
return [];
}
const packages = [];
const dirs = await fs.readdir(nodeModulesPath);
for (const dir of dirs) {
if (dir.startsWith('.')) continue;
const packageJsonPath = path.join(nodeModulesPath, dir, 'package.json');
if (fs.existsSync(packageJsonPath)) {
const packageJson = await fs.readJson(packageJsonPath);
packages.push({
name: packageJson.name,
version: packageJson.version
});
}
}
return packages;
}
formatBytes(bytes) {
const sizes = ['B', 'KB', 'MB', 'GB'];
if (bytes === 0) return '0 B';
const i = Math.floor(Math.log(bytes) / Math.log(1024));
return Math.round(bytes / Math.pow(1024, i) * 100) / 100 + ' ' + sizes[i];
}
buildDependencyTree(packages, depth) {
// Simplified tree building
return packages.map(pkg => ({
name: pkg.name,
version: pkg.version,
children: depth > 0 ? [] : null
}));
}
printDependencyTree(tree) {
for (const pkg of tree) {
console.log(`${pkg.name}@${pkg.version}`);
}
}
async updatePackageJson(packageSpecs, options) {
const packageJsonPath = path.join(this.projectRoot, 'package.json');
const packageJson = await fs.readJson(packageJsonPath);
const targetField = options.saveDev ? 'devDependencies' : 'dependencies';
if (!packageJson[targetField]) {
packageJson[targetField] = {};
}
for (const spec of packageSpecs) {
const version = options.saveExact ? spec.version : `^${spec.version}`;
packageJson[targetField][spec.name] = version;
}
await fs.writeJson(packageJsonPath, packageJson, { spaces: 2 });
}
async removeFromPackageJson(packageName) {
const packageJsonPath = path.join(this.projectRoot, 'package.json');
const packageJson = await fs.readJson(packageJsonPath);
delete packageJson.dependencies?.[packageName];
delete packageJson.devDependencies?.[packageName];
await fs.writeJson(packageJsonPath, packageJson, { spaces: 2 });
}
async createGlobalBinLinks(pkg) {
// Implementation for creating global binary links
const binDir = path.join(this.globalRoot, 'bin');
await fs.ensureDir(binDir);
if (pkg.bin) {
for (const [binName, binPath] of Object.entries(pkg.bin)) {
const linkPath = path.join(binDir, binName);
const targetPath = path.join(this.globalRoot, 'node_modules', pkg.name, binPath);
await fs.ensureSymlink(targetPath, linkPath);
}
}
}
async removeGlobalBinLinks(packageName) {
// Implementation for removing global binary links
const binDir = path.join(this.globalRoot, 'bin');
const packageDir = path.join(this.globalRoot, 'node_modules', packageName);
if (fs.existsSync(packageDir)) {
const packageJsonPath = path.join(packageDir, 'package.json');
if (fs.existsSync(packageJsonPath)) {
const packageJson = await fs.readJson(packageJsonPath);
if (packageJson.bin) {
for (const binName of Object.keys(packageJson.bin)) {
const linkPath = path.join(binDir, binName);
if (fs.existsSync(linkPath)) {
await fs.remove(linkPath);
}
}
}
}
}
}
async fixVulnerabilities(vulnerabilities) {
const spinner = ora('Fixing vulnerabilities...').start();
try {
for (const vuln of vulnerabilities) {
if (vuln.fixAvailable) {
await this.install([`${vuln.module_name}@${vuln.fixVersion}`], { update: true });
}
}
spinner.stop();
console.log(chalk.green('✓ Fixed available vulnerabilities'));
} catch (error) {
spinner.stop();
throw error;
}
}
}
module.exports = PackageManager;

499
src/core/registry.js Archivo normal
Ver fichero

@@ -0,0 +1,499 @@
const fetch = require('node-fetch');
const semver = require('semver');
const path = require('path');
const fs = require('fs-extra');
class Registry {
constructor() {
this.defaultRegistry = 'https://registry.npmjs.org';
this.registries = new Map();
this.cache = new Map();
this.config = this.loadConfig();
// Default registries
this.registries.set('npm', 'https://registry.npmjs.org');
this.registries.set('yarn', 'https://registry.yarnpkg.com');
}
loadConfig() {
return {
registry: this.defaultRegistry,
timeout: 30000,
retries: 3,
userAgent: 'alepm/1.0.0 node/' + process.version,
auth: {},
scopes: {},
cache: true,
offline: false
};
}
async getPackageInfo(packageName, version = 'latest') {
if (this.config.offline) {
throw new Error('Cannot fetch package info in offline mode');
}
const cacheKey = `info:${packageName}@${version}`;
if (this.config.cache && this.cache.has(cacheKey)) {
const cached = this.cache.get(cacheKey);
if (Date.now() - cached.timestamp < 300000) { // 5 minutes
return cached.data;
}
}
const registry = this.getRegistryForPackage(packageName);
const url = version === 'latest'
? `${registry}/${encodeURIComponent(packageName)}`
: `${registry}/${encodeURIComponent(packageName)}/${encodeURIComponent(version)}`;
const response = await this.fetchWithRetry(url);
if (!response.ok) {
if (response.status === 404) {
throw new Error(`Package "${packageName}" not found`);
}
throw new Error(`Failed to fetch package info: ${response.status} ${response.statusText}`);
}
const data = await response.json();
// Cache the result
if (this.config.cache) {
this.cache.set(cacheKey, {
data,
timestamp: Date.now()
});
}
return data;
}
async getLatestVersion(packageName) {
const info = await this.getPackageInfo(packageName);
return info['dist-tags'].latest;
}
async getVersions(packageName) {
const info = await this.getPackageInfo(packageName);
return Object.keys(info.versions || {}).sort(semver.rcompare);
}
async resolveVersion(packageName, versionSpec) {
if (versionSpec === 'latest') {
return await this.getLatestVersion(packageName);
}
const versions = await this.getVersions(packageName);
// Handle exact version
if (versions.includes(versionSpec)) {
return versionSpec;
}
// Handle semver range
const resolved = semver.maxSatisfying(versions, versionSpec);
if (!resolved) {
throw new Error(`No version of "${packageName}" satisfies "${versionSpec}"`);
}
return resolved;
}
async download(pkg) {
if (this.config.offline) {
throw new Error('Cannot download packages in offline mode');
}
const registry = this.getRegistryForPackage(pkg.name);
const packageInfo = await this.getPackageInfo(pkg.name, pkg.version);
if (!packageInfo.versions || !packageInfo.versions[pkg.version]) {
throw new Error(`Version ${pkg.version} of package ${pkg.name} not found`);
}
const versionInfo = packageInfo.versions[pkg.version];
const tarballUrl = versionInfo.dist.tarball;
if (!tarballUrl) {
throw new Error(`No tarball URL found for ${pkg.name}@${pkg.version}`);
}
const response = await this.fetchWithRetry(tarballUrl);
if (!response.ok) {
throw new Error(`Failed to download ${pkg.name}@${pkg.version}: ${response.status}`);
}
const buffer = await response.buffer();
// Verify integrity if available
if (versionInfo.dist.integrity) {
await this.verifyIntegrity(buffer, versionInfo.dist.integrity);
}
return {
data: buffer,
integrity: versionInfo.dist.integrity,
shasum: versionInfo.dist.shasum,
size: buffer.length,
tarball: tarballUrl,
resolved: tarballUrl,
packageInfo: versionInfo
};
}
async search(query, options = {}) {
if (this.config.offline) {
throw new Error('Cannot search packages in offline mode');
}
const registry = this.config.registry;
const limit = options.limit || 20;
const offset = options.offset || 0;
const searchUrl = `${registry}/-/v1/search?text=${encodeURIComponent(query)}&size=${limit}&from=${offset}`;
const response = await this.fetchWithRetry(searchUrl);
if (!response.ok) {
throw new Error(`Search failed: ${response.status} ${response.statusText}`);
}
const data = await response.json();
return data.objects.map(obj => ({
name: obj.package.name,
version: obj.package.version,
description: obj.package.description,
keywords: obj.package.keywords,
author: obj.package.author,
publisher: obj.package.publisher,
maintainers: obj.package.maintainers,
repository: obj.package.links?.repository,
homepage: obj.package.links?.homepage,
npm: obj.package.links?.npm,
downloadScore: obj.score?.detail?.downloads || 0,
popularityScore: obj.score?.detail?.popularity || 0,
qualityScore: obj.score?.detail?.quality || 0,
maintenanceScore: obj.score?.detail?.maintenance || 0,
finalScore: obj.score?.final || 0
}));
}
async getMetadata(packageName, version) {
const info = await this.getPackageInfo(packageName, version);
if (version === 'latest') {
version = info['dist-tags'].latest;
}
const versionInfo = info.versions[version];
if (!versionInfo) {
throw new Error(`Version ${version} not found for ${packageName}`);
}
return {
name: versionInfo.name,
version: versionInfo.version,
description: versionInfo.description,
keywords: versionInfo.keywords || [],
homepage: versionInfo.homepage,
repository: versionInfo.repository,
bugs: versionInfo.bugs,
license: versionInfo.license,
author: versionInfo.author,
contributors: versionInfo.contributors || [],
maintainers: versionInfo.maintainers || [],
dependencies: versionInfo.dependencies || {},
devDependencies: versionInfo.devDependencies || {},
peerDependencies: versionInfo.peerDependencies || {},
optionalDependencies: versionInfo.optionalDependencies || {},
bundledDependencies: versionInfo.bundledDependencies || [],
engines: versionInfo.engines || {},
os: versionInfo.os || [],
cpu: versionInfo.cpu || [],
scripts: versionInfo.scripts || {},
bin: versionInfo.bin || {},
man: versionInfo.man || [],
directories: versionInfo.directories || {},
files: versionInfo.files || [],
main: versionInfo.main,
browser: versionInfo.browser,
module: versionInfo.module,
types: versionInfo.types,
typings: versionInfo.typings,
exports: versionInfo.exports,
imports: versionInfo.imports,
funding: versionInfo.funding,
dist: {
tarball: versionInfo.dist.tarball,
shasum: versionInfo.dist.shasum,
integrity: versionInfo.dist.integrity,
fileCount: versionInfo.dist.fileCount,
unpackedSize: versionInfo.dist.unpackedSize
},
time: {
created: info.time.created,
modified: info.time.modified,
version: info.time[version]
},
readme: versionInfo.readme,
readmeFilename: versionInfo.readmeFilename,
deprecated: versionInfo.deprecated
};
}
async getDependencies(packageName, version) {
const metadata = await this.getMetadata(packageName, version);
return {
dependencies: metadata.dependencies,
devDependencies: metadata.devDependencies,
peerDependencies: metadata.peerDependencies,
optionalDependencies: metadata.optionalDependencies,
bundledDependencies: metadata.bundledDependencies
};
}
async getDownloadStats(packageName, period = 'last-week') {
const registry = 'https://api.npmjs.org';
const url = `${registry}/downloads/point/${period}/${encodeURIComponent(packageName)}`;
try {
const response = await this.fetchWithRetry(url);
if (!response.ok) {
return { downloads: 0, period };
}
const data = await response.json();
return data;
} catch (error) {
return { downloads: 0, period };
}
}
async getUserPackages(username) {
const registry = this.config.registry;
const url = `${registry}/-/user/${encodeURIComponent(username)}/package`;
const response = await this.fetchWithRetry(url);
if (!response.ok) {
throw new Error(`Failed to get user packages: ${response.status}`);
}
const data = await response.json();
return Object.keys(data);
}
async addRegistry(name, url, options = {}) {
this.registries.set(name, url);
if (options.auth) {
this.config.auth[url] = options.auth;
}
if (options.scope) {
this.config.scopes[options.scope] = url;
}
}
async removeRegistry(name) {
const url = this.registries.get(name);
if (url) {
this.registries.delete(name);
delete this.config.auth[url];
// Remove scope mappings
for (const [scope, registryUrl] of Object.entries(this.config.scopes)) {
if (registryUrl === url) {
delete this.config.scopes[scope];
}
}
}
}
getRegistryForPackage(packageName) {
// Check for scoped packages
if (packageName.startsWith('@')) {
const scope = packageName.split('/')[0];
if (this.config.scopes[scope]) {
return this.config.scopes[scope];
}
}
return this.config.registry;
}
async fetchWithRetry(url, options = {}) {
const requestOptions = {
timeout: this.config.timeout,
headers: {
'User-Agent': this.config.userAgent,
'Accept': 'application/json',
...options.headers
},
...options
};
// Add authentication if available
const registry = new URL(url).origin;
if (this.config.auth[registry]) {
const auth = this.config.auth[registry];
if (auth.token) {
requestOptions.headers['Authorization'] = `Bearer ${auth.token}`;
} else if (auth.username && auth.password) {
const credentials = Buffer.from(`${auth.username}:${auth.password}`).toString('base64');
requestOptions.headers['Authorization'] = `Basic ${credentials}`;
}
}
let lastError;
for (let attempt = 0; attempt < this.config.retries; attempt++) {
try {
const response = await fetch(url, requestOptions);
return response;
} catch (error) {
lastError = error;
if (attempt < this.config.retries - 1) {
// Exponential backoff
const delay = Math.pow(2, attempt) * 1000;
await new Promise(resolve => setTimeout(resolve, delay));
}
}
}
throw lastError;
}
async verifyIntegrity(data, integrity) {
const crypto = require('crypto');
// Parse integrity string (algorithm-hash)
const match = integrity.match(/^(sha\d+)-(.+)$/);
if (!match) {
throw new Error(`Invalid integrity format: ${integrity}`);
}
const [, algorithm, expectedHash] = match;
const actualHash = crypto.createHash(algorithm.replace('sha', 'sha')).update(data).digest('base64');
if (actualHash !== expectedHash) {
throw new Error('Package integrity verification failed');
}
}
async publishPackage(packagePath, options = {}) {
// This would implement package publishing
throw new Error('Package publishing not yet implemented');
}
async unpublishPackage(packageName, version, options = {}) {
// This would implement package unpublishing
throw new Error('Package unpublishing not yet implemented');
}
async deprecatePackage(packageName, version, message, options = {}) {
// This would implement package deprecation
throw new Error('Package deprecation not yet implemented');
}
async login(username, password, email, registry) {
// This would implement user authentication
throw new Error('Login not yet implemented');
}
async logout(registry) {
// This would implement logout
const registryUrl = registry || this.config.registry;
delete this.config.auth[registryUrl];
}
async whoami(registry) {
// This would return current user info
throw new Error('whoami not yet implemented');
}
// Utility methods
async ping(registry) {
const registryUrl = registry || this.config.registry;
try {
const response = await this.fetchWithRetry(`${registryUrl}/-/ping`);
return {
registry: registryUrl,
ok: response.ok,
status: response.status,
time: Date.now()
};
} catch (error) {
return {
registry: registryUrl,
ok: false,
error: error.message,
time: Date.now()
};
}
}
async getRegistryInfo(registry) {
const registryUrl = registry || this.config.registry;
try {
const response = await this.fetchWithRetry(registryUrl);
if (!response.ok) {
throw new Error(`Registry not accessible: ${response.status}`);
}
const data = await response.json();
return {
registry: registryUrl,
db_name: data.db_name,
doc_count: data.doc_count,
doc_del_count: data.doc_del_count,
update_seq: data.update_seq,
purge_seq: data.purge_seq,
compact_running: data.compact_running,
disk_size: data.disk_size,
data_size: data.data_size,
instance_start_time: data.instance_start_time,
disk_format_version: data.disk_format_version,
committed_update_seq: data.committed_update_seq
};
} catch (error) {
throw new Error(`Failed to get registry info: ${error.message}`);
}
}
clearCache() {
this.cache.clear();
}
getCacheStats() {
const entries = Array.from(this.cache.values());
const totalSize = JSON.stringify(entries).length;
return {
entries: this.cache.size,
totalSize,
oldestEntry: entries.length > 0 ? Math.min(...entries.map(e => e.timestamp)) : null,
newestEntry: entries.length > 0 ? Math.max(...entries.map(e => e.timestamp)) : null
};
}
setOfflineMode(offline = true) {
this.config.offline = offline;
}
isOffline() {
return this.config.offline;
}
}
module.exports = Registry;

21
src/index.js Archivo normal
Ver fichero

@@ -0,0 +1,21 @@
const PackageManager = require('./core/package-manager');
const CacheManager = require('./cache/cache-manager');
const SecurityManager = require('./security/security-manager');
const BinaryStorage = require('./storage/binary-storage');
const LockManager = require('./core/lock-manager');
const Registry = require('./core/registry');
const DependencyResolver = require('./core/dependency-resolver');
const ConfigManager = require('./utils/config-manager');
const Logger = require('./utils/logger');
module.exports = {
PackageManager,
CacheManager,
SecurityManager,
BinaryStorage,
LockManager,
Registry,
DependencyResolver,
ConfigManager,
Logger
};

Ver fichero

@@ -0,0 +1,458 @@
const crypto = require('crypto');
const fs = require('fs-extra');
const path = require('path');
const fetch = require('node-fetch');
class SecurityManager {
constructor() {
this.vulnerabilityDB = new Map();
this.trustedPublishers = new Set();
this.securityConfig = this.loadSecurityConfig();
}
loadSecurityConfig() {
return {
enableVulnerabilityCheck: true,
enableIntegrityCheck: true,
enableSignatureVerification: true,
allowedHashAlgorithms: ['sha512', 'sha256'],
minKeySize: 2048,
maxPackageSize: 100 * 1024 * 1024, // 100MB
blockedPackages: new Set(),
trustedRegistries: ['https://registry.npmjs.org'],
requireSignedPackages: false
};
}
async verifyIntegrity(packageData, expectedIntegrity) {
if (!this.securityConfig.enableIntegrityCheck) {
return true;
}
if (!expectedIntegrity) {
throw new Error('Package integrity information missing');
}
// Parse integrity string (format: algorithm-hash)
const integrityMatch = expectedIntegrity.match(/^(sha\d+)-(.+)$/);
if (!integrityMatch) {
throw new Error('Invalid integrity format');
}
const [, algorithm, expectedHash] = integrityMatch;
if (!this.securityConfig.allowedHashAlgorithms.includes(algorithm)) {
throw new Error(`Unsupported hash algorithm: ${algorithm}`);
}
// Calculate actual hash
const actualHash = crypto.createHash(algorithm)
.update(packageData)
.digest('base64');
if (actualHash !== expectedHash) {
throw new Error('Package integrity verification failed');
}
return true;
}
async verifySignature(packageData, signature, publicKey) {
if (!this.securityConfig.enableSignatureVerification) {
return true;
}
if (!signature || !publicKey) {
if (this.securityConfig.requireSignedPackages) {
throw new Error('Package signature required but not provided');
}
return true;
}
try {
const verify = crypto.createVerify('SHA256');
verify.update(packageData);
const isValid = verify.verify(publicKey, signature, 'base64');
if (!isValid) {
throw new Error('Package signature verification failed');
}
return true;
} catch (error) {
throw new Error(`Signature verification error: ${error.message}`);
}
}
async checkPackageSize(packageData) {
if (packageData.length > this.securityConfig.maxPackageSize) {
throw new Error(`Package size exceeds maximum allowed (${this.formatBytes(this.securityConfig.maxPackageSize)})`);
}
return true;
}
async checkBlockedPackages(packageName) {
if (this.securityConfig.blockedPackages.has(packageName)) {
throw new Error(`Package "${packageName}" is blocked for security reasons`);
}
return true;
}
async scanPackageContent(packageData, packageName) {
// Unpack and scan package content for suspicious patterns
const suspiciousPatterns = [
/eval\s*\(/gi, // eval calls
/Function\s*\(/gi, // Function constructor
/require\s*\(\s*['"]child_process['\"]/gi, // child_process usage
/\.exec\s*\(/gi, // exec calls
/\.spawn\s*\(/gi, // spawn calls
/fs\.unlink/gi, // file deletion
/rm\s+-rf/gi, // dangerous rm commands
/curl\s+.*\|\s*sh/gi, // curl pipe to shell
/wget\s+.*\|\s*sh/gi, // wget pipe to shell
/bitcoin|cryptocurrency|mining|crypto/gi, // crypto mining
/password|token|secret|api[_-]?key/gi, // potential credential harvesting
];
const maliciousIndicators = [];
try {
// Convert buffer to string for pattern matching
const content = packageData.toString();
for (const pattern of suspiciousPatterns) {
const matches = content.match(pattern);
if (matches) {
maliciousIndicators.push({
pattern: pattern.source,
matches: matches.slice(0, 5), // Limit to first 5 matches
severity: this.getPatternSeverity(pattern)
});
}
}
// Check for obfuscated code
if (this.detectObfuscation(content)) {
maliciousIndicators.push({
pattern: 'Code obfuscation detected',
severity: 'medium'
});
}
// Check for network calls to suspicious domains
const suspiciousDomains = this.extractSuspiciousDomains(content);
if (suspiciousDomains.length > 0) {
maliciousIndicators.push({
pattern: 'Suspicious network activity',
domains: suspiciousDomains,
severity: 'high'
});
}
} catch (error) {
// If we can't scan the content, log it but don't fail
console.warn(`Warning: Could not scan package content for ${packageName}: ${error.message}`);
}
return maliciousIndicators;
}
getPatternSeverity(pattern) {
const highRisk = [
/eval\s*\(/gi,
/Function\s*\(/gi,
/\.exec\s*\(/gi,
/rm\s+-rf/gi,
/curl\s+.*\|\s*sh/gi,
/wget\s+.*\|\s*sh/gi
];
return highRisk.some(p => p.source === pattern.source) ? 'high' : 'medium';
}
detectObfuscation(content) {
// Simple obfuscation detection heuristics
const indicators = [
content.includes('\\x'), // Hex encoding
content.includes('\\u'), // Unicode encoding
content.match(/[a-zA-Z_$][a-zA-Z0-9_$]{20,}/g)?.length > 10, // Long variable names
content.includes('unescape'), // URL decoding
content.includes('fromCharCode'), // Character code conversion
(content.match(/;/g)?.length || 0) > content.split('\n').length * 2 // Too many semicolons
];
return indicators.filter(Boolean).length >= 3;
}
extractSuspiciousDomains(content) {
const urlRegex = /https?:\/\/([\w.-]+)/gi;
const urls = content.match(urlRegex) || [];
const suspiciousKeywords = [
'bit.ly', 'tinyurl', 'goo.gl', 'ow.ly', // URL shorteners
'pastebin', 'hastebin', 'ghostbin', // Paste sites
'githubusercontent', // Raw GitHub content
'dropbox', 'mediafire', // File sharing
'onion', '.tk', '.ml' // Suspicious TLDs
];
return urls.filter(url =>
suspiciousKeywords.some(keyword => url.toLowerCase().includes(keyword))
);
}
async auditPackages(packages) {
if (!this.securityConfig.enableVulnerabilityCheck) {
return [];
}
const vulnerabilities = [];
for (const pkg of packages) {
// Check our local vulnerability database
const localVulns = await this.checkLocalVulnerabilities(pkg);
vulnerabilities.push(...localVulns);
// Check online vulnerability databases
const onlineVulns = await this.checkOnlineVulnerabilities(pkg);
vulnerabilities.push(...onlineVulns);
}
return vulnerabilities;
}
async checkLocalVulnerabilities(pkg) {
const vulnerabilities = [];
const key = `${pkg.name}@${pkg.version}`;
if (this.vulnerabilityDB.has(key)) {
const vulnData = this.vulnerabilityDB.get(key);
vulnerabilities.push({
id: vulnData.id,
module_name: pkg.name,
version: pkg.version,
title: vulnData.title,
severity: vulnData.severity,
overview: vulnData.overview,
recommendation: vulnData.recommendation,
fixAvailable: vulnData.fixAvailable,
fixVersion: vulnData.fixVersion
});
}
return vulnerabilities;
}
async checkOnlineVulnerabilities(pkg) {
try {
// Check npm audit API (simplified)
const response = await fetch('https://registry.npmjs.org/-/npm/v1/security/audits', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
name: pkg.name,
version: pkg.version,
requires: {
[pkg.name]: pkg.version
}
}),
timeout: 5000
});
if (response.ok) {
const data = await response.json();
return this.parseVulnerabilityResponse(data, pkg);
}
} catch (error) {
// Fail silently on network errors
console.warn(`Could not check vulnerabilities for ${pkg.name}: ${error.message}`);
}
return [];
}
parseVulnerabilityResponse(data, pkg) {
const vulnerabilities = [];
if (data.advisories) {
for (const [id, advisory] of Object.entries(data.advisories)) {
vulnerabilities.push({
id,
module_name: advisory.module_name,
version: pkg.version,
title: advisory.title,
severity: advisory.severity,
overview: advisory.overview,
recommendation: advisory.recommendation,
fixAvailable: !!advisory.patched_versions,
fixVersion: advisory.patched_versions
});
}
}
return vulnerabilities;
}
async validateRegistrySource(registryUrl) {
return this.securityConfig.trustedRegistries.includes(registryUrl);
}
async generatePackageHash(packageData, algorithm = 'sha512') {
return crypto.createHash(algorithm)
.update(packageData)
.digest('base64');
}
async createIntegrityString(packageData, algorithm = 'sha512') {
const hash = await this.generatePackageHash(packageData, algorithm);
return `${algorithm}-${hash}`;
}
async quarantinePackage(packageName, version, reason) {
const quarantineDir = path.join(require('os').homedir(), '.alepm', 'quarantine');
await fs.ensureDir(quarantineDir);
const quarantineData = {
packageName,
version,
reason,
timestamp: Date.now(),
id: crypto.randomUUID()
};
const quarantineFile = path.join(quarantineDir, `${packageName}-${version}-${Date.now()}.json`);
await fs.writeJson(quarantineFile, quarantineData, { spaces: 2 });
// Add to blocked packages
this.securityConfig.blockedPackages.add(packageName);
console.warn(`Package ${packageName}@${version} has been quarantined: ${reason}`);
}
async updateVulnerabilityDatabase() {
try {
// Download latest vulnerability database
const response = await fetch('https://raw.githubusercontent.com/nodejs/security-wg/main/vuln/npm/advisories.json', {
timeout: 10000
});
if (response.ok) {
const vulnerabilities = await response.json();
// Update local database
for (const vuln of vulnerabilities) {
const key = `${vuln.module_name}@${vuln.version}`;
this.vulnerabilityDB.set(key, vuln);
}
// Save to disk
const dbPath = path.join(require('os').homedir(), '.alepm', 'vulnerability-db.json');
await fs.writeJson(dbPath, Object.fromEntries(this.vulnerabilityDB), { spaces: 2 });
return vulnerabilities.length;
}
} catch (error) {
console.warn(`Could not update vulnerability database: ${error.message}`);
}
return 0;
}
async audit(packages) {
const results = [];
for (const pkg of packages) {
// Basic security checks
await this.checkBlockedPackages(pkg.name);
// Vulnerability check
const vulnerabilities = await this.auditPackages([pkg]);
results.push(...vulnerabilities);
}
return results;
}
formatBytes(bytes) {
const sizes = ['B', 'KB', 'MB', 'GB'];
if (bytes === 0) return '0 B';
const i = Math.floor(Math.log(bytes) / Math.log(1024));
return Math.round(bytes / Math.pow(1024, i) * 100) / 100 + ' ' + sizes[i];
}
// Security policy management
async addTrustedPublisher(publisherId, publicKey) {
this.trustedPublishers.add({
id: publisherId,
publicKey,
addedAt: Date.now()
});
}
async removeTrustedPublisher(publisherId) {
this.trustedPublishers.delete(publisherId);
}
async blockPackage(packageName, reason) {
this.securityConfig.blockedPackages.add(packageName);
await this.quarantinePackage(packageName, '*', reason);
}
async unblockPackage(packageName) {
this.securityConfig.blockedPackages.delete(packageName);
}
// Risk assessment
assessPackageRisk(pkg, scanResults) {
let riskScore = 0;
const factors = [];
// Age factor (newer packages are riskier)
const packageAge = Date.now() - new Date(pkg.time?.created || 0).getTime();
const ageInDays = packageAge / (1000 * 60 * 60 * 24);
if (ageInDays < 30) {
riskScore += 2;
factors.push('Package is very new (< 30 days)');
} else if (ageInDays < 90) {
riskScore += 1;
factors.push('Package is relatively new (< 90 days)');
}
// Download count factor
if (pkg.downloads?.weekly < 100) {
riskScore += 2;
factors.push('Low download count');
}
// Maintainer factor
if (!pkg.maintainers || pkg.maintainers.length === 0) {
riskScore += 1;
factors.push('No maintainers');
}
// Scan results factor
if (scanResults.length > 0) {
const highSeverity = scanResults.filter(r => r.severity === 'high').length;
const mediumSeverity = scanResults.filter(r => r.severity === 'medium').length;
riskScore += highSeverity * 3 + mediumSeverity * 1;
factors.push(`${scanResults.length} suspicious patterns detected`);
}
// Dependencies factor
if (pkg.dependencies && Object.keys(pkg.dependencies).length > 50) {
riskScore += 1;
factors.push('Many dependencies (> 50)');
}
return {
score: riskScore,
level: riskScore === 0 ? 'low' : riskScore <= 3 ? 'medium' : 'high',
factors
};
}
}
module.exports = SecurityManager;

456
src/storage/binary-storage.js Archivo normal
Ver fichero

@@ -0,0 +1,456 @@
const fs = require('fs-extra');
const path = require('path');
const tar = require('tar');
const zlib = require('zlib');
const crypto = require('crypto');
const { promisify } = require('util');
const gzip = promisify(zlib.gzip);
const gunzip = promisify(zlib.gunzip);
class BinaryStorage {
constructor() {
this.storageDir = path.join(require('os').homedir(), '.alepm', 'storage');
this.indexFile = path.join(this.storageDir, 'index.bin');
this.dataFile = path.join(this.storageDir, 'data.bin');
this.compressionLevel = 9; // Maximum compression
this.init();
}
async init() {
await fs.ensureDir(this.storageDir);
if (!fs.existsSync(this.indexFile)) {
await this.createIndex();
}
if (!fs.existsSync(this.dataFile)) {
await this.createDataFile();
}
}
async createIndex() {
// Binary index format:
// Header: ALEPM_IDX (8 bytes) + Version (4 bytes) + Entry Count (4 bytes)
// Entry: Hash (32 bytes) + Offset (8 bytes) + Size (8 bytes) + CompressedSize (8 bytes) + Timestamp (8 bytes)
const header = Buffer.alloc(16);
header.write('ALEPM_IDX', 0, 'ascii'); // Magic header
header.writeUInt32BE(1, 8); // Version
header.writeUInt32BE(0, 12); // Entry count
await fs.writeFile(this.indexFile, header);
}
async createDataFile() {
// Binary data file format:
// Header: ALEPM_DAT (8 bytes) + Version (4 bytes) + Reserved (4 bytes)
const header = Buffer.alloc(16);
header.write('ALEPM_DAT', 0, 'ascii');
header.writeUInt32BE(1, 8);
header.writeUInt32BE(0, 12); // Reserved
await fs.writeFile(this.dataFile, header);
}
async store(packageName, version, tarballData) {
const key = `${packageName}@${version}`;
const hash = crypto.createHash('sha256').update(key).digest();
// Compress the tarball
const compressedData = await gzip(tarballData, { level: this.compressionLevel });
// Get current data file size for offset
const dataStats = await fs.stat(this.dataFile);
const offset = dataStats.size;
// Append compressed data to data file
const dataFd = await fs.open(this.dataFile, 'a');
await fs.write(dataFd, compressedData);
await fs.close(dataFd);
// Update index
await this.updateIndex(hash, offset, tarballData.length, compressedData.length);
return {
hash: hash.toString('hex'),
offset,
size: tarballData.length,
compressedSize: compressedData.length,
compressionRatio: compressedData.length / tarballData.length
};
}
async retrieve(packageName, version) {
const key = `${packageName}@${version}`;
const hash = crypto.createHash('sha256').update(key).digest();
// Find entry in index
const indexEntry = await this.findIndexEntry(hash);
if (!indexEntry) {
return null;
}
// Read compressed data
const dataFd = await fs.open(this.dataFile, 'r');
const compressedData = Buffer.alloc(indexEntry.compressedSize);
await fs.read(dataFd, compressedData, 0, indexEntry.compressedSize, indexEntry.offset);
await fs.close(dataFd);
// Decompress
const originalData = await gunzip(compressedData);
return originalData;
}
async extract(packageData, targetDir) {
if (!packageData) {
throw new Error('Package data is null or undefined');
}
await fs.ensureDir(targetDir);
// Create temporary tarball file
const tempTarball = path.join(require('os').tmpdir(), `alepm-extract-${Date.now()}.tgz`);
try {
await fs.writeFile(tempTarball, packageData);
// Extract tarball to target directory
await tar.extract({
file: tempTarball,
cwd: targetDir,
strip: 1, // Remove the package/ prefix
filter: (path, entry) => {
// Security: prevent path traversal
const normalizedPath = path.normalize(path);
return !normalizedPath.startsWith('../') && !normalizedPath.includes('/../');
}
});
} finally {
// Clean up temporary file
if (fs.existsSync(tempTarball)) {
await fs.remove(tempTarball);
}
}
}
async updateIndex(hash, offset, size, compressedSize) {
const indexFd = await fs.open(this.indexFile, 'r+');
try {
// Read header to get entry count
const header = Buffer.alloc(16);
await fs.read(indexFd, header, 0, 16, 0);
const entryCount = header.readUInt32BE(12);
// Create new entry
const entry = Buffer.alloc(64); // 32 + 8 + 8 + 8 + 8 = 64 bytes
hash.copy(entry, 0); // Hash (32 bytes)
entry.writeBigUInt64BE(BigInt(offset), 32); // Offset (8 bytes)
entry.writeBigUInt64BE(BigInt(size), 40); // Size (8 bytes)
entry.writeBigUInt64BE(BigInt(compressedSize), 48); // Compressed size (8 bytes)
entry.writeBigUInt64BE(BigInt(Date.now()), 56); // Timestamp (8 bytes)
// Append entry to index
const entryOffset = 16 + (entryCount * 64);
await fs.write(indexFd, entry, 0, 64, entryOffset);
// Update entry count in header
header.writeUInt32BE(entryCount + 1, 12);
await fs.write(indexFd, header, 0, 16, 0);
} finally {
await fs.close(indexFd);
}
}
async findIndexEntry(hash) {
const indexFd = await fs.open(this.indexFile, 'r');
try {
// Read header
const header = Buffer.alloc(16);
await fs.read(indexFd, header, 0, 16, 0);
const entryCount = header.readUInt32BE(12);
// Search for matching hash
for (let i = 0; i < entryCount; i++) {
const entryOffset = 16 + (i * 64);
const entry = Buffer.alloc(64);
await fs.read(indexFd, entry, 0, 64, entryOffset);
const entryHash = entry.slice(0, 32);
if (entryHash.equals(hash)) {
return {
hash: entryHash,
offset: Number(entry.readBigUInt64BE(32)),
size: Number(entry.readBigUInt64BE(40)),
compressedSize: Number(entry.readBigUInt64BE(48)),
timestamp: Number(entry.readBigUInt64BE(56))
};
}
}
return null;
} finally {
await fs.close(indexFd);
}
}
async remove(packageName, version) {
const key = `${packageName}@${version}`;
const hash = crypto.createHash('sha256').update(key).digest();
// Find entry
const entry = await this.findIndexEntry(hash);
if (!entry) {
return false;
}
// For simplicity, we'll mark the entry as deleted by zeroing the hash
// In a production system, you might want to implement compaction
await this.markEntryDeleted(hash);
return true;
}
async markEntryDeleted(hash) {
const indexFd = await fs.open(this.indexFile, 'r+');
try {
const header = Buffer.alloc(16);
await fs.read(indexFd, header, 0, 16, 0);
const entryCount = header.readUInt32BE(12);
for (let i = 0; i < entryCount; i++) {
const entryOffset = 16 + (i * 64);
const entryHash = Buffer.alloc(32);
await fs.read(indexFd, entryHash, 0, 32, entryOffset);
if (entryHash.equals(hash)) {
// Zero out the hash to mark as deleted
const zeroHash = Buffer.alloc(32);
await fs.write(indexFd, zeroHash, 0, 32, entryOffset);
break;
}
}
} finally {
await fs.close(indexFd);
}
}
async compact() {
// Create new temporary files
const newIndexFile = this.indexFile + '.tmp';
const newDataFile = this.dataFile + '.tmp';
await this.createIndex();
await fs.rename(this.indexFile, newIndexFile);
await this.createIndex();
await this.createDataFile();
await fs.rename(this.dataFile, newDataFile);
await this.createDataFile();
// Read old index and copy non-deleted entries
const oldIndexFd = await fs.open(newIndexFile, 'r');
const oldDataFd = await fs.open(newDataFile, 'r');
const newDataFd = await fs.open(this.dataFile, 'a');
let newDataOffset = 16; // Skip header
let compactedEntries = 0;
let spaceFreed = 0;
try {
const header = Buffer.alloc(16);
await fs.read(oldIndexFd, header, 0, 16, 0);
const entryCount = header.readUInt32BE(12);
for (let i = 0; i < entryCount; i++) {
const entryOffset = 16 + (i * 64);
const entry = Buffer.alloc(64);
await fs.read(oldIndexFd, entry, 0, 64, entryOffset);
const entryHash = entry.slice(0, 32);
const isDeleted = entryHash.every(byte => byte === 0);
if (!isDeleted) {
const oldOffset = Number(entry.readBigUInt64BE(32));
const size = Number(entry.readBigUInt64BE(40));
const compressedSize = Number(entry.readBigUInt64BE(48));
// Copy data to new file
const data = Buffer.alloc(compressedSize);
await fs.read(oldDataFd, data, 0, compressedSize, oldOffset);
await fs.write(newDataFd, data, 0, compressedSize, newDataOffset);
// Update entry with new offset
entry.writeBigUInt64BE(BigInt(newDataOffset), 32);
// Add to new index
await this.updateIndex(entryHash, newDataOffset, size, compressedSize);
newDataOffset += compressedSize;
compactedEntries++;
} else {
spaceFreed += Number(entry.readBigUInt64BE(48));
}
}
} finally {
await fs.close(oldIndexFd);
await fs.close(oldDataFd);
await fs.close(newDataFd);
}
// Remove temporary files
await fs.remove(newIndexFile);
await fs.remove(newDataFile);
return {
entriesCompacted: compactedEntries,
spaceFreed
};
}
async getStats() {
const indexStats = await fs.stat(this.indexFile);
const dataStats = await fs.stat(this.dataFile);
const header = Buffer.alloc(16);
const indexFd = await fs.open(this.indexFile, 'r');
await fs.read(indexFd, header, 0, 16, 0);
await fs.close(indexFd);
const entryCount = header.readUInt32BE(12);
// Calculate compression stats
let totalOriginalSize = 0;
let totalCompressedSize = 0;
let activeEntries = 0;
const indexFd2 = await fs.open(this.indexFile, 'r');
try {
for (let i = 0; i < entryCount; i++) {
const entryOffset = 16 + (i * 64);
const entry = Buffer.alloc(64);
await fs.read(indexFd2, entry, 0, 64, entryOffset);
const entryHash = entry.slice(0, 32);
const isDeleted = entryHash.every(byte => byte === 0);
if (!isDeleted) {
totalOriginalSize += Number(entry.readBigUInt64BE(40));
totalCompressedSize += Number(entry.readBigUInt64BE(48));
activeEntries++;
}
}
} finally {
await fs.close(indexFd2);
}
return {
indexSize: indexStats.size,
dataSize: dataStats.size,
totalSize: indexStats.size + dataStats.size,
totalEntries: entryCount,
activeEntries,
deletedEntries: entryCount - activeEntries,
totalOriginalSize,
totalCompressedSize,
compressionRatio: totalOriginalSize > 0 ? totalCompressedSize / totalOriginalSize : 0,
spaceEfficiency: (totalOriginalSize - totalCompressedSize) / totalOriginalSize || 0
};
}
async verify() {
const stats = await this.getStats();
const errors = [];
// Verify index file integrity
try {
const indexFd = await fs.open(this.indexFile, 'r');
const header = Buffer.alloc(16);
await fs.read(indexFd, header, 0, 16, 0);
const magic = header.toString('ascii', 0, 8);
if (magic !== 'ALEPM_IDX') {
errors.push('Invalid index file magic header');
}
await fs.close(indexFd);
} catch (error) {
errors.push(`Index file error: ${error.message}`);
}
// Verify data file integrity
try {
const dataFd = await fs.open(this.dataFile, 'r');
const header = Buffer.alloc(16);
await fs.read(dataFd, header, 0, 16, 0);
const magic = header.toString('ascii', 0, 8);
if (magic !== 'ALEPM_DAT') {
errors.push('Invalid data file magic header');
}
await fs.close(dataFd);
} catch (error) {
errors.push(`Data file error: ${error.message}`);
}
return {
isValid: errors.length === 0,
errors,
stats
};
}
// Utility methods for different storage formats
async storeTarball(tarballData) {
return await this.storeRaw(tarballData);
}
async storeRaw(data) {
const hash = crypto.createHash('sha256').update(data).digest();
const compressed = await gzip(data, { level: this.compressionLevel });
const stats = await fs.stat(this.dataFile);
const offset = stats.size;
const dataFd = await fs.open(this.dataFile, 'a');
await fs.write(dataFd, compressed);
await fs.close(dataFd);
await this.updateIndex(hash, offset, data.length, compressed.length);
return hash.toString('hex');
}
async retrieveByHash(hashString) {
const hash = Buffer.from(hashString, 'hex');
const entry = await this.findIndexEntry(hash);
if (!entry) {
return null;
}
const dataFd = await fs.open(this.dataFile, 'r');
const compressed = Buffer.alloc(entry.compressedSize);
await fs.read(dataFd, compressed, 0, entry.compressedSize, entry.offset);
await fs.close(dataFd);
return await gunzip(compressed);
}
}
module.exports = BinaryStorage;

529
src/utils/config-manager.js Archivo normal
Ver fichero

@@ -0,0 +1,529 @@
const path = require('path');
const fs = require('fs-extra');
const os = require('os');
class ConfigManager {
constructor() {
this.configDir = path.join(os.homedir(), '.alepm');
this.configFile = path.join(this.configDir, 'config.json');
this.globalConfigFile = '/etc/alepm/config.json';
this.defaultConfig = this.getDefaultConfig();
this.config = null;
}
getDefaultConfig() {
return {
// Registry settings
registry: 'https://registry.npmjs.org',
registries: {
npm: 'https://registry.npmjs.org',
yarn: 'https://registry.yarnpkg.com'
},
scopes: {},
// Cache settings
cache: {
enabled: true,
maxSize: '1GB',
maxAge: '30d',
cleanupInterval: '7d',
compression: true,
verifyIntegrity: true
},
// Security settings
security: {
enableAudit: true,
enableIntegrityCheck: true,
enableSignatureVerification: false,
allowedHashAlgorithms: ['sha512', 'sha256'],
requireSignedPackages: false,
blockedPackages: [],
trustedPublishers: [],
maxPackageSize: '100MB',
scanPackageContent: true
},
// Storage settings
storage: {
compression: 9,
binaryFormat: true,
deduplication: true,
compactInterval: '30d'
},
// Network settings
network: {
timeout: 30000,
retries: 3,
userAgent: 'alepm/1.0.0',
proxy: null,
httpsProxy: null,
noProxy: 'localhost,127.0.0.1',
strictSSL: true,
cafile: null,
cert: null,
key: null
},
// Installation settings
install: {
saveExact: false,
savePrefix: '^',
production: false,
optional: true,
dev: false,
globalFolder: path.join(os.homedir(), '.alepm', 'global'),
binLinks: true,
rebuildBundle: true,
ignoreScripts: false,
packageLock: true,
packageLockOnly: false,
shrinkwrap: true,
dryRun: false,
force: false
},
// Output settings
output: {
loglevel: 'info',
silent: false,
json: false,
parseable: false,
progress: true,
color: 'auto',
unicode: true,
timing: false
},
// Performance settings
performance: {
maxConcurrency: 10,
maxSockets: 50,
fetchRetryFactor: 10,
fetchRetryMintimeout: 10000,
fetchRetryMaxtimeout: 60000,
fetchTimeout: 300000
},
// Lock file settings
lockfile: {
enabled: true,
filename: 'alepm.lock',
autoUpdate: true,
verifyIntegrity: true,
includeMetadata: true
},
// Script settings
scripts: {
shellPositional: false,
shell: process.platform === 'win32' ? 'cmd' : 'sh',
ifPresent: false,
ignoreScripts: false,
scriptShell: null
}
};
}
async init() {
await fs.ensureDir(this.configDir);
if (!fs.existsSync(this.configFile)) {
await this.saveConfig(this.defaultConfig);
}
await this.loadConfig();
}
async loadConfig() {
let userConfig = {};
let globalConfig = {};
// Load global config if exists
if (fs.existsSync(this.globalConfigFile)) {
try {
globalConfig = await fs.readJson(this.globalConfigFile);
} catch (error) {
console.warn(`Warning: Could not load global config: ${error.message}`);
}
}
// Load user config if exists
if (fs.existsSync(this.configFile)) {
try {
userConfig = await fs.readJson(this.configFile);
} catch (error) {
console.warn(`Warning: Could not load user config: ${error.message}`);
userConfig = {};
}
}
// Merge configs: default < global < user
this.config = this.deepMerge(
this.defaultConfig,
globalConfig,
userConfig
);
return this.config;
}
async saveConfig(config = null) {
const configToSave = config || this.config || this.defaultConfig;
await fs.writeJson(this.configFile, configToSave, { spaces: 2 });
this.config = configToSave;
}
async get(key, defaultValue = undefined) {
if (!this.config) {
await this.loadConfig();
}
return this.getNestedValue(this.config, key) ?? defaultValue;
}
async set(key, value) {
if (!this.config) {
await this.loadConfig();
}
this.setNestedValue(this.config, key, value);
await this.saveConfig();
}
async unset(key) {
if (!this.config) {
await this.loadConfig();
}
this.unsetNestedValue(this.config, key);
await this.saveConfig();
}
async list() {
if (!this.config) {
await this.loadConfig();
}
return this.config;
}
async reset() {
this.config = { ...this.defaultConfig };
await this.saveConfig();
}
async resetKey(key) {
const defaultValue = this.getNestedValue(this.defaultConfig, key);
await this.set(key, defaultValue);
}
getNestedValue(obj, path) {
const keys = path.split('.');
let current = obj;
for (const key of keys) {
if (current === null || current === undefined || typeof current !== 'object') {
return undefined;
}
current = current[key];
}
return current;
}
setNestedValue(obj, path, value) {
const keys = path.split('.');
let current = obj;
for (let i = 0; i < keys.length - 1; i++) {
const key = keys[i];
if (!(key in current) || typeof current[key] !== 'object') {
current[key] = {};
}
current = current[key];
}
current[keys[keys.length - 1]] = value;
}
unsetNestedValue(obj, path) {
const keys = path.split('.');
let current = obj;
for (let i = 0; i < keys.length - 1; i++) {
const key = keys[i];
if (!(key in current) || typeof current[key] !== 'object') {
return; // Path doesn't exist
}
current = current[key];
}
delete current[keys[keys.length - 1]];
}
deepMerge(...objects) {
const result = {};
for (const obj of objects) {
for (const [key, value] of Object.entries(obj || {})) {
if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
result[key] = this.deepMerge(result[key] || {}, value);
} else {
result[key] = value;
}
}
}
return result;
}
// Utility methods for common config operations
async addRegistry(name, url, options = {}) {
const registries = await this.get('registries', {});
registries[name] = url;
await this.set('registries', registries);
if (options.scope) {
const scopes = await this.get('scopes', {});
scopes[options.scope] = url;
await this.set('scopes', scopes);
}
}
async removeRegistry(name) {
const registries = await this.get('registries', {});
const url = registries[name];
if (url) {
delete registries[name];
await this.set('registries', registries);
// Remove associated scopes
const scopes = await this.get('scopes', {});
for (const [scope, scopeUrl] of Object.entries(scopes)) {
if (scopeUrl === url) {
delete scopes[scope];
}
}
await this.set('scopes', scopes);
}
}
async setScope(scope, registry) {
const scopes = await this.get('scopes', {});
scopes[scope] = registry;
await this.set('scopes', scopes);
}
async removeScope(scope) {
const scopes = await this.get('scopes', {});
delete scopes[scope];
await this.set('scopes', scopes);
}
async addTrustedPublisher(publisherId, publicKey) {
const trusted = await this.get('security.trustedPublishers', []);
trusted.push({ id: publisherId, publicKey, addedAt: Date.now() });
await this.set('security.trustedPublishers', trusted);
}
async removeTrustedPublisher(publisherId) {
const trusted = await this.get('security.trustedPublishers', []);
const filtered = trusted.filter(p => p.id !== publisherId);
await this.set('security.trustedPublishers', filtered);
}
async blockPackage(packageName, reason) {
const blocked = await this.get('security.blockedPackages', []);
blocked.push({ name: packageName, reason, blockedAt: Date.now() });
await this.set('security.blockedPackages', blocked);
}
async unblockPackage(packageName) {
const blocked = await this.get('security.blockedPackages', []);
const filtered = blocked.filter(p => p.name !== packageName);
await this.set('security.blockedPackages', filtered);
}
async setProxy(proxy, httpsProxy = null) {
await this.set('network.proxy', proxy);
if (httpsProxy) {
await this.set('network.httpsProxy', httpsProxy);
}
}
async removeProxy() {
await this.set('network.proxy', null);
await this.set('network.httpsProxy', null);
}
// Configuration validation
validateConfig(config = null) {
const configToValidate = config || this.config;
const errors = [];
const warnings = [];
// Validate registry URLs
if (configToValidate.registry) {
if (!this.isValidUrl(configToValidate.registry)) {
errors.push(`Invalid registry URL: ${configToValidate.registry}`);
}
}
// Validate cache settings
if (configToValidate.cache) {
if (configToValidate.cache.maxSize) {
if (!this.isValidSize(configToValidate.cache.maxSize)) {
errors.push(`Invalid cache maxSize: ${configToValidate.cache.maxSize}`);
}
}
if (configToValidate.cache.maxAge) {
if (!this.isValidDuration(configToValidate.cache.maxAge)) {
errors.push(`Invalid cache maxAge: ${configToValidate.cache.maxAge}`);
}
}
}
// Validate security settings
if (configToValidate.security) {
if (configToValidate.security.maxPackageSize) {
if (!this.isValidSize(configToValidate.security.maxPackageSize)) {
errors.push(`Invalid maxPackageSize: ${configToValidate.security.maxPackageSize}`);
}
}
}
// Validate network settings
if (configToValidate.network) {
if (configToValidate.network.timeout < 0) {
errors.push('Network timeout must be positive');
}
if (configToValidate.network.retries < 0) {
errors.push('Network retries must be non-negative');
}
}
return { valid: errors.length === 0, errors, warnings };
}
isValidUrl(url) {
try {
new URL(url);
return true;
} catch {
return false;
}
}
isValidSize(size) {
return /^\d+(?:\.\d+)?[KMGT]?B$/i.test(size);
}
isValidDuration(duration) {
return /^\d+[smhdwMy]$/.test(duration);
}
parseSize(size) {
const match = size.match(/^(\d+(?:\.\d+)?)([KMGT]?)B$/i);
if (!match) return 0;
const [, value, unit] = match;
const multipliers = { '': 1, K: 1024, M: 1024**2, G: 1024**3, T: 1024**4 };
return parseFloat(value) * (multipliers[unit.toUpperCase()] || 1);
}
parseDuration(duration) {
const match = duration.match(/^(\d+)([smhdwMy])$/);
if (!match) return 0;
const [, value, unit] = match;
const multipliers = {
s: 1000,
m: 60 * 1000,
h: 60 * 60 * 1000,
d: 24 * 60 * 60 * 1000,
w: 7 * 24 * 60 * 60 * 1000,
M: 30 * 24 * 60 * 60 * 1000,
y: 365 * 24 * 60 * 60 * 1000
};
return parseInt(value) * multipliers[unit];
}
// Environment variable overrides
applyEnvironmentOverrides() {
const envMappings = {
'ALEPM_REGISTRY': 'registry',
'ALEPM_CACHE': 'cache.enabled',
'ALEPM_CACHE_DIR': 'cache.directory',
'ALEPM_LOGLEVEL': 'output.loglevel',
'ALEPM_PROXY': 'network.proxy',
'ALEPM_HTTPS_PROXY': 'network.httpsProxy',
'ALEPM_NO_PROXY': 'network.noProxy',
'ALEPM_TIMEOUT': 'network.timeout',
'ALEPM_RETRIES': 'network.retries'
};
for (const [envVar, configPath] of Object.entries(envMappings)) {
const envValue = process.env[envVar];
if (envValue !== undefined) {
// Convert string values to appropriate types
let value = envValue;
if (envValue === 'true') value = true;
else if (envValue === 'false') value = false;
else if (/^\d+$/.test(envValue)) value = parseInt(envValue);
this.setNestedValue(this.config, configPath, value);
}
}
}
// Export/import configuration
async export(format = 'json') {
const config = await this.list();
switch (format.toLowerCase()) {
case 'json':
return JSON.stringify(config, null, 2);
case 'yaml':
// Would need yaml library
throw new Error('YAML export not implemented');
default:
throw new Error(`Unsupported export format: ${format}`);
}
}
async import(data, format = 'json') {
let importedConfig;
switch (format.toLowerCase()) {
case 'json':
importedConfig = JSON.parse(data);
break;
default:
throw new Error(`Unsupported import format: ${format}`);
}
const validation = this.validateConfig(importedConfig);
if (!validation.valid) {
throw new Error(`Invalid configuration: ${validation.errors.join(', ')}`);
}
this.config = importedConfig;
await this.saveConfig();
}
}
module.exports = ConfigManager;

465
src/utils/logger.js Archivo normal
Ver fichero

@@ -0,0 +1,465 @@
const fs = require('fs-extra');
const path = require('path');
const os = require('os');
const chalk = require('chalk');
class Logger {
constructor(options = {}) {
this.levels = {
error: 0,
warn: 1,
info: 2,
http: 3,
verbose: 4,
debug: 5,
silly: 6
};
this.colors = {
error: 'red',
warn: 'yellow',
info: 'cyan',
http: 'green',
verbose: 'blue',
debug: 'magenta',
silly: 'gray'
};
this.config = {
level: options.level || 'info',
silent: options.silent || false,
timestamp: options.timestamp !== false,
colorize: options.colorize !== false,
json: options.json || false,
logFile: options.logFile || path.join(os.homedir(), '.alepm', 'logs', 'alepm.log'),
maxSize: options.maxSize || '10MB',
maxFiles: options.maxFiles || 5,
...options
};
this.init();
}
async init() {
await fs.ensureDir(path.dirname(this.config.logFile));
// Rotate logs if needed
await this.rotateLogsIfNeeded();
}
log(level, message, meta = {}) {
if (this.config.silent) {
return;
}
const levelNum = this.levels[level];
const configLevelNum = this.levels[this.config.level];
if (levelNum > configLevelNum) {
return;
}
const logEntry = this.formatLogEntry(level, message, meta);
// Output to console
this.outputToConsole(level, logEntry);
// Write to file
this.writeToFile(logEntry);
}
error(message, meta = {}) {
this.log('error', message, meta);
}
warn(message, meta = {}) {
this.log('warn', message, meta);
}
info(message, meta = {}) {
this.log('info', message, meta);
}
http(message, meta = {}) {
this.log('http', message, meta);
}
verbose(message, meta = {}) {
this.log('verbose', message, meta);
}
debug(message, meta = {}) {
this.log('debug', message, meta);
}
silly(message, meta = {}) {
this.log('silly', message, meta);
}
formatLogEntry(level, message, meta) {
const timestamp = this.config.timestamp ? new Date().toISOString() : null;
const entry = {
timestamp,
level,
message,
...meta
};
if (this.config.json) {
return JSON.stringify(entry);
} else {
let formatted = '';
if (timestamp) {
formatted += `[${timestamp}] `;
}
formatted += `${level.toUpperCase()}: ${message}`;
if (Object.keys(meta).length > 0) {
formatted += ` ${JSON.stringify(meta)}`;
}
return formatted;
}
}
outputToConsole(level, logEntry) {
const colorize = this.config.colorize && process.stdout.isTTY;
if (colorize) {
const color = this.colors[level] || 'white';
console.log(chalk[color](logEntry));
} else {
console.log(logEntry);
}
}
async writeToFile(logEntry) {
try {
await fs.appendFile(this.config.logFile, logEntry + '\n');
} catch (error) {
// Fail silently to avoid infinite loops
}
}
async rotateLogsIfNeeded() {
try {
const stats = await fs.stat(this.config.logFile);
const maxSizeBytes = this.parseSize(this.config.maxSize);
if (stats.size > maxSizeBytes) {
await this.rotateLogs();
}
} catch (error) {
// File doesn't exist yet, no need to rotate
}
}
async rotateLogs() {
const logDir = path.dirname(this.config.logFile);
const logBasename = path.basename(this.config.logFile, path.extname(this.config.logFile));
const logExt = path.extname(this.config.logFile);
// Rotate existing files
for (let i = this.config.maxFiles - 1; i > 0; i--) {
const oldFile = path.join(logDir, `${logBasename}.${i}${logExt}`);
const newFile = path.join(logDir, `${logBasename}.${i + 1}${logExt}`);
if (await fs.pathExists(oldFile)) {
if (i === this.config.maxFiles - 1) {
// Delete the oldest file
await fs.remove(oldFile);
} else {
await fs.move(oldFile, newFile);
}
}
}
// Move current log to .1
const currentLog = this.config.logFile;
const rotatedLog = path.join(logDir, `${logBasename}.1${logExt}`);
if (await fs.pathExists(currentLog)) {
await fs.move(currentLog, rotatedLog);
}
}
parseSize(size) {
const match = size.match(/^(\d+(?:\.\d+)?)([KMGT]?)B$/i);
if (!match) return 0;
const [, value, unit] = match;
const multipliers = { '': 1, K: 1024, M: 1024**2, G: 1024**3, T: 1024**4 };
return parseFloat(value) * (multipliers[unit.toUpperCase()] || 1);
}
// Performance logging
time(label) {
if (!this.timers) {
this.timers = new Map();
}
this.timers.set(label, process.hrtime.bigint());
}
timeEnd(label) {
if (!this.timers || !this.timers.has(label)) {
this.warn(`Timer "${label}" does not exist`);
return;
}
const start = this.timers.get(label);
const end = process.hrtime.bigint();
const duration = Number(end - start) / 1000000; // Convert to milliseconds
this.timers.delete(label);
this.info(`${label}: ${duration.toFixed(2)}ms`);
return duration;
}
// Request logging
logRequest(method, url, options = {}) {
this.http(`${method} ${url}`, {
method,
url,
userAgent: options.userAgent,
timeout: options.timeout,
headers: this.sanitizeHeaders(options.headers)
});
}
logResponse(method, url, statusCode, duration, options = {}) {
const level = statusCode >= 400 ? 'error' : statusCode >= 300 ? 'warn' : 'http';
this.log(level, `${method} ${url} ${statusCode} ${duration}ms`, {
method,
url,
statusCode,
duration,
size: options.size
});
}
sanitizeHeaders(headers = {}) {
const sensitiveHeaders = ['authorization', 'cookie', 'x-api-key', 'x-auth-token'];
const sanitized = {};
for (const [key, value] of Object.entries(headers)) {
if (sensitiveHeaders.includes(key.toLowerCase())) {
sanitized[key] = '[REDACTED]';
} else {
sanitized[key] = value;
}
}
return sanitized;
}
// Package operation logging
logPackageOperation(operation, packageName, version, options = {}) {
this.info(`${operation} ${packageName}@${version}`, {
operation,
package: packageName,
version,
...options
});
}
logPackageError(operation, packageName, version, error, options = {}) {
this.error(`Failed to ${operation} ${packageName}@${version}: ${error.message}`, {
operation,
package: packageName,
version,
error: error.message,
stack: error.stack,
...options
});
}
logCacheOperation(operation, key, options = {}) {
this.debug(`Cache ${operation}: ${key}`, {
operation,
key,
...options
});
}
logSecurityEvent(event, details = {}) {
this.warn(`Security event: ${event}`, {
event,
timestamp: Date.now(),
...details
});
}
// Structured logging for analytics
logAnalytics(event, data = {}) {
this.info(`Analytics: ${event}`, {
event,
timestamp: Date.now(),
session: this.getSessionId(),
platform: process.platform,
arch: process.arch,
node: process.version,
...data
});
}
getSessionId() {
if (!this.sessionId) {
this.sessionId = require('crypto').randomUUID();
}
return this.sessionId;
}
// Progress logging
createProgressLogger(total, label = 'Progress') {
let current = 0;
let lastLogTime = 0;
const minLogInterval = 1000; // Log at most once per second
return {
tick: (amount = 1) => {
current += amount;
const now = Date.now();
if (now - lastLogTime > minLogInterval || current >= total) {
const percentage = ((current / total) * 100).toFixed(1);
this.info(`${label}: ${current}/${total} (${percentage}%)`);
lastLogTime = now;
}
},
complete: () => {
this.info(`${label}: Complete (${total}/${total})`);
}
};
}
// Error aggregation
reportErrors() {
if (!this.errorStats) {
return { totalErrors: 0, errorTypes: {} };
}
return {
totalErrors: this.errorStats.total,
errorTypes: { ...this.errorStats.types },
lastError: this.errorStats.lastError
};
}
trackError(error) {
if (!this.errorStats) {
this.errorStats = {
total: 0,
types: {},
lastError: null
};
}
this.errorStats.total++;
this.errorStats.types[error.constructor.name] =
(this.errorStats.types[error.constructor.name] || 0) + 1;
this.errorStats.lastError = {
message: error.message,
timestamp: Date.now()
};
}
// Log file management
async clearLogs() {
try {
const logDir = path.dirname(this.config.logFile);
const logBasename = path.basename(this.config.logFile, path.extname(this.config.logFile));
const logExt = path.extname(this.config.logFile);
// Remove all log files
await fs.remove(this.config.logFile);
for (let i = 1; i <= this.config.maxFiles; i++) {
const logFile = path.join(logDir, `${logBasename}.${i}${logExt}`);
if (await fs.pathExists(logFile)) {
await fs.remove(logFile);
}
}
this.info('Log files cleared');
} catch (error) {
this.error('Failed to clear logs', { error: error.message });
}
}
async getLogStats() {
try {
const stats = {
files: [],
totalSize: 0
};
const logDir = path.dirname(this.config.logFile);
const logBasename = path.basename(this.config.logFile, path.extname(this.config.logFile));
const logExt = path.extname(this.config.logFile);
// Check main log file
if (await fs.pathExists(this.config.logFile)) {
const stat = await fs.stat(this.config.logFile);
stats.files.push({
file: this.config.logFile,
size: stat.size,
modified: stat.mtime
});
stats.totalSize += stat.size;
}
// Check rotated log files
for (let i = 1; i <= this.config.maxFiles; i++) {
const logFile = path.join(logDir, `${logBasename}.${i}${logExt}`);
if (await fs.pathExists(logFile)) {
const stat = await fs.stat(logFile);
stats.files.push({
file: logFile,
size: stat.size,
modified: stat.mtime
});
stats.totalSize += stat.size;
}
}
return stats;
} catch (error) {
this.error('Failed to get log stats', { error: error.message });
return { files: [], totalSize: 0 };
}
}
// Configuration updates
updateConfig(newConfig) {
this.config = { ...this.config, ...newConfig };
}
setLevel(level) {
if (!this.levels.hasOwnProperty(level)) {
throw new Error(`Invalid log level: ${level}`);
}
this.config.level = level;
}
setSilent(silent = true) {
this.config.silent = silent;
}
setColorize(colorize = true) {
this.config.colorize = colorize;
}
setJson(json = true) {
this.config.json = json;
}
}
module.exports = Logger;

129
tests/cache-manager.test.js Archivo normal
Ver fichero

@@ -0,0 +1,129 @@
const CacheManager = require('../src/cache/cache-manager');
const path = require('path');
const fs = require('fs-extra');
describe('CacheManager', () => {
let cacheManager;
let testCacheDir;
beforeEach(async () => {
testCacheDir = path.join(global.TEST_DIR, 'cache');
cacheManager = new CacheManager();
cacheManager.cacheDir = testCacheDir;
cacheManager.metadataFile = path.join(testCacheDir, 'metadata.json');
await cacheManager.init();
});
describe('init', () => {
test('should create cache directory', async () => {
expect(await fs.pathExists(testCacheDir)).toBe(true);
});
test('should create metadata file', async () => {
expect(await fs.pathExists(cacheManager.metadataFile)).toBe(true);
});
});
describe('store and get', () => {
test('should store and retrieve package data', async () => {
const packageName = 'test-package';
const version = '1.0.0';
const data = Buffer.from('test package data');
await cacheManager.store(packageName, version, data);
const retrieved = await cacheManager.get(packageName, version);
expect(retrieved).toEqual(data);
});
test('should return null for non-existent package', async () => {
const retrieved = await cacheManager.get('non-existent', '1.0.0');
expect(retrieved).toBeNull();
});
test('should handle binary data correctly', async () => {
const packageName = 'binary-package';
const version = '1.0.0';
const binaryData = Buffer.from([0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A]);
await cacheManager.store(packageName, version, binaryData);
const retrieved = await cacheManager.get(packageName, version);
expect(retrieved).toEqual(binaryData);
});
});
describe('remove', () => {
test('should remove package from cache', async () => {
const packageName = 'test-package';
const version = '1.0.0';
const data = Buffer.from('test data');
await cacheManager.store(packageName, version, data);
expect(await cacheManager.get(packageName, version)).toEqual(data);
const removed = await cacheManager.remove(packageName, version);
expect(removed).toBe(true);
expect(await cacheManager.get(packageName, version)).toBeNull();
});
test('should return false for non-existent package', async () => {
const removed = await cacheManager.remove('non-existent', '1.0.0');
expect(removed).toBe(false);
});
});
describe('verify', () => {
test('should verify cache integrity', async () => {
const packageName = 'test-package';
const version = '1.0.0';
const data = Buffer.from('test data');
await cacheManager.store(packageName, version, data);
const result = await cacheManager.verify();
expect(result.corrupted).toBe(0);
expect(result.missing).toBe(0);
expect(result.valid).toBe(1);
});
});
describe('clean', () => {
test('should clean entire cache', async () => {
const packageName = 'test-package';
const version = '1.0.0';
const data = Buffer.from('test data');
await cacheManager.store(packageName, version, data);
expect(await cacheManager.get(packageName, version)).toEqual(data);
const cleanedSize = await cacheManager.clean();
expect(cleanedSize).toBeGreaterThan(0);
expect(await cacheManager.get(packageName, version)).toBeNull();
});
});
describe('getStats', () => {
test('should return cache statistics', async () => {
const stats = await cacheManager.getStats();
expect(stats).toHaveProperty('totalEntries');
expect(stats).toHaveProperty('totalSize');
expect(stats).toHaveProperty('compressionRatio');
expect(typeof stats.totalEntries).toBe('number');
expect(typeof stats.totalSize).toBe('number');
});
test('should calculate compression ratio correctly', async () => {
const packageName = 'test-package';
const version = '1.0.0';
const data = Buffer.from('test data '.repeat(100)); // Repeatable data for good compression
await cacheManager.store(packageName, version, data);
const stats = await cacheManager.getStats();
expect(stats.totalEntries).toBe(1);
expect(stats.compressionRatio).toBeLessThan(1); // Should be compressed
});
});
});

209
tests/security-manager.test.js Archivo normal
Ver fichero

@@ -0,0 +1,209 @@
const SecurityManager = require('../src/security/security-manager');
describe('SecurityManager', () => {
let securityManager;
beforeEach(() => {
securityManager = new SecurityManager();
});
describe('verifyIntegrity', () => {
test('should verify valid integrity hash', async () => {
const data = Buffer.from('test data');
const crypto = require('crypto');
const hash = crypto.createHash('sha256').update(data).digest('base64');
const integrity = `sha256-${hash}`;
const result = await securityManager.verifyIntegrity(data, integrity);
expect(result).toBe(true);
});
test('should reject invalid integrity hash', async () => {
const data = Buffer.from('test data');
const integrity = 'sha256-invalidhash';
await expect(securityManager.verifyIntegrity(data, integrity))
.rejects.toThrow('Package integrity verification failed');
});
test('should reject invalid integrity format', async () => {
const data = Buffer.from('test data');
const integrity = 'invalid-format';
await expect(securityManager.verifyIntegrity(data, integrity))
.rejects.toThrow('Invalid integrity format');
});
test('should reject unsupported hash algorithm', async () => {
const data = Buffer.from('test data');
const integrity = 'sha1-somehash'; // sha1 is not in allowedHashAlgorithms
await expect(securityManager.verifyIntegrity(data, integrity))
.rejects.toThrow('Unsupported hash algorithm: sha1');
});
});
describe('checkPackageSize', () => {
test('should allow packages under size limit', async () => {
const smallData = Buffer.from('small package');
const result = await securityManager.checkPackageSize(smallData);
expect(result).toBe(true);
});
test('should reject packages over size limit', async () => {
// Create a large buffer (larger than 100MB default limit)
const largeSize = 101 * 1024 * 1024; // 101MB
const largeData = Buffer.alloc(largeSize);
await expect(securityManager.checkPackageSize(largeData))
.rejects.toThrow('Package size exceeds maximum allowed');
});
});
describe('checkBlockedPackages', () => {
test('should allow non-blocked packages', async () => {
const result = await securityManager.checkBlockedPackages('safe-package');
expect(result).toBe(true);
});
test('should reject blocked packages', async () => {
securityManager.securityConfig.blockedPackages.add('blocked-package');
await expect(securityManager.checkBlockedPackages('blocked-package'))
.rejects.toThrow('Package "blocked-package" is blocked for security reasons');
});
});
describe('scanPackageContent', () => {
test('should detect suspicious patterns', async () => {
const suspiciousCode = Buffer.from(`
const fs = require('fs');
eval('malicious code');
require('child_process').exec('rm -rf /');
`);
const indicators = await securityManager.scanPackageContent(suspiciousCode, 'test-package');
expect(indicators.length).toBeGreaterThan(0);
expect(indicators.some(i => i.pattern.includes('eval'))).toBe(true);
});
test('should not flag clean code', async () => {
const cleanCode = Buffer.from(`
function add(a, b) {
return a + b;
}
module.exports = { add };
`);
const indicators = await securityManager.scanPackageContent(cleanCode, 'clean-package');
expect(indicators.length).toBe(0);
});
test('should detect obfuscated code', async () => {
const obfuscatedCode = Buffer.from(`
var _0x1234567890abcdefghijklmnop = ['\\x65\\x76\\x61\\x6c'];
function _0x1234567890abcdefghijklmnopqrstuvwxyz() { return _0x1234567890abcdefghijklmnop; }
var _0x9abcdefghijklmnopqrstuvwxyz0123456789 = _0x1234567890abcdefghijklmnopqrstuvwxyz();
_0x9abcdefghijklmnopqrstuvwxyz0123456789[0]('\\u0061\\u006c\\u0065\\u0072\\u0074("test")');
unescape('%61%6c%65%72%74');
String.fromCharCode(97,108,101,114,116);
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
`);
const indicators = await securityManager.scanPackageContent(obfuscatedCode, 'obfuscated-package');
expect(indicators.length).toBeGreaterThan(0);
});
});
describe('generatePackageHash', () => {
test('should generate consistent hash', async () => {
const data = Buffer.from('test data');
const hash1 = await securityManager.generatePackageHash(data);
const hash2 = await securityManager.generatePackageHash(data);
expect(hash1).toBe(hash2);
expect(typeof hash1).toBe('string');
expect(hash1.length).toBeGreaterThan(0);
});
test('should generate different hashes for different data', async () => {
const data1 = Buffer.from('test data 1');
const data2 = Buffer.from('test data 2');
const hash1 = await securityManager.generatePackageHash(data1);
const hash2 = await securityManager.generatePackageHash(data2);
expect(hash1).not.toBe(hash2);
});
});
describe('createIntegrityString', () => {
test('should create valid integrity string', async () => {
const data = Buffer.from('test data');
const integrity = await securityManager.createIntegrityString(data);
expect(integrity).toMatch(/^sha512-.+$/);
// Verify it can be used for verification
const isValid = await securityManager.verifyIntegrity(data, integrity);
expect(isValid).toBe(true);
});
});
describe('assessPackageRisk', () => {
test('should assess risk for new package', () => {
const pkg = {
name: 'new-package',
time: { created: Date.now() - (20 * 24 * 60 * 60 * 1000) }, // 20 days ago
downloads: { weekly: 50 },
maintainers: []
};
const scanResults = [];
const risk = securityManager.assessPackageRisk(pkg, scanResults);
expect(risk.score).toBeGreaterThan(0);
expect(risk.level).toBe('high'); // Score is 5: 2 (new) + 2 (low downloads) + 1 (no maintainers)
expect(risk.factors).toContain('Package is very new (< 30 days)');
expect(risk.factors).toContain('Low download count');
expect(risk.factors).toContain('No maintainers');
});
test('should assess lower risk for established package', () => {
const pkg = {
name: 'established-package',
time: { created: Date.now() - (365 * 24 * 60 * 60 * 1000) }, // 1 year ago
downloads: { weekly: 10000 },
maintainers: [{ name: 'maintainer1' }],
dependencies: {}
};
const scanResults = [];
const risk = securityManager.assessPackageRisk(pkg, scanResults);
expect(risk.score).toBe(0);
expect(risk.level).toBe('low');
expect(risk.factors).toHaveLength(0);
});
test('should increase risk for suspicious scan results', () => {
const pkg = {
name: 'suspicious-package',
time: { created: Date.now() - (365 * 24 * 60 * 60 * 1000) },
downloads: { weekly: 1000 },
maintainers: [{ name: 'maintainer1' }]
};
const scanResults = [
{ severity: 'high' },
{ severity: 'medium' }
];
const risk = securityManager.assessPackageRisk(pkg, scanResults);
expect(risk.score).toBe(4); // 3 for high + 1 for medium
expect(risk.level).toBe('high');
expect(risk.factors).toContain('2 suspicious patterns detected');
});
});
});

24
tests/setup.js Archivo normal
Ver fichero

@@ -0,0 +1,24 @@
// Test setup file
const fs = require('fs-extra');
const path = require('path');
const os = require('os');
// Create temporary directory for tests
global.TEST_DIR = path.join(os.tmpdir(), 'alepm-tests');
beforeEach(async () => {
await fs.ensureDir(global.TEST_DIR);
});
afterEach(async () => {
await fs.remove(global.TEST_DIR);
});
// Mock console methods to avoid noise in tests
global.console = {
...console,
log: jest.fn(),
info: jest.fn(),
warn: jest.fn(),
error: jest.fn()
};