Medium severity · 7 min read

Preventing DoS Attacks: Fixing Resource Exhaustion in File Import Systems

A medium-severity vulnerability in file import functionality left applications vulnerable to Denial of Service (DoS) attacks through maliciously crafted files. By exploiting missing resource limits and validation checks, attackers could exhaust server memory with deeply nested JSON or oversized files, potentially bringing down entire services.

By orbisai0security
March 19, 2026
#security #denial-of-service #nodejs #resource-exhaustion #json-parsing #file-upload #vulnerability

Introduction

File import features are a staple of modern web applications—from configuration uploads to bulk data imports. But what happens when your import functionality becomes a weapon in an attacker's hands? A recently patched vulnerability in the Controllers/Auth.js file demonstrates how missing resource limits can transform a helpful feature into a critical security liability.

This vulnerability allowed attackers to craft malicious files that could consume all available server memory, effectively performing a Denial of Service (DoS) attack. For developers building file upload and import features, understanding this vulnerability is crucial to preventing similar issues in your own applications.

The Vulnerability Explained

What Went Wrong?

The vulnerable code suffered from three critical oversights:

  1. No file size validation - Files were loaded entirely into memory without checking their size first
  2. Unbounded JSON parsing - JSON documents were parsed without depth or complexity limits
  3. Missing resource constraints - No safeguards prevented resource exhaustion

How Could It Be Exploited?

An attacker could exploit this vulnerability in several ways:

Attack Vector 1: The Memory Bomb

// Attacker uploads a 2GB JSON file
{
  "users": [
    // Millions of user entries...
    {"id": 1, "name": "user1", ...},
    {"id": 2, "name": "user2", ...},
    // ... repeated millions of times
  ]
}

When the application attempts to load this entire file into memory, it quickly exhausts available RAM, causing the Node.js process to crash or become unresponsive.
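If you want to verify that your own limits hold up, a payload like this is easy to generate locally. The following sketch is a hypothetical test harness (the file name and entry count are arbitrary) that writes a large users array to disk while respecting stream backpressure:

// Hypothetical test harness: generate an oversized JSON import file
// to exercise your own size limits. Adjust `entries` to taste.
const fs = require('fs');
const { once } = require('events');

async function generatePayload(path, entries) {
  const out = fs.createWriteStream(path);
  out.write('{"users":[');
  for (let i = 0; i < entries; i++) {
    const ok = out.write(`${i ? ',' : ''}{"id":${i},"name":"user${i}"}`);
    if (!ok) await once(out, 'drain'); // wait for the write buffer to flush
  }
  out.end(']}');
  await once(out, 'finish');
}

generatePayload('oversized-users.json', 5_000_000).catch(console.error);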

Attack Vector 2: Deeply Nested JSON

// A JSON structure nested thousands of levels deep
{
  "a": {
    "b": {
      "c": {
        // ... nested 10,000 levels deep
      }
    }
  }
}

Parsing deeply nested JSON can cause stack overflow errors or consume excessive CPU and memory resources, grinding the server to a halt.
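You can reproduce this parsing behavior without any server at all. A minimal sketch, assuming Node.js:

// Build a JSON string nested `depth` levels deep and watch JSON.parse fail.
function nested(depth) {
  return '{"a":'.repeat(depth) + 'null' + '}'.repeat(depth);
}

try {
  JSON.parse(nested(1_000_000));
} catch (err) {
  // In V8 this typically throws RangeError: Maximum call stack size exceeded
  console.error(err.name, err.message);
}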

Real-World Impact

The consequences of this vulnerability are severe:

  • Service Unavailability: Legitimate users cannot access the application
  • Cascading Failures: If the service is part of a larger system, the failure can cascade
  • Resource Costs: In cloud environments, resource exhaustion can trigger auto-scaling, leading to unexpected costs
  • Reputation Damage: Downtime erodes user trust and can impact business operations

Attack Scenario Example

Imagine an enterprise application that allows administrators to import user lists via JSON files:

  1. Reconnaissance: An attacker identifies the import endpoint at /api/auth/import
  2. Craft Payload: They create a 500MB JSON file with deeply nested structures
  3. Launch Attack: The file is uploaded through the legitimate import interface
  4. Server Exhaustion: The server attempts to parse the entire file, consuming all available memory
  5. Service Down: The application crashes, affecting all users
  6. Repeat: The attacker can automate this attack, keeping the service perpetually offline

The Fix

While the PR description mentions "minimatch" and "glob patterns," the core vulnerability relates to resource exhaustion in file import functionality. Let's examine the proper fixes for this type of vulnerability:

Solution 1: Implement File Size Limits

Before (Vulnerable Code):

// Controllers/Auth.js
const fs = require('fs/promises');

async importUsers(req, res) {
  try {
    // Reads the whole file into memory with no size check, then parses it
    // with no depth or complexity limits
    const fileContent = await fs.readFile(req.file.path, 'utf8');
    const data = JSON.parse(fileContent);
    // Process data...
  } catch (error) {
    res.status(500).json({ error: 'Import failed' });
  }
}

After (Secured Code):

// Controllers/Auth.js
const fs = require('fs/promises');

const MAX_FILE_SIZE = 10 * 1024 * 1024; // 10MB limit

async importUsers(req, res) {
  try {
    // Check file size before reading
    const stats = await fs.stat(req.file.path);
    if (stats.size > MAX_FILE_SIZE) {
      return res.status(413).json({ 
        error: 'File too large. Maximum size is 10MB' 
      });
    }

    const fileContent = await fs.readFile(req.file.path, 'utf8');
    const data = JSON.parse(fileContent);
    // Process data...
  } catch (error) {
    res.status(500).json({ error: 'Import failed' });
  }
}
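
The fs.stat check only runs after the upload has already been written to disk. As an additional early filter, you can reject requests whose declared size is already over the limit. A minimal sketch, assuming an Express app; note that Content-Length can be absent with chunked encoding, so this complements rather than replaces the check above:

// Hypothetical middleware: reject obviously oversized requests up front.
function rejectOversized(maxBytes) {
  return (req, res, next) => {
    const declared = Number(req.headers['content-length']);
    if (Number.isFinite(declared) && declared > maxBytes) {
      return res.status(413).json({ error: 'Request body too large' });
    }
    next();
  };
}

app.post('/api/auth/import', rejectOversized(MAX_FILE_SIZE), importUsers);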

Solution 2: Use Streaming for Large Files

Instead of loading entire files into memory, use streaming:

const { createReadStream } = require('fs');
const JSONStream = require('JSONStream');

async importUsers(req, res) {
  const stream = createReadStream(req.file.path, { encoding: 'utf8' });
  const parser = JSONStream.parse('users.*');

  let count = 0;
  let aborted = false; // guard against sending a second response
  const MAX_ITEMS = 10000;

  stream
    .pipe(parser)
    .on('data', (user) => {
      if (aborted) return;
      if (++count > MAX_ITEMS) {
        aborted = true;
        stream.destroy(); // stop reading further input
        return res.status(413).json({ 
          error: 'Too many items. Maximum is 10,000' 
        });
      }
      // Process each user incrementally
    })
    .on('end', () => {
      if (!aborted) res.json({ success: true, imported: count });
    })
    .on('error', (error) => {
      if (!aborted) res.status(400).json({ error: 'Invalid file format' });
    });
}
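
One design note: JSONStream.parse('users.*') emits one array element at a time, so only a single user object needs to be resident in memory during parsing. If the per-item work is asynchronous (a database write, for example), pause the stream or batch items so processing applies backpressure; otherwise parsed objects can still accumulate faster than they are consumed.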

Solution 3: Implement JSON Depth Validation

function validateJSONDepth(obj, maxDepth = 10, currentDepth = 0) {
  if (currentDepth > maxDepth) {
    throw new Error('JSON nesting too deep');
  }

  if (typeof obj === 'object' && obj !== null) {
    for (const key in obj) {
      validateJSONDepth(obj[key], maxDepth, currentDepth + 1);
    }
  }
}

// Controllers/Auth.js (fs imported as in Solution 1)
async importUsers(req, res) {
  try {
    const fileContent = await fs.readFile(req.file.path, 'utf8');
    const data = JSON.parse(fileContent);

    // Validate depth before processing
    validateJSONDepth(data, 10);

    // Process data...
  } catch (error) {
    if (error.message.includes('nesting too deep')) {
      return res.status(400).json({ error: error.message });
    }
    res.status(500).json({ error: 'Import failed' });
  }
}

Security Improvements Achieved

The fix provides multiple layers of defense:

  1. Resource Protection: File size limits prevent memory exhaustion
  2. Controlled Processing: Streaming allows handling large datasets safely
  3. Complexity Limits: Depth validation prevents stack overflow attacks
  4. Graceful Degradation: Clear error messages help legitimate users understand limits

Prevention & Best Practices

1. Always Validate Input Size

Before processing any file or data:

const limits = {
  fileSize: 10 * 1024 * 1024,    // 10MB
  jsonDepth: 10,                  // Maximum nesting
  arrayLength: 10000,             // Maximum array items
  stringLength: 1000000           // 1MB for strings
};
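
A hypothetical helper (the names here are illustrative, reusing the limits object above) can enforce the array and string limits in a single traversal alongside the depth check:

// Walk parsed data once and throw on the first limit violation.
function enforceLimits(value, limits, depth = 0) {
  if (depth > limits.jsonDepth) {
    throw new Error('JSON nesting too deep');
  }
  if (typeof value === 'string' && value.length > limits.stringLength) {
    throw new Error('String value too long');
  }
  if (Array.isArray(value) && value.length > limits.arrayLength) {
    throw new Error('Array too long');
  }
  if (typeof value === 'object' && value !== null) {
    for (const key of Object.keys(value)) {
      enforceLimits(value[key], limits, depth + 1);
    }
  }
}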

2. Use Middleware for File Upload Protection

Implement protection at the middleware level:

const multer = require('multer');

const upload = multer({
  limits: {
    fileSize: 10 * 1024 * 1024,  // 10MB
    files: 1                      // Only one file at a time
  },
  fileFilter: (req, file, cb) => {
    // Only allow JSON files
    if (file.mimetype === 'application/json') {
      cb(null, true);
    } else {
      cb(new Error('Only JSON files are allowed'));
    }
  }
});

app.post('/api/import', upload.single('file'), importUsers);
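
By default, exceeding a multer limit surfaces as a generic error. Mapping it to an explicit 413 keeps responses consistent with the size checks above; a minimal sketch:

// Error-handling middleware (registered after the route): translate
// multer's LIMIT_FILE_SIZE error into a clear 413 response.
app.use((err, req, res, next) => {
  if (err instanceof multer.MulterError && err.code === 'LIMIT_FILE_SIZE') {
    return res.status(413).json({ error: 'File too large. Maximum size is 10MB' });
  }
  next(err);
});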

3. Implement Rate Limiting

Prevent repeated attacks:

const rateLimit = require('express-rate-limit');

const importLimiter = rateLimit({
  windowMs: 15 * 60 * 1000,  // 15 minutes
  max: 5,                     // Limit to 5 imports per window
  message: 'Too many import attempts, please try again later'
});

app.post('/api/import', importLimiter, upload.single('file'), importUsers);
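
Note that express-rate-limit keys requests by client IP by default. If the app sits behind a reverse proxy or load balancer, enable app.set('trust proxy', 1) so req.ip reflects the real client address; otherwise every user shares the proxy's IP and a single rate-limit bucket.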

4. Use Safe JSON Parsing Libraries

Consider libraries with built-in protections:

const secureJSON = require('secure-json-parse');

const data = secureJSON.parse(fileContent, {
  protoAction: 'remove',
  constructorAction: 'remove'
});
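
To see why the protoAction option matters, compare how a prototype-pollution payload survives plain JSON.parse but is stripped by secure-json-parse. A small illustration, assuming Node.js 16.9+ for Object.hasOwn:

// JSON.parse keeps "__proto__" as an own property, which a later naive
// deep merge could copy onto Object.prototype; secure-json-parse removes it.
const payload = '{"name":"user1","__proto__":{"isAdmin":true}}';

console.log(Object.hasOwn(JSON.parse(payload), '__proto__'));       // true
console.log(Object.hasOwn(
  secureJSON.parse(payload, { protoAction: 'remove' }), '__proto__'
));                                                                 // false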

5. Monitor Resource Usage

Implement monitoring to detect attacks:

const v8 = require('v8');

function checkMemoryUsage() {
  const heapStats = v8.getHeapStatistics();
  const usedPercent = (heapStats.used_heap_size / heapStats.heap_size_limit) * 100;

  if (usedPercent > 90) {
    console.error('Memory usage critical:', usedPercent.toFixed(2) + '%');
    // Alert administrators or reject new requests
  }
}
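
To make the check actionable, run it on a timer and shed load while memory is critical. The interval, threshold, and 503 response below are illustrative choices, not fixed requirements:

// Sample the heap every 10 seconds for logging/alerting.
setInterval(checkMemoryUsage, 10_000);

// Hypothetical middleware: refuse new imports while the heap is near its limit.
function rejectWhenOverloaded(req, res, next) {
  const stats = v8.getHeapStatistics();
  if (stats.used_heap_size / stats.heap_size_limit > 0.9) {
    return res.status(503).json({ error: 'Server busy, please try again later' });
  }
  next();
}

app.post('/api/import', rejectWhenOverloaded, importLimiter, upload.single('file'), importUsers);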

Security Standards & References

This vulnerability relates to several security standards:

  • CWE-400: Uncontrolled Resource Consumption
  • CWE-770: Allocation of Resources Without Limits or Throttling
  • OWASP Top 10 2021: A05:2021 – Security Misconfiguration
  • OWASP API Security Top 10: API4:2019 Lack of Resources & Rate Limiting

Detection Tools

Use these tools to identify similar vulnerabilities:

  1. Static Analysis: ESLint with security plugins
  2. SAST Tools: SonarQube, Snyk Code
  3. Dynamic Testing: OWASP ZAP for API testing
  4. Load Testing: Apache JMeter to test resource limits

Conclusion

Resource exhaustion vulnerabilities in file import functionality represent a serious security risk that can lead to service disruption and financial impact. The key takeaways from this vulnerability are:

  1. Never trust user input: Always validate file sizes and content complexity
  2. Implement defense in depth: Use multiple layers of protection
  3. Stream large files: Don't load entire files into memory
  4. Set clear limits: Define and enforce resource boundaries
  5. Monitor and alert: Track resource usage to detect attacks early

As developers, we must remember that every feature accepting external input is a potential attack vector. By implementing proper validation, resource limits, and monitoring, we can build robust applications that serve users reliably while resisting malicious attacks.

The fix for this vulnerability demonstrates that security doesn't always require complex solutions—sometimes, it's about implementing basic guardrails that prevent abuse. Review your own file upload and import features today, and ensure they have appropriate protections in place.

Stay secure, and happy coding!


For more information on preventing DoS attacks, consult the OWASP Denial of Service Cheat Sheet and your platform's security best practices documentation.

View the Security Fix

Check out the pull request that fixed this vulnerability

View PR #73

Related Articles

Medium severity

Command Injection in Firejail's netfilter.c: How Environment Variables Can Lead to Root Compromise

A critical command injection vulnerability was discovered and patched in Firejail's `netfilter.c`, where attacker-controlled environment variables could be used to inject shell metacharacters into a command string executed with elevated privileges. This type of vulnerability is particularly dangerous in security-focused tools like Firejail, which often run with root or elevated permissions, potentially allowing a local attacker to achieve full system compromise.

Medium severity

Integer Overflow to Heap Corruption: Fixing a Critical q3asm Vulnerability

A critical integer overflow vulnerability in the Quake 3 assembler tool (q3asm) allowed attackers to craft malicious assembly source files that triggered heap corruption through a size calculation wraparound, potentially enabling function pointer hijacking and full supply-chain compromise in CI/CD pipelines. The fix introduces proper bounds checking and overflow-safe allocation size calculations, closing a dangerous attack vector that could have given adversaries elevated pipeline privileges.

Medium severity

Fixing NULL Pointer Dereference in eMMC Memory Allocation

A high-severity NULL pointer dereference vulnerability was discovered and fixed in embedded eMMC storage handling code, where unchecked `malloc` and `calloc` return values could allow an attacker with a crafted eMMC image to crash the host process. The fix adds proper NULL checks after every memory allocation, preventing exploitation through maliciously oversized partition size fields.