The Best Way To Learn JS Async and Concurrency (Isn’t What You Think)

// How sending 10,000 requests taught me more than 10 tutorials ever did.

Most web developers "learn" async by reading await and moving on.

You call an API. It works. You feel smart.

But when it's time to scale - 1,000 users, 10,000 requests, constant background jobs - everything collapses.

Your code starts to feel like magic that you can't control.
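Concretely, most of us only ever reach for one of two extremes. A rough sketch (fetchUser and userIds are hypothetical stand-ins, and both snippets would live inside an async function):

// Extreme 1: one at a time - safe, but painfully slow once you have thousands of calls.
for (const id of userIds) {
  await fetchUser(id);
}

// Extreme 2: everything at once - fast, until 10,000 in-flight requests overwhelm something.
await Promise.all(userIds.map((id) => fetchUser(id)));

Neither extreme gives you control. That's the gap this post is about.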

The only way I truly understood async and concurrency was by building a tool that needed to control it.

Not a todo app.

A stress tester - something that tried to push Node.js to its limits.

👣 Step 1 - Build the CLI Interface

Before we stress anything, we need a script that takes user input:

node stress.js http://localhost:3000 100 5

That means:

  • http://localhost:3000 is the target URL

  • 100 total requests

  • 5 running at once (concurrency)

Here's the code to set that up:

const http = require("http");
const https = require("https");
const { URL } = require("url");

const args = process.argv.slice(2);

if (args.length < 2) {
  console.error("Usage: node stress.js <url> <totalRequests> [concurrency]");
  process.exit(1);
}

const targetUrl = new URL(args[0]);
const totalRequests = parseInt(args[1], 10);
const concurrency = parseInt(args[2], 10) || 1;

console.log(`Target: ${targetUrl.href}`);
console.log(`Total Requests: ${totalRequests}`);
console.log(`Concurrency: ${concurrency}`);

Step 2 - Fire a Single Request (And Time It)

Let's prove that we can send one HTTP request and track how long it takes.

We wrap it in a Promise so we can later use await or .then():

function fireOnce() {
  return new Promise((resolve) => {
    const lib = targetUrl.protocol === "https:" ? https : http;
    const start = Date.now();

    const req = lib.request(targetUrl, (res) => {
      res.on("data", () => {}); // required or 'end' won't fire
      res.on("end", () => {
        const duration = Date.now() - start;
        resolve({ status: res.statusCode, duration, error: false });
      });
    });

    req.on("error", (err) => {
      const duration = Date.now() - start;
      resolve({ status: null, duration, error: true, message: err.message });
    });

    req.end();
  });
}

💡 Why this matters:

We're not just sending requests.

We're tracking the pressure we're putting on the system.
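To sanity-check this step on its own, you can call fireOnce() directly at the bottom of the file (a throwaway check, not part of the final script):

// Quick manual check: send one request and print its timing.
fireOnce().then((result) => {
  console.log(`Status: ${result.status}, error: ${result.error}, took ${result.duration} ms`);
});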

Step 3 - Build the Concurrency Engine

Next, we'll build the engine that sends many requests at once - without losing control.

async function runLoadTest() {
  let inFlight = 0;   // requests currently running
  let launched = 0;   // requests started so far
  let completed = 0;  // requests finished so far
  const results = [];

  return new Promise((resolve) => {
    function launchNext() {
      if (completed >= totalRequests) {
        return resolve(results);
      }

      // Stop if we're at the concurrency limit or everything has already been launched
      if (inFlight >= concurrency || launched >= totalRequests) {
        return;
      }

      inFlight++;
      launched++;
      fireOnce().then((result) => {
        inFlight--;
        completed++;
        results.push(result);
        process.stdout.write(`\rCompleted: ${completed}/${totalRequests}`);
        launchNext();
      });

      launchNext();
    }

    for (let i = 0; i < concurrency && i < totalRequests; i++) {
      launchNext();
    }
  });
}

Step 4 - Report the Results

Once all requests are done, let’s actually summarize the impact.

We collect:

  • ✅ How many succeeded

  • ❌ How many failed

  • ⏱ The average response time

Here’s the final reporting logic:

runLoadTest().then((results) => {
  const total = results.length;
  const failures = results.filter(r => r.error).length;
  const successes = total - failures;
  const avg = Math.round(results.reduce((sum, r) => sum + r.duration, 0) / total);

  console.log('\n\n--- Report ---');
  console.log(`✅ Successes: ${successes}`);
  console.log(`❌ Failures: ${failures}`);
  console.log(`📊 Avg Response Time: ${avg} ms`);
});

This tells us not just that “it worked” - but how well the system held under pressure.

Now we’re not just writing code.

We’re observing system behavior.
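If you want a slightly richer picture, the same results array can tell you more than an average. For example, inside that same .then() callback you could also log the slowest request and a breakdown by status code (an optional extension, not part of the original script):

// Optional extras, computed inside the same .then((results) => { ... }) block.
const slowest = Math.max(...results.map((r) => r.duration));
const byStatus = {};
for (const r of results) {
  const key = r.error ? "error" : String(r.status);
  byStatus[key] = (byStatus[key] || 0) + 1;
}
console.log(`🐢 Slowest Request: ${slowest} ms`);
console.log("📋 Status Breakdown:", byStatus);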

Step 5 - Prove That Concurrency Actually Works

Async is easy to fake. Concurrency is not.

Let’s make sure this stress tester isn’t just sending requests - it’s sending them at the same time.

Here's the same code, now with output logs so you can see the overlap. If concurrency is real, several [START] lines should appear before the first [END] does - requests running at the same time, not queued one after another.

async function fireOnce(id) {
  const start = Date.now();
  console.log(`[START] ${id} @ ${start}`);

  return new Promise((resolve) => {
    const lib = targetUrl.protocol === "https:" ? https : http;

    const req = lib.request(targetUrl, (res) => {
      res.on("data", () => {});
      res.on("end", () => {
        const end = Date.now();
        console.log(`[END]   ${id} @ ${end} (Duration: ${end - start}ms)`);
        resolve({
          status: res.statusCode,
          duration: end - start,
          error: false,
        });
      });
    });

    req.on("error", (err) => {
      const end = Date.now();
      console.log(`[FAIL]  ${id} @ ${end} (${err.message})`);
      resolve({
        status: null,
        duration: end - start,
        error: true,
        message: err.message,
      });
    });

    req.end();
  });
}

async function runLoadTest() {
  let inFlight = 0;
  let completed = 0;
  const results = [];
  let nextId = 1;

  return new Promise((resolve) => {
    function launchNext() {
      if (completed >= totalRequests) {
        return resolve(results);
      }

      if (inFlight >= concurrency || nextId > totalRequests) {
        return;
      }

      inFlight++;
      const id = nextId++;
      fireOnce(id).then((result) => {
        inFlight--;
        completed++;
        results.push(result);
        process.stdout.write(`\rCompleted: ${completed}/${totalRequests}`);
        launchNext();
      });

      launchNext();
    }

    for (let i = 0; i < concurrency && i < totalRequests; i++) {
      launchNext();
    }
  });
}

runLoadTest().then((report) => {
  console.log(report);
  process.exit(0);
});

Summary

We don't use threads. We don't use libraries. We just orchestrate async logic - and here's how it all stays under control:

  1. The for loop at the bottom doesn’t send all the requests. It just kicks off the first few - up to your concurrency limit.

  2. Each launchNext() call:

  • Checks if there's room to fire another request

  • Increments inFlight to reserve a slot

  • Starts the request (non-blocking)

  • Immediately tries to fill more (second launchNext())

  3. When a request finishes:

  • We decrement inFlight

  • Increment completed

  • Trigger launchNext() again to refill the open slot

  4. This forms a self-regulating loop:

  • Requests go out only when there's room

  • The system never exceeds concurrency

  • Pressure is steady, not bursty

The end result:

  • A lightweight engine that launches exactly the right number of concurrent HTTP requests

  • Fully async, no blocking

  • No overshoot, no stall

🔁 This is concurrency without parallelism - a system that juggles tasks, not threads.
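The same self-regulating pattern works for any async task, not just HTTP. Here's the idea distilled into a standalone helper - a sketch for illustration only; mapWithConcurrency, fetchUser, and userIds are made-up names, not part of stress.js:

// A generic concurrency limiter built on the same launchNext idea.
// (Illustrative only - assumes task never rejects, like fireOnce;
// add a .catch for real-world use.)
function mapWithConcurrency(items, limit, task) {
  return new Promise((resolve) => {
    const results = new Array(items.length);
    let inFlight = 0;
    let nextIndex = 0;
    let completed = 0;

    function launchNext() {
      if (completed >= items.length) {
        return resolve(results);
      }
      if (inFlight >= limit || nextIndex >= items.length) {
        return;
      }

      const index = nextIndex++;
      inFlight++;

      Promise.resolve(task(items[index], index)).then((value) => {
        results[index] = value;
        inFlight--;
        completed++;
        launchNext();
      });

      launchNext();
    }

    launchNext();
  });
}

// Example: fetch 50 user records, never more than 5 at a time.
// mapWithConcurrency(userIds, 5, (id) => fetchUser(id)).then(console.log);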

🧘 Conclusion - This Isn't a Production Tool. It’s Something Better.

Let’s be clear:

This isn’t a high-precision, distributed load tester. It doesn’t handle retries, timeouts, warm-up phases, or real throughput analytics.

It’s not built for production. It’s built for understanding.
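That said, if you ever want a taste of one of those missing pieces, a per-request timeout is a small addition. A sketch, placed inside fireOnce() right before req.end() (the 10-second limit is arbitrary):

// Inside fireOnce(), before req.end():
// abort any request that hangs longer than 10 seconds.
// Destroying with an error triggers the existing req.on("error") handler,
// so the request is counted as a failure instead of hanging forever.
req.setTimeout(10000, () => {
  req.destroy(new Error("Request timed out"));
});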

If you want to:

  • Sharpen how you think about async

  • See what happens when systems are under pressure

  • Build confidence managing concurrency manually

Then this little script will do more for you than any tutorial ever could.

You don’t need to learn everything. You need to build just enough to realize what’s actually going on.

And once you see it, you start to think like an engineer - not just a developer.

🛠 Ready To Become a Real Fullstack Engineer?

You don’t need more frameworks. You need to build systems that hold under pressure.

If this kind of raw, systems-first engineering speaks to you, I’m building a path to senior fullstack grounded in:

  • Minimalist backend engineering

  • Systems thinking

  • Developer tools + internal MVPs

  • Async mastery through building, not watching

👉 Fill out this form or just reply and let’s talk. I only take on 1–2 developers at a time.

🚀 Need a Coder Who Gets Leverage?

If you’re a founder, operator, or indie hacker and need help with:

  • Coding your MVP fast

  • Building async tools, scrapers, bots, dashboards

  • Delegating your dev stack like a one-person team

👉 Fill out this form or just reply and let’s talk. I only take on 1–2 projects at a time.