How to optimize multiple API requests in JavaScript without hitting rate limits

I’m working on a project where I need to make several API calls in sequence. My main concern is avoiding rate limit issues while keeping the code efficient. I’ve tried using a simple loop but I’m worried about overwhelming the API server.

Here’s my current approach that needs improvement:

if (requestData.type === "photo") {
  for (let index = 0; index < requestData.items.length; index++) {
    try {
      const config = {
        method: "POST",
        headers: {
          accept: "application/json",
          "content-type": "application/json",
          authorization: "Bearer my-token-here",
        },
        body: JSON.stringify({
          file: `${requestData.items[index].file}`,
        }),
      };
      
      const result = await fetch(
        `https://api.example.com/upload/photo?description=${encodeURIComponent(
          requestData.items[index].description
        )}&recipient=${userId}`,
        config
      );
      const responseData = await result.json();
      console.log(`Photo ${index + 1} uploaded:`, responseData);
      
      // waitTime() adds a delay between requests (helper defined elsewhere).
      await waitTime();
    } catch (err) {
      console.error(`Failed to upload photo ${index + 1}:`, err);
    }
  }
  return { status: "Upload completed" };
}

What’s the most effective way to handle multiple API requests while preventing rate limit violations? I need every request to be attempted and the overall process to finish, regardless of each individual response status.

Been there with rate limiting - it’s a pain. Use a token bucket approach: give yourself a request budget per time window, and build a simple rate limiter class that counts requests and resets when the window rolls over, based on the API’s published limits. Most APIs let you burst up to a threshold, then enforce a sustained rate. Your waitTime() call helps, but you really want dynamic delays based on what the API actually tells you: watch for X-RateLimit-Remaining and X-RateLimit-Reset in the response headers, and when the quota is running low, bump up the delay automatically. Way better than guessing, and way better than hitting 429 errors that force you to wait even longer.
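
Rough sketch of the idea - the budget, window, header names, and thresholds below are only examples (different APIs use different header names and semantics), so check your API’s docs:

class RateLimiter {
  constructor({ budget = 30, windowMs = 60_000, minDelayMs = 250 } = {}) {
    this.budget = budget;       // max requests allowed per window (example value)
    this.windowMs = windowMs;   // window length in ms (example value)
    this.minDelayMs = minDelayMs;
    this.used = 0;
    this.windowStart = Date.now();
    this.extraDelayMs = 0;      // bumped up when the API reports low quota
  }

  // Call before each request: reset the budget when the window rolls over,
  // and wait out the rest of the window if the local budget is spent.
  async acquire() {
    if (Date.now() - this.windowStart >= this.windowMs) {
      this.windowStart = Date.now();
      this.used = 0;
    }
    if (this.used >= this.budget) {
      const waitMs = this.windowMs - (Date.now() - this.windowStart);
      await new Promise((resolve) => setTimeout(resolve, Math.max(waitMs, 0)));
      this.windowStart = Date.now();
      this.used = 0;
    }
    this.used += 1;
    await new Promise((resolve) =>
      setTimeout(resolve, this.minDelayMs + this.extraDelayMs)
    );
  }

  // Call after each response: adjust the delay from the rate-limit headers.
  observe(response) {
    const remaining = Number(response.headers.get("x-ratelimit-remaining"));
    // Assumes seconds until reset; some APIs send a Unix timestamp instead.
    const reset = Number(response.headers.get("x-ratelimit-reset"));
    if (response.status === 429 && Number.isFinite(reset)) {
      this.extraDelayMs = reset * 1000; // hard limit hit: wait out the window
    } else if (Number.isFinite(remaining) && remaining < 5) {
      this.extraDelayMs = 2000;         // quota running low: slow down
    } else {
      this.extraDelayMs = 0;
    }
  }
}

// Usage inside your loop (url and config built the same way as in your snippet):
const limiter = new RateLimiter();
for (const item of requestData.items) {
  await limiter.acquire();
  const result = await fetch(url, config);
  limiter.observe(result);
}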

Use a queue with promise throttling - p-limit works well for this. Set it to 2-3 concurrent requests and add delays between batches. Check whether the API sends a Retry-After header when you hit limits - most decent APIs tell you exactly how long to wait.
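
Quick sketch, assuming p-limit is installed (npm i p-limit); the upload call just mirrors the fetch from your question:

import pLimit from "p-limit";

// At most 2 uploads in flight at any time.
const limit = pLimit(2);

async function uploadPhoto(item) {
  const response = await fetch(
    `https://api.example.com/upload/photo?description=${encodeURIComponent(
      item.description
    )}&recipient=${userId}`,
    {
      method: "POST",
      headers: {
        accept: "application/json",
        "content-type": "application/json",
        authorization: "Bearer my-token-here",
      },
      body: JSON.stringify({ file: item.file }),
    }
  );

  // If the API answers 429, wait as long as it asks and try again.
  if (response.status === 429) {
    const retryAfter = Number(response.headers.get("retry-after")) || 5;
    await new Promise((resolve) => setTimeout(resolve, retryAfter * 1000));
    return uploadPhoto(item); // naive retry - cap the attempts in real code
  }

  return response.json();
}

// allSettled keeps going even when individual uploads fail.
const results = await Promise.allSettled(
  requestData.items.map((item) => limit(() => uploadPhoto(item)))
);
console.log(results);

The nice part is that the queue never fires more than two requests at once no matter how many items you map over, so the delay logic stays simple.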

Use exponential backoff to handle rate limits. Start with a short delay after a failure, then double it on each retry until you reach a maximum. When you get a 429, honor the Retry-After header if the API sends one, so your request rate adapts to what the server actually allows. Batch your requests in groups of 3-5 with Promise.allSettled - it keeps the whole run going even when individual requests fail, and you can add small delays between batches. Don’t forget to check the response headers for rate limit info so you can tweak your timing.
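
A rough sketch of both pieces together - the retry count, base delay, batch size, and pause are only example numbers:

// Retry a request with exponential backoff, honoring Retry-After on 429s.
async function fetchWithBackoff(url, options, { retries = 4, baseDelayMs = 500 } = {}) {
  let response;
  for (let attempt = 0; attempt <= retries; attempt++) {
    response = await fetch(url, options);
    if (response.status !== 429) return response;
    if (attempt === retries) break;

    // Prefer the server's hint; otherwise double the delay on each attempt.
    const retryAfter = Number(response.headers.get("retry-after"));
    const delayMs =
      Number.isFinite(retryAfter) && retryAfter > 0
        ? retryAfter * 1000
        : baseDelayMs * 2 ** attempt;
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  return response; // still rate limited after all retries; let the caller decide
}

// Process items in small batches with Promise.allSettled, pausing between batches.
async function processInBatches(items, handler, { batchSize = 4, pauseMs = 1000 } = {}) {
  const results = [];
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    results.push(...(await Promise.allSettled(batch.map(handler))));
    if (i + batchSize < items.length) {
      await new Promise((resolve) => setTimeout(resolve, pauseMs));
    }
  }
  return results;
}

Each entry in the returned array has a status of "fulfilled" or "rejected", so afterwards you can collect just the rejected items and retry them in a second pass.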