
Mastering Concurrent API Calls: A Deep Dive into Batched Fetching in JavaScript

Credit: https://bit.ly/40AWbX5

When building modern web applications, interacting with APIs is a daily task.

Often, you might find yourself needing to fetch data from numerous endpoints.

While simply firing off all requests at once with Promise.all() might seem like a quick solution, it can lead to several issues:

  • API Rate Limits: Many APIs restrict the number of requests you can make within a certain timeframe. Hitting these limits can result in errors and temporary bans.
  • Resource Consumption: Too many concurrent network requests can overwhelm the browser or server, leading to sluggish performance, excessive memory use, or even crashes.
  • Network Congestion: Flooding the network with requests can cause delays and reduce overall efficiency.

This is where batched fetching comes in handy.

By processing your API calls in controlled groups, you can manage resources, respect rate limits, and ensure a smoother user experience.

In this blog post, we’ll dissect a powerful JavaScript async function designed for this very purpose: fetchInBatches.

The fetchInBatches Function: A Closer Look

Here’s the JavaScript function we’ll be exploring:

async function fetchInBatches(urls, batchSize = 5) {
  // 1. Initialize an array to store all results
  const results = [];

  // 2. Loop through the URLs array in chunks (batches)
  for (let i = 0; i < urls.length; i += batchSize) {
    // 3. Extract the current batch of URLs
    const batch = urls.slice(i, i + batchSize);

    // 4. Create an array of Promises for fetching each URL in the current batch
    const batchPromises = batch.map(url =>
      fetch(url).then(res => {
        // Check if the response is OK (status in the 200-299 range)
        if (!res.ok) {
          // If not OK, throw an error for better error handling later
          throw new Error(`HTTP error! status: ${res.status} for URL: ${url}`);
        }
        return res.json(); // Parse the response body as JSON
      })
    );

    // 5. Wait for all Promises in the current batch to resolve
    try {
      const batchResults = await Promise.all(batchPromises);
      // 6. Add the results of the current batch to the main results array
      results.push(...batchResults);
    } catch (error) {
      // 7. Handle any errors that occurred within the current batch
      console.error("Error fetching a batch:", error);
      // Depending on your application's needs, you might:
      // - Log the error and continue
      // - Push a placeholder/error object into `results` for the failed requests
      // - Implement retry logic
      // - Stop further processing (e.g., `break;`)
    }
  }

  // 8. Return all collected results
  return results;
}

Step-by-Step Breakdown

Let’s break down each numbered section of the code to understand its role.

1. Function Signature and Initialization

async function fetchInBatches(urls, batchSize = 5) {
  const results = [];
  // ...
}

  • async function fetchInBatches(urls, batchSize = 5):
    • The async keyword is essential. It tells JavaScript that this function will perform asynchronous operations and allows us to use the await keyword inside it. The function itself will implicitly return a Promise.
    • urls: This parameter expects an array of strings, where each string is a URL to an API endpoint.
    • batchSize = 5: This is a default parameter. If you call fetchInBatches without providing a batchSize, it will automatically use 5. This value determines how many fetch requests will run concurrently in each batch.
  • const results = [];: An empty array is initialized to store the data returned from all the successful API calls across all batches.
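The default parameter and the batch math can be checked without any network calls. Here is a small sketch (the `batchCount` helper is ours, purely for illustration — it is not part of fetchInBatches):

```javascript
// How many batches will a given urls array produce? (illustrative helper)
function batchCount(urls, batchSize = 5) {
  return Math.ceil(urls.length / batchSize);
}

const twelveUrls = new Array(12).fill('https://example.com/item');
console.log(batchCount(twelveUrls));    // default batchSize of 5 → 3 batches
console.log(batchCount(twelveUrls, 6)); // explicit batchSize of 6 → 2 batches
```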

2. Iterating Through URLs in Batches

for (let i = 0; i < urls.length; i += batchSize) {
  // ...
}

  • This for loop is the core of the batching mechanism.
  • let i = 0: The loop starts at the beginning of the urls array.
  • i < urls.length: The loop continues as long as i is within the bounds of the urls array.
  • i += batchSize: After each iteration (i.e., after processing one batch), the i variable is incremented by batchSize. This ensures that the loop jumps to the start of the next batch of URLs.

3. Extracting the Current Batch

const batch = urls.slice(i, i + batchSize);

  • Inside the loop, urls.slice(i, i + batchSize) is used to extract a subset of URLs for the current batch.
  • The slice() method creates a new array containing elements from the i index (inclusive) up to, but not including, i + batchSize.
  • If i + batchSize goes beyond the end of the urls array, slice() gracefully handles it by just including the remaining elements.
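The loop and slice() can be tried together on a plain array. A quick sketch with seven placeholder URLs and a batch size of 3 shows the final, shorter batch:

```javascript
// Seven placeholder URLs, batches of 3: slice() never reads past the end.
const sampleUrls = ['u1', 'u2', 'u3', 'u4', 'u5', 'u6', 'u7'];
const batches = [];
for (let i = 0; i < sampleUrls.length; i += 3) {
  batches.push(sampleUrls.slice(i, i + 3));
}
console.log(batches);
// → [ [ 'u1', 'u2', 'u3' ], [ 'u4', 'u5', 'u6' ], [ 'u7' ] ]
```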

4. Creating Promises for the Batch

const batchPromises = batch.map(url =>
  fetch(url).then(res => {
    if (!res.ok) {
      throw new Error(`HTTP error! status: ${res.status} for URL: ${url}`);
    }
    return res.json();
  })
);

  • batch.map(...): The map() method transforms each url in the batch array into a Promise.
  • fetch(url): This initiates the actual network request for each URL. It returns a Promise that resolves to a Response object.
  • .then(res => { ... }): This chain handles the Response object once the fetch Promise resolves.
    • if (!res.ok): It’s crucial to check res.ok. The fetch API’s Promise only rejects for network errors (e.g., no internet connection). It does not reject for HTTP error status codes (like 404 Not Found or 500 Internal Server Error). res.ok is a boolean that is true for successful HTTP status codes (200-299) and false otherwise. If res.ok is false, we explicitly throw new Error() to ensure our Promise.all (and subsequent try...catch) can properly detect and handle these API-level errors.
    • return res.json(): This parses the body of the Response object as JSON. This method also returns a Promise, which resolves with the parsed JavaScript object.
  • The batchPromises array will now hold a collection of Promise objects, each representing the eventual JSON data from a URL in the current batch.
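You can exercise this mapping without a real network by standing in for fetch. In the sketch below, fakeFetch and its URLs are invented for illustration; the stub just mimics the Response shape (ok, status, json()) that the code relies on:

```javascript
// A stand-in for fetch that mimics the Response shape the code relies on.
const fakeFetch = (url) =>
  Promise.resolve({
    ok: !url.endsWith('/missing'),
    status: url.endsWith('/missing') ? 404 : 200,
    json: () => Promise.resolve({ url }),
  });

const batch = ['/api/a', '/api/b'];
const batchPromises = batch.map(url =>
  fakeFetch(url).then(res => {
    if (!res.ok) {
      throw new Error(`HTTP error! status: ${res.status} for URL: ${url}`);
    }
    return res.json();
  })
);

Promise.all(batchPromises).then(data => console.log(data));
// → [ { url: '/api/a' }, { url: '/api/b' } ]
```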

5. Waiting for Batch Completion

try {
  const batchResults = await Promise.all(batchPromises);
  // ...
} catch (error) {
  // ...
}

  • await Promise.all(batchPromises): This is where the concurrent execution within a batch happens.
    • Promise.all() takes an array of Promises (batchPromises in this case).
    • It returns a single Promise that will:
      • Resolve with an array of all the resolved values from batchPromises (in the same order) only when all of them have successfully resolved.
      • Reject immediately with the reason of the first Promise that rejects.
    • The await keyword pauses the execution of the fetchInBatches function until this Promise.all Promise settles. This means the next batch won’t start until the current one is entirely complete.
  • try...catch: This block is vital for robust error handling. If any fetch request within the batchPromises fails (either a network error or an HTTP error that we explicitly threw), Promise.all will reject, and the catch block will execute, allowing you to log the error or handle it as needed without crashing the entire process.
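Because Promise.all() rejects as soon as any one request fails, the other results in that batch are discarded. If you want every request's individual outcome instead, Promise.allSettled() is a drop-in alternative. A minimal sketch, using synthetic promises in place of real fetches (the collectSuccesses name is ours):

```javascript
// allSettled never rejects: each entry reports 'fulfilled' or 'rejected'.
async function collectSuccesses(promises) {
  const settled = await Promise.allSettled(promises);
  return settled
    .filter(r => r.status === 'fulfilled')
    .map(r => r.value);
}

const fakeBatch = [
  Promise.resolve({ id: 1 }),
  Promise.reject(new Error('HTTP error! status: 404')),
  Promise.resolve({ id: 3 }),
];

collectSuccesses(fakeBatch).then(ok => console.log(ok));
// → [ { id: 1 }, { id: 3 } ]
```

The trade-off: with allSettled you keep partial results from a failed batch, but you must inspect each entry's status yourself rather than relying on a single try...catch.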

6. Aggregating Results

results.push(...batchResults);

  • ...batchResults: This is the spread syntax. It takes all the individual elements from the batchResults array (the successfully fetched JSON data from the current batch) and “spreads” them as separate arguments to the push() method.
  • This effectively adds all the data from the current batch into the main results array, building up the complete collection of fetched data.
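The spread is easy to see in isolation:

```javascript
const results = [{ id: 1 }, { id: 2 }];      // from earlier batches
const batchResults = [{ id: 3 }, { id: 4 }]; // from the current batch
results.push(...batchResults);               // same as results.push({ id: 3 }, { id: 4 })
console.log(results.length); // → 4
```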

7. Returning All Collected Data

return results;

  • Once the for loop has completed iterating through all the batches, the function returns the results array, which now contains all the successfully fetched and parsed JSON data from all the URLs.

Example Usage

To use this function, you simply provide an array of URLs and, optionally, your desired batchSize.

// Imagine these are your actual API endpoints
const apiUrls = [
  'https://jsonplaceholder.typicode.com/todos/1',
  'https://jsonplaceholder.typicode.com/todos/2',
  'https://jsonplaceholder.typicode.com/todos/3',
  'https://jsonplaceholder.typicode.com/todos/4',
  'https://jsonplaceholder.typicode.com/todos/5',
  'https://jsonplaceholder.typicode.com/todos/6',
  'https://jsonplaceholder.typicode.com/todos/7',
  'https://jsonplaceholder.typicode.com/todos/8',
  'https://jsonplaceholder.typicode.com/todos/9',
  'https://jsonplaceholder.typicode.com/todos/10'
];

async function runBatchedFetches() {
  console.log("Starting batched fetches...");
  // Fetch 3 URLs at a time
  const allData = await fetchInBatches(apiUrls, 3);
  console.log("All data fetched:", allData);
  console.log("Total items:", allData.length);
}

runBatchedFetches();

When you run runBatchedFetches(), you’ll see the requests happening in groups of three, with a slight pause between each group as await Promise.all waits for the current batch to complete.
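Some APIs limit requests per second, not just concurrency, so you may also want a fixed pause between batches. The variant below is a sketch of that idea — fetchInBatchesWithDelay, sleep, and delayMs are our own illustrative names, not part of the original function:

```javascript
// Resolve after ms milliseconds; used to pause between batches.
const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));

async function fetchInBatchesWithDelay(urls, batchSize = 5, delayMs = 500) {
  const results = [];
  for (let i = 0; i < urls.length; i += batchSize) {
    const batch = urls.slice(i, i + batchSize);
    const batchResults = await Promise.all(
      batch.map(url =>
        fetch(url).then(res => {
          if (!res.ok) {
            throw new Error(`HTTP error! status: ${res.status} for URL: ${url}`);
          }
          return res.json();
        })
      )
    );
    results.push(...batchResults);
    // Pause before the next batch, but not after the last one.
    if (i + batchSize < urls.length) {
      await sleep(delayMs);
    }
  }
  return results;
}
```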

Benefits of Batched Fetching

  • Rate Limit Compliance: Prevents you from overwhelming APIs that have strict rate limits.
  • Improved Performance: Reduces the strain on the client’s network and CPU by limiting concurrent requests.
  • Better Error Handling: Allows for more granular error handling. If one request in a batch fails, you can catch it without necessarily stopping all subsequent batches (depending on your catch block implementation).
  • Resource Management: More efficient use of network connections and system resources.
  • Predictable Behavior: Provides a more controlled and predictable flow for large sets of asynchronous operations.

Conclusion

The fetchInBatches function is a practical and essential pattern for any JavaScript developer dealing with numerous API calls.

By leveraging async/await, Promise.all(), and Array.prototype.slice(), you can build more powerful applications.

Incorporate this technique into your toolkit to manage your asynchronous operations like a pro!
