Photo by https://unsplash.com/@abebarrera

Asynchronous JavaScript: Sequential, Parallel, and Combined Processing

Control asynchronous flow with sequential and parallel operations in JavaScript using async/await and more

Tuesday, Mar 9, 2021




Event Loop (Quick Synopsis)

The browser and Node.js run a constant single-threaded event loop to execute your JavaScript code. It first runs all of the synchronous, non-blocking code while queueing up any asynchronous events to be executed at a later point in time.

Tasks such as setTimeout/setInterval callbacks and network requests are handled outside the main thread. Once they finish, the event loop queues the resulting callback either as a microtask (promise callbacks), run at the end of the current event-loop turn, or as a macrotask (timer callbacks), run at the beginning of a subsequent turn.
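A quick way to see that ordering in action: promise callbacks (microtasks) run before timer callbacks (macrotasks), even when the timer delay is 0 ms.

```javascript
console.log('sync 1');

// macrotask: queued for the next event-loop turn, even with a 0 ms delay
setTimeout(() => console.log('macrotask: setTimeout'), 0);
// microtask: queued for the end of the current turn
Promise.resolve().then(() => console.log('microtask: promise'));

console.log('sync 2');

/* Outputs */
// 'sync 1'
// 'sync 2'
// 'microtask: promise'
// 'macrotask: setTimeout'
```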

Fantastic talk on the event loop by Jake Archibald

Sequential

Use Cases:

Sequential processing is incredibly handy for asynchronous tasks that need to run in a particular order. You might have data required for one API request that depends on the response of another.

Fetch data from your API, then fetch data from another API.
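For instance, fetching a user's posts can't start until the user request returns an id. A minimal sketch of that dependency, where getUser and getPosts are hypothetical stand-ins for real network requests:

```javascript
// Hypothetical stand-ins for real API calls
const getUser = async (id) => ({ id, name: 'Ada' });
const getPosts = async (userId) => [`post-1-of-${userId}`, `post-2-of-${userId}`];

(async () => {
    const user = await getUser(42);        // must resolve first...
    const posts = await getPosts(user.id); // ...because this call needs user.id
    console.log(posts); // [ 'post-1-of-42', 'post-2-of-42' ]
})();
```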


A delay function to mock an asynchronous API call:

const apiCall = (ms) => new Promise((res) => setTimeout(res, ms));
(async () => {
 
    console.log('Api Call 1')
    await apiCall(1000);
 
    console.log('Api Call 2')
    await apiCall(1000);
 
    console.log('Api Call 3')
    await apiCall(1000);
 
    console.log('done!')
 
    /* Outputs */
    // 'Api Call 1'
    // 'Api Call 2'
    // 'Api Call 3'
    // 'done!'
})();

A for...of loop to sequentially process an apiCall for each emoji in the array:

(async () => {
 
    const items = ['šŸ„•', 'šŸš€', 'šŸŽ', 'šŸŒ', 'šŸ¶'];
    for (const item of items) {
        console.log(item)
        await apiCall(2000);
        console.log(`${item} done`)
    }
 
})();

Outputs

šŸ„•
šŸ„• done
šŸš€
šŸš€ done
šŸŽ
šŸŽ done
šŸŒ
šŸŒ done
šŸ¶
šŸ¶ done

Alternatively, you can use a reduce function to control the sequential order:

(async () => {
 
    await ['šŸ„•', 'šŸš€', 'šŸŽ', 'šŸŒ', 'šŸ¶'].reduce(async (prev, cur) => {
        // await previous promise, 
        // if remove this line, will run in parallel
        await prev;
        //
        console.log(cur)
        await apiCall(2000);
        console.log(`${cur} done`)
        return;
    }, Promise.resolve());
 
})();

Parallel

Use Cases:

If your promises/async tasks have zero dependencies on one another, and you only care that all of them resolve before moving on to other asynchronous tasks, run them in parallel.

(async () => {
    // 
    await Promise.all(['šŸ„•', 'šŸš€', 'šŸŽ', 'šŸŒ', 'šŸ¶'].map(async (item) => {
        console.log(item)
        await apiCall(2000);
        console.log(`${item} done`);
        return;
    }));
 
})()

Outputs

šŸ„•
šŸš€
šŸŽ
šŸŒ
šŸ¶
šŸ„• done
šŸš€ done
šŸŽ done
šŸŒ done
šŸ¶ done

Alternatively, you can reuse the earlier reduce function and simply remove the await on the previously returned promise:

(async () => {
 
    await ['šŸ„•', 'šŸš€', 'šŸŽ', 'šŸŒ', 'šŸ¶'].reduce(async (prev, cur) => {
        // await prev;
        console.log(cur)
        await apiCall(2000);
        console.log(`${cur} done`)
        return;
    }, Promise.resolve());
 
})();

While the tasks run in parallel, you can iterate over the results with for await...of, which awaits each promise in array order as it resolves.

(async () => {
 
    const tasks = ['šŸ„•', 'šŸš€', 'šŸŽ', 'šŸŒ', 'šŸ¶'].map(async (item) => {
        console.log(item)
        await apiCall(2000);
        console.log(`${item} done`);
        return item;
    });
 
    // awaits each task in array order as it resolves
    for await (const i of tasks) {
        console.log(`for of ${i}`)
    }
 
})()

Outputs

šŸ„•
šŸš€
šŸŽ
šŸŒ
šŸ¶
šŸ„• done
for of šŸ„•
šŸš€ done
for of šŸš€
šŸŽ done
for of šŸŽ
šŸŒ done
for of šŸŒ
šŸ¶ done
for of šŸ¶

Combined

Scenario:

For either performance or API concurrency limitations, you can't afford to process an unbounded number of requests in parallel.

Additionally, you don't want to wait for every single async request to resolve one after the other (sequentially), which could take a really long time.

Solution: Combine both sequential and parallel processing

First Implementation

For this attempt we're going to use a generator function to split the collection into chunks (inner arrays) of the size we pass in.

Generator Helper

function * chunkGen(collection, size=2, i=0) {
  for (; i < collection.length; i += size) {
    yield collection.slice(i, i + size);
  }
}
 
function chunk(collection, size=1) {
  const chunked = [];
  const gen = chunkGen(collection, size);
  let c = gen.next();
  while (!c.done) {
    chunked.push(c.value);
    c = gen.next();
  }
  return chunked;
}
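A quick sanity check of the helper (definitions repeated here so the snippet runs on its own). As an aside, the while loop in chunk can be replaced by spreading the generator, which is equivalent and shorter:

```javascript
function* chunkGen(collection, size = 2, i = 0) {
    for (; i < collection.length; i += size) {
        yield collection.slice(i, i + size);
    }
}

// spreading the generator is an equivalent, shorter version of chunk
const chunk = (collection, size = 1) => [...chunkGen(collection, size)];

console.log(chunk(['šŸ„•', 'šŸš€', 'šŸŽ', 'šŸŒ', 'šŸ¶'], 2));
// [ [ 'šŸ„•', 'šŸš€' ], [ 'šŸŽ', 'šŸŒ' ], [ 'šŸ¶' ] ]
```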

This will loop over 20 requests, limiting the number of concurrently running async tasks to 5. That way we avoid taxing performance or hitting API rate limits.

...
// helper to create some randomness in our apiCall calls
const delay = () => Math.floor(Math.random() * 5) + 1;
 
(async () => {
 
    const requestsToProcess = ['šŸ„•', 'šŸš€', 'šŸŽ', 'šŸŒ', 'šŸ¶', 'šŸ³', 'šŸ‹',
    'šŸŠ', 'šŸ‰', 'šŸ', 'šŸ¦§', 'šŸ¦’', 'šŸ…', 'šŸ²', 'šŸ„©', 'šŸ¦„', 'šŸˆ', 'šŸ’µ', 
    'šŸ’', 'šŸø'];
    const batches = chunk(requestsToProcess, 5);
    for (const batch of batches) {
        // queue each chunk of promises in parallel
        await Promise.all(batch.map(async (i) => {
            console.log(i)
            await apiCall(delay() * 1000)
            console.log(`done ${i}`)
        }))
    }
 
})();

Outputs

šŸ„•
šŸš€
šŸŽ
šŸŒ
šŸ¶
done šŸ„•
done šŸŒ
done šŸŽ
done šŸ¶
done šŸš€
šŸ³
šŸ‹
šŸŠ
šŸ‰
šŸ
done šŸ‰
done šŸ‹
done šŸ³
done šŸ
done šŸŠ
šŸ¦§
šŸ¦’
šŸ…
šŸ²
šŸ„©
done šŸ¦§
done šŸ¦’
done šŸ„©
done šŸ…
done šŸ²
šŸ¦„
šŸˆ
šŸ’µ
šŸ’
šŸø
done šŸ’
done šŸ¦„
done šŸˆ
done šŸ’µ
done šŸø

So this solves the scenario mentioned. It runs all the promises within a single batch in parallel but processes all of the batches sequentially.

You might have noticed the issue with this approach.

Only once ALL of the promises within a batch resolve will it move on to the next batch. What if 90% of a batch is done processing and one promise has yet to resolve? We could queue up additional promises as soon as resources free up, decreasing the overall time it takes to process the entire collection of async tasks.
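To illustrate the idea, here's a minimal hand-rolled sketch of that pattern (my own illustration, not the module's actual implementation): start `limit` workers that each keep pulling the next item until the queue is empty, so a slot frees up the moment any single promise resolves.

```javascript
async function parallelLimit(items, limit, worker) {
    let next = 0; // shared cursor; safe because JS callbacks run single-threaded
    const runners = Array.from(
        { length: Math.min(limit, items.length) },
        async () => {
            while (next < items.length) {
                const i = next++; // claim the next item before awaiting
                await worker(items[i], i);
            }
        }
    );
    await Promise.all(runners);
}
```

Each runner loops independently, so a slow item only holds up its own worker while the other slots keep draining the queue.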

Final Implementation

For the final solution we're going to take advantage of a promise task queue. We won't cover how to build one from scratch here, but feel free to look through the source code in the repo, async-parallel-limit

We're going to use this very small npm module I recently published for fun.

This was inspired by asyncTimesLimit by the async project. Definitely worth taking a look at their docs and features.

async-parallel-limit

npm i @mjyocca/async-parallel-limit

So instead of the previous approach of waiting for all the promises in a batch to resolve before queueing up a new batch, this will queue another async task as soon as each promise resolves.

import asyncParallel from '@mjyocca/async-parallel-limit';
 
(async () => {
    const requestsToProcess = ['šŸ„•', 'šŸš€', 'šŸŽ', 'šŸŒ', 'šŸ¶', 'šŸ³', 'šŸ‹',
    'šŸŠ', 'šŸ‰', 'šŸ', 'šŸ¦§', 'šŸ¦’', 'šŸ…', 'šŸ²', 'šŸ„©', 'šŸ„—', 'šŸˆ', 'šŸ’µ', 
    'šŸ’', 'šŸø', 'šŸŖ²', 'šŸ•ø', 'šŸ—', 'šŸŒ•', 'šŸš', 'šŸ„', 'šŸŖØ', 'šŸŒ¹', 'šŸ‚',
    'šŸŒ²', 'ā›ˆ', 'šŸ‡', 'šŸŗ', 'šŸ¦ž', 'šŸ¦„'];
    // limiting to a total of 5 at any given time
    await asyncParallel(requestsToProcess, 5, async (n, emoji, next) => {
        console.log(emoji)
        await apiCall(delay() * 1000)
        // done
        console.log(`done ${emoji}`)
        next();
    })
})();

Outputs

šŸ„•
šŸš€
šŸŽ
šŸŒ
šŸ¶
done šŸ¶
šŸ³
done šŸ„•
šŸ‹
done šŸŽ
šŸŠ
done šŸš€
šŸ‰
done šŸŒ
šŸ
done šŸ‹
šŸ¦§
done šŸ
šŸ¦’
done šŸ‰
šŸ…
done šŸŠ
šŸ²
done šŸ³
šŸ„©
done šŸ¦’
šŸ¦„
done šŸ¦§
šŸˆ
done šŸ…
šŸ’µ
done šŸ²
šŸ’
done šŸ’µ
šŸø
done šŸˆ
done šŸ„©
done šŸ¦„
done šŸø
done šŸ’

As you can see from the output, it doesn't wait for all 5 to finish before queueing up more promises, making it a whole lot faster to complete the entire queue.



Hope this helps and cheers!