I am still getting my head around Node’s async model. The specific problem I was tackling was calling a paged API endpoint. In most programming languages, you’d call the first page, wait for the data to come back, then call the second, and so on. It turns out there is a better way to do this in Node. Start by defining a few initial variables:
const fetch = require("node-fetch");

let currentPage = 1;
const queryUrl = 'http://local.dev/api.php';
let data = [];
Then, create an array of all the URLs you want to call. This means you need to know the number of pages in advance, which you can find out by making a single call first (see the sketch after the next snippet). For simplicity, I’ll assume the number of pages is already known:
let urls = [];
while (currentPage <= 5) {
  urls.push(queryUrl + '?page=' + currentPage);
  currentPage++;
}
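If you’d rather not hard-code the page count, that single initial call can provide it. Here is a minimal sketch, reusing fetch and queryUrl from above and assuming the API returns the total number of pages in a totalPages field of its JSON response (that field name is an invented example, not something the API above is known to provide):

// Sketch: discover the page count with one initial request.
// Assumes a hypothetical totalPages field in the JSON response.
async function buildUrls() {
  const firstResponse = await fetch(queryUrl + '?page=1');
  const firstBody = await firstResponse.json();
  const urls = [];
  for (let page = 1; page <= firstBody.totalPages; page++) {
    urls.push(queryUrl + '?page=' + page);
  }
  return urls;
}

You would then call buildUrls() from inside an async function and use the returned array in place of the hard-coded one.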
Now fire off the calls. Since map returns an array of promises, all the requests start immediately; nothing waits for an individual call to finish:
// Each async callback starts its fetch right away; map returns
// an array of promises, one per URL.
const promises = urls.map(async url => {
  try {
    console.log("Querying " + url);
    const response = await fetch(url);
    return response.json();
  } catch (err) {
    console.log(err);
  }
});
Finally, assemble the data returned by each promise:
// Wrap in an async IIFE, since top-level await isn't available in CommonJS.
// Awaiting the promises in array order keeps the pages in page order.
(async () => {
  for (const promise of promises) {
    data = data.concat(await promise);
  }
})();
Once that loop finishes, the data array contains all of the results in page order, no matter in which order the API calls actually completed, because the promises are awaited in the order they were created. Neat.
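As an aside, Promise.all gives the same ordering guarantee in one step: it resolves to an array of results in the same order as the input promises, regardless of which request finishes first. A minimal sketch, reusing the promises array from above:

// Promise.all preserves input order, regardless of completion order.
Promise.all(promises).then(pages => {
  // Each element of pages is one page's parsed JSON.
  const allData = [].concat(...pages);
  console.log(allData.length + ' records fetched');
});

Note that because the try/catch inside the map callback swallows errors, a failed request would show up here as an undefined entry rather than as a rejection of the whole Promise.all.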