How to fetch multiple Notion pages using an array of page IDs

I’m working on an app that uses the Notion API. I have an array of page IDs stored in my local database and I need to fetch all the corresponding pages from Notion.

I tried using the database query endpoint to filter pages by their IDs, but it looks like you can’t filter by the ID property. The only way I can think of is to call the single page retrieval endpoint multiple times, but that seems really inefficient.

Has anyone found a better approach to get multiple pages at once? Is there some batch endpoint or workaround I’m missing?

Here’s what I’m currently doing:

import { Client } from '@notionhq/client';
const notion = new Client({ auth: process.env.NOTION_TOKEN });

const pageIds = ['abc123', 'def456', 'ghi789'];
const fetchedPages = [];

// One API call per page ID, executed sequentially
for (const id of pageIds) {
  const response = await notion.pages.retrieve({ page_id: id });
  fetchedPages.push(response);
}

This works but makes too many API calls. Any suggestions for a more efficient solution?

Yeah, there's no batch endpoint for that unfortunately. Your approach is pretty standard, though. You could use Promise.all() to run those calls concurrently instead of one by one - it'll be way faster!
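For example (a rough sketch, assuming notion is the same initialized @notionhq/client instance as in your snippet):

// Fire all the retrieve calls at once and wait for them all to resolve
const fetchedPages = await Promise.all(
  pageIds.map((id) => notion.pages.retrieve({ page_id: id }))
);

Keep in mind Promise.all rejects as soon as any single call fails, so Promise.allSettled may be a better fit if partial results are acceptable.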

Yeah, it seems like individual calls are the only way. Maybe add a retry wrapper since Notion's API can be a bit flaky. Also, think about whether you really need all those pages at once - lazy loading could save you some hassle, depending on your app!
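Something like this as a starting point (just a sketch - withRetry is a made-up helper and the attempt/delay numbers are arbitrary):

// Retry an async call a few times with exponential backoff before giving up
async function withRetry(fn, attempts = 3, delayMs = 500) {
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt === attempts) throw err;
      // Double the wait after each failed attempt
      await new Promise((resolve) => setTimeout(resolve, delayMs * 2 ** (attempt - 1)));
    }
  }
}

// Wrap each individual page retrieval
const page = await withRetry(() => notion.pages.retrieve({ page_id: 'abc123' }));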

I’ve faced a similar issue before, and you’re right that individual page retrieval is typically the only route for fetching pages with different IDs across various databases. What I found helpful was implementing a queuing system to manage requests more effectively. This way, I could batch them while also introducing slight delays to play nicely with the API’s rate limits. Additionally, maintaining a local cache of the page metadata allowed me to minimize unnecessary requests, as I would only refetch when the modification timestamps indicated a change. Lastly, think about optimizing your data flow; sometimes, you can pull relevant pages during other processes rather than executing bulk requests. The API’s rate limits are generally accommodating unless you’re working with a very high volume of pages.
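A minimal sketch of the queuing idea, assuming the same notion client as above (the delay value is arbitrary - Notion documents an average limit of roughly three requests per second):

// Process page IDs one at a time with a small pause between requests
// to stay comfortably under Notion's rate limit
async function fetchPagesQueued(ids, delayMs = 350) {
  const pages = [];
  for (const id of ids) {
    pages.push(await notion.pages.retrieve({ page_id: id }));
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  return pages;
}

A cache check keyed on each page's last_edited_time would slot in right before the retrieve call.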

Hit this exact problem building our content management pipeline. We had hundreds of Notion pages across different databases that needed regular syncing.

Yeah, Promise.all and caching work, but managing error handling, rate limiting, and retries gets messy quick. You’ll still need to build your own queuing system and handle webhook updates for real-time sync.

Ended up using Latenode for the whole flow. Native Notion connectors handle all the API stuff automatically. Feed it your page IDs and it batches requests while respecting rate limits.

Best part - you can set triggers to auto-fetch updated pages when they change instead of constantly polling. Added logic to detect which pages actually need updating based on modification times.

30 minutes to build vs weeks maintaining custom code. Error handling and retries are built in.

Had this exact problem last year building a documentation aggregator. There's no batch retrieval endpoint for arbitrary page IDs - the database query only works if all your pages are in the same database, which doesn't sound like your situation.

I used Promise.all() with rate limiting to avoid hitting API limits. Chunk your requests into 5-10 concurrent calls instead of firing them all at once. I also cached pages on my end using the last_edited_time property to skip re-fetching unchanged pages, which cut my API usage way down for pages that don't update much.
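In case it helps, here's roughly what the chunked version looked like (the chunk size and helper names are just examples):

// Split the IDs into groups and fetch each group concurrently,
// so at most `concurrency` requests are in flight at a time
function chunk(array, size) {
  const chunks = [];
  for (let i = 0; i < array.length; i += size) {
    chunks.push(array.slice(i, i + size));
  }
  return chunks;
}

async function fetchInChunks(pageIds, concurrency = 5) {
  const pages = [];
  for (const group of chunk(pageIds, concurrency)) {
    const results = await Promise.all(
      group.map((id) => notion.pages.retrieve({ page_id: id }))
    );
    pages.push(...results);
  }
  return pages;
}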