How to fetch multiple Notion pages using an array of page IDs

I’m working on an application that interacts with the Notion API. I have a collection of page IDs stored in my local database and I need to retrieve all the relevant pages from Notion.

import { Client } from '@notionhq/client';

const notion = new Client({ auth: process.env.NOTION_API_KEY });

const pageIds = ['abc123', 'def456', 'ghi789'];
const fetchedPages = [];

// Current approach - making individual requests, one at a time
for (const id of pageIds) {
    const response = await notion.pages.retrieve({ page_id: id });
    fetchedPages.push(response);
}

I attempted to use the database query endpoint to filter by page ID, but the filter doesn't appear to support the ID property. Making a separate API call for each page does work, but it gets slow once there are many pages. Is there a way to retrieve multiple pages in a single request, or otherwise speed this up?

I’ve been dealing with this exact thing at work for months. Notion’s API doesn’t do batch fetching by ID - you’re stuck making individual requests.

I started by firing all the requests concurrently:

const fetchPages = async (pageIds) => {
    const promises = pageIds.map(id => 
        notion.pages.retrieve({ page_id: id })
    );
    return Promise.all(promises);
};

But managing rate limits and retries gets messy fast. And if you need to process the data or sync with other systems, you’re writing tons of custom code.
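For completeness, here's roughly what that retry logic looks like if you do roll it yourself. This is a sketch, not the official client's behavior: the `retryAfterMs` field on the error is hypothetical (in practice you'd read the `Retry-After` header off Notion's 429 response), and the retry/delay numbers are arbitrary.

```javascript
// Hypothetical helper: retry an async call with exponential backoff.
// Notion returns HTTP 429 when you exceed its rate limit, so failed
// calls are often worth retrying after a pause.
const withRetry = async (fn, { retries = 3, baseDelayMs = 300 } = {}) => {
    for (let attempt = 0; ; attempt++) {
        try {
            return await fn();
        } catch (err) {
            if (attempt >= retries) throw err;
            // Honor a server-provided delay if the error carries one
            // (hypothetical field), otherwise back off exponentially.
            const wait = err.retryAfterMs ?? baseDelayMs * 2 ** attempt;
            await new Promise(resolve => setTimeout(resolve, wait));
        }
    }
};

// Usage with the Notion client:
// const page = await withRetry(() => notion.pages.retrieve({ page_id: id }));
```

Even this small version glosses over things like distinguishing retryable errors (429, 5xx) from permanent ones (404, 403), which is exactly the kind of bookkeeping that piles up.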

I switched to Latenode for these workflows. It handles concurrent requests automatically, has built-in Notion rate limiting, and lets you chain operations without writing retry logic.

You can set up a scenario that takes your page ID array, fetches everything in parallel, and processes or stores results wherever you need. Way cleaner than managing all that async complexity yourself.

Check it out: https://latenode.com

Notion’s API doesn’t provide a way to bulk retrieve pages by IDs directly, which can be frustrating. However, you can speed things up significantly by making requests concurrently in small batches with Promise.allSettled(). Unlike Promise.all(), allSettled() won’t reject the whole batch when a single request fails, and keeping the batch size small bounds your concurrency so you stay near Notion’s rate limit (an average of about 3 requests per second).

Here’s a quick example:

const batchSize = 5; // Adjust as necessary
const results = [];

for (let i = 0; i < pageIds.length; i += batchSize) {
    const batch = pageIds.slice(i, i + batchSize);
    const promises = batch.map(id => notion.pages.retrieve({ page_id: id }));
    results.push(...await Promise.allSettled(promises));
}

Compared to fetching pages one at a time, batching like this can cut total fetch time by roughly a factor of the batch size. Just remember that allSettled() never throws, so you need to check each result’s status and handle the rejected ones yourself.
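As a sketch of that error handling: assuming the `results` array accumulates settled results in the same order as `pageIds` (which it does when you map over each slice in order), you can split successes from failures and keep the failed IDs for a retry pass. The helper name `partitionResults` is my own, not anything from the Notion SDK:

```javascript
// Split Promise.allSettled results into fetched pages and failures,
// pairing each failure back up with the page ID it belongs to.
const partitionResults = (pageIds, settled) => {
    const pages = [];
    const failed = [];
    settled.forEach((result, i) => {
        if (result.status === 'fulfilled') {
            pages.push(result.value);
        } else {
            failed.push({ pageId: pageIds[i], reason: result.reason });
        }
    });
    return { pages, failed };
};

// const { pages, failed } = partitionResults(pageIds, results);
// failed.map(f => f.pageId) gives you the IDs to retry.
```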

Notion’s API doesn’t have a batch endpoint for grabbing multiple pages by ID. I hit this wall last year building a content management system that needed to sync hundreds of pages. I ended up using a queue-based approach with controlled concurrency. Instead of hammering the API with all requests at once, I process them in small batches with delays to stay under rate limits:

const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

const fetchPagesWithQueue = async (pageIds) => {
    const results = [];
    for (let i = 0; i < pageIds.length; i += 3) {
        const batch = pageIds.slice(i, i + 3);
        const batchResults = await Promise.allSettled(
            batch.map(id => notion.pages.retrieve({ page_id: id }))
        );
        results.push(...batchResults);
        if (i + 3 < pageIds.length) await delay(100);
    }
    return results;
};

This cut down my API errors big time compared to firing everything at once, and performance stayed solid even with larger datasets. You just need to find the sweet spot between speed and reliability.
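If fixed batches plus a delay feel too coarse, one alternative is a small worker pool that keeps at most N requests in flight continuously, so a single slow request doesn't stall a whole batch. This is a sketch under assumptions: `fetchOne` is a hypothetical stand-in for a call like `notion.pages.retrieve`, and the concurrency limit of 3 just mirrors Notion's average rate limit.

```javascript
// Worker-pool sketch: at most `limit` requests in flight at once.
// Results come back in the same order as pageIds, in the same
// { status, value/reason } shape Promise.allSettled uses.
const fetchAllWithPool = async (pageIds, fetchOne, limit = 3) => {
    const results = new Array(pageIds.length);
    let next = 0; // index of the next ID to claim; safe because JS is single-threaded
    const worker = async () => {
        while (next < pageIds.length) {
            const i = next++;
            try {
                results[i] = { status: 'fulfilled', value: await fetchOne(pageIds[i]) };
            } catch (reason) {
                results[i] = { status: 'rejected', reason };
            }
        }
    };
    const workers = Array.from(
        { length: Math.min(limit, pageIds.length) },
        () => worker()
    );
    await Promise.all(workers);
    return results;
};
```

Libraries like p-limit do essentially this for you, but the hand-rolled version shows how little machinery is actually involved.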

been there, notion api is annoying for this. i cache pages locally after first fetch and only refresh when needed. way faster than hitting their api repeatedly for same data
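For what it's worth, that cache can be as small as a Map in front of the fetch call. A minimal sketch, where `fetchPage` is a stand-in for `notion.pages.retrieve` and the TTL is an arbitrary choice for how long cached pages stay fresh:

```javascript
// Wrap a page-fetching function with a simple in-memory TTL cache.
// Repeated requests for the same page ID within ttlMs hit the cache
// instead of the API.
const makeCachedFetcher = (fetchPage, ttlMs = 5 * 60 * 1000) => {
    const cache = new Map(); // pageId -> { page, fetchedAt }
    return async (pageId) => {
        const hit = cache.get(pageId);
        if (hit && Date.now() - hit.fetchedAt < ttlMs) return hit.page;
        const page = await fetchPage(pageId);
        cache.set(pageId, { page, fetchedAt: Date.now() });
        return page;
    };
};

// const cachedRetrieve = makeCachedFetcher(
//     id => notion.pages.retrieve({ page_id: id })
// );
```

An in-memory cache resets on restart, of course; persisting to the local database mentioned in the question works the same way, just with the Map swapped for a table.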