Retrieve all accessible databases via Notion API using JavaScript

Getting List of Database IDs from Notion API

I need help getting all database IDs that my integration can access through the Notion API. Ideally, I would like a response that looks like this:

{
  object: 'list',
  results: [
    {
      object: 'block',
      id: 'b92g7037-8e3c-54dd-854e-gd3578b716c5',
      created_time: '2022-02-12T18:18:00.000Z',
      last_edited_time: '2022-02-15T16:29:00.000Z',
      has_children: false,
      archived: false,
      type: 'child_database',
      child_database: { title: 'Records' }
    }
  ],
  next_cursor: null,
  has_more: false
}

In reality, I only need the IDs and titles of the databases. Everything was working perfectly until users started reorganizing content, which led to databases being nested within other blocks.

I’ve attempted using the search API with certain filters, but the output is overwhelming, with unnecessary data such as parent pages and individual database rows:

const { Client } = require('@notionhq/client');
const notion = new Client({ auth: process.env.NOTION_API_KEY }); // integration token

const fetchAllDatabases = async () => {
  console.log('Fetching databases...');
  const dbResults = await notion.search({
    filter: {
      value: 'database',
      property: 'object'
    }
  });

  console.log('Results:');
  console.dir(dbResults, { depth: null });
}

The results provide excessive information, including all properties and rows of the databases. My objective is simply to create an array like this: [database_id_1, database_id_2].

Additionally, I tried to recursively check through the page blocks:

const fetchBlockChildren = async (pageId) => {
  const blockList = await notion.blocks.children.list({
    block_id: pageId
  });
  return blockList;
}

const extractDatabases = async (pageId) => {
  let dbIds = [];

  const blockList = await fetchBlockChildren(pageId);

  blockList.results.forEach( async (block) => {
    if (block.has_children === true) {
      const nestedDatabases = await extractDatabases(block.id);
      dbIds = [...dbIds, ...nestedDatabases];
    }
    if (['child_database', 'database'].includes(block.type)) {
      dbIds.push(block.id)
    }
  });

  return dbIds;
}

However, I encountered challenges with async/await when using forEach loops. I attempted to implement Promise.all with map/filter, but ran into problems with mismatched array sizes during recursion.

What would be the best method to obtain a concise list of all database IDs accessible to my app?

Your recursive approach is on the right track, but there’s an easier fix for that async forEach problem. Just use Promise.all with proper array handling:

const extractDatabases = async (pageId) => {
  const blockList = await fetchBlockChildren(pageId);
  const dbIds = [];
  
  const nestedPromises = blockList.results.map(async (block) => {
    const currentIds = [];
    
    if (['child_database', 'database'].includes(block.type)) {
      currentIds.push(block.id);
    }
    
    if (block.has_children) {
      const nested = await extractDatabases(block.id);
      currentIds.push(...nested);
    }
    
    return currentIds;
  });
  
  const results = await Promise.all(nestedPromises);
  return results.flat();
};

This runs everything in parallel instead of sequentially, so it's way faster than a for...of loop when you're dealing with deeply nested pages. I've used this exact pattern migrating big Notion workspaces, and it handles the recursion without the timing issues you're hitting.
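One caveat: Notion's API does rate-limit integrations (on the order of a few requests per second), so in a big workspace fully parallel blocks.children.list calls can start hitting 429 responses. If that happens, one rough, dependency-free option is to process child blocks in small batches instead of all at once. This is just a sketch of that idea built on the fetchBlockChildren helper from the question; BATCH_SIZE is an arbitrary number I picked, not anything from the Notion docs:

const BATCH_SIZE = 3; // arbitrary cap on concurrent calls per level

const extractDatabasesBatched = async (pageId) => {
  const blockList = await fetchBlockChildren(pageId);
  const dbIds = [];

  // Walk the children in small chunks so we never fire too many requests at once
  for (let i = 0; i < blockList.results.length; i += BATCH_SIZE) {
    const batch = blockList.results.slice(i, i + BATCH_SIZE);
    const batchResults = await Promise.all(batch.map(async (block) => {
      const ids = [];
      if (block.type === 'child_database') ids.push(block.id);
      if (block.has_children) ids.push(...await extractDatabasesBatched(block.id));
      return ids;
    }));
    dbIds.push(...batchResults.flat());
  }

  return dbIds;
};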

The search API approach works well, but you're overcomplicating it. I've built this before - keep it simple. Don't grab extra data you won't use, just filter for what you need. Your original search code was almost there: just map the results to their IDs (sketch below) and you'll get your database ID array. It works with nested databases too, since search returns every database shared with your integration, no matter how deep it's buried.
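A minimal sketch of that, reusing the notion client and search call from the question (the function name here is just mine):

const fetchAllDatabaseIds = async () => {
  const dbResults = await notion.search({
    filter: { value: 'database', property: 'object' }
  });

  // Keep just the IDs - titles, properties, and parent info get dropped
  return dbResults.results.map(db => db.id); // ['database_id_1', 'database_id_2', ...]
};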

Your problem is mixing forEach with async operations. forEach never awaits its callbacks, so extractDatabases returns dbIds before any of the recursive calls have finished pushing into it, and your recursion breaks.

I ran into this exact issue building a Notion workspace analyzer. Switching to for...of fixed it since it actually handles async operations in sequence:

const extractDatabases = async (pageId) => {
  let dbIds = [];
  const blockList = await fetchBlockChildren(pageId);

  for (const block of blockList.results) {
    if (['child_database', 'database'].includes(block.type)) {
      dbIds.push(block.id);
    }
    if (block.has_children) {
      const nestedDatabases = await extractDatabases(block.id);
      dbIds = [...dbIds, ...nestedDatabases];
    }
  }

  return dbIds;
}

This keeps everything in order and waits for each recursive call to finish. Just run this function on each top-level page your integration can access, then merge all the results.
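If you don't already have those top-level page IDs on hand, one way to collect them is to search for pages and keep the ones whose parent is the workspace itself. A sketch, assuming your top-level content lives directly in the workspace (and ignoring search pagination for brevity):

const collectAllDatabaseIds = async () => {
  // Pages the integration can see; keep only the ones parented by the workspace
  const pages = await notion.search({
    filter: { property: 'object', value: 'page' }
  });
  const topLevelIds = pages.results
    .filter(page => page.parent?.type === 'workspace')
    .map(page => page.id);

  // Run the recursive extractor over each top-level page and merge the results
  const perPage = await Promise.all(topLevelIds.map(id => extractDatabases(id)));
  return [...new Set(perPage.flat())]; // de-duplicate, just in case
};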

Yeah, this is super common with nested Notion structures. Skip the recursive approach - there’s a way cleaner solution.

I’ve built similar integrations for scanning entire workspaces. Trust me, manual recursion gets messy quick, especially with deep nesting or rate limits.

What actually works: set up an automated workflow that discovers all databases for you. Configure it to scan your workspace periodically, grab every database ID no matter how deeply nested, and keep a clean, auto-updating list when people move stuff around.

The automation handles async complexity, rate limiting, data formatting - everything. It’ll filter archived databases, sort by modified date, plus you get error handling and retries baked in.

Saved me tons of debugging time and made everything way more reliable. Runs in background, keeps your list current without touching it.

Latenode makes this Notion automation pretty straightforward. You can build the whole database scanning workflow visually and it handles the API stuff.

Been there! You're on the right track with the search API, just overthinking it. Skip the block recursion and filter out what you don't need:

const databases = await notion.search({
  filter: { property: 'object', value: 'database' },
  page_size: 100
});

const dbIds = databases.results
  .filter(db => !db.archived)
  .map(db => db.id);

Grabs all databases at any nesting level without the async mess.

The search API is your best bet, but you’ve got to handle pagination right or you’ll miss databases spread across multiple pages. I made this mistake last year - wasn’t checking has_more and databases just disappeared.

Here’s what actually worked:

const getAllDatabases = async () => {
  let allDatabases = [];
  let hasMore = true;
  let startCursor = undefined;

  // Keep paging until the API reports there are no more results
  while (hasMore) {
    const response = await notion.search({
      filter: { property: 'object', value: 'database' },
      start_cursor: startCursor,
      page_size: 100
    });

    allDatabases = [...allDatabases, ...response.results];
    hasMore = response.has_more;
    startCursor = response.next_cursor;
  }

  // Drop archived databases and keep just the ID plus a readable title
  return allDatabases
    .filter(db => !db.archived)
    .map(db => ({ id: db.id, title: db.title[0]?.plain_text || 'Untitled' }));
};

This grabs databases no matter how deep they’re nested and gives you both ID and title. Way cleaner than trying to recursively dig through every block in your workspace.
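And if all you want is the bare ID array from the question, it's one more map on top of that - a quick usage sketch:

// Inside any async function:
const databases = await getAllDatabases();
const dbIds = databases.map(db => db.id); // ['database_id_1', 'database_id_2', ...]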

Everyone’s giving you code fixes, but honestly? You’re building something that needs to run reliably and handle changes when users reorganize their workspaces.

I’ve dealt with this exact scenario multiple times. Started with manual scripts, then users would move databases around and everything broke. Rate limits hit. API changes happened. Async bugs crept in.

What actually solved it was setting up a proper automation pipeline. It monitors your Notion workspace, discovers all databases automatically, handles pagination and rate limiting, and keeps your database list updated whenever people reorganize content.

No more debugging async forEach loops or worrying about missing databases on page 3 of search results. The automation runs on schedule, catches all databases regardless of nesting, and gives you clean JSON output with just the IDs and titles you need.

Plus it handles error cases like network timeouts or API rate limits without your main app breaking. Way more reliable than running these searches manually every time.

Latenode has Notion integrations built in, so you can set up this whole database discovery workflow without writing the pagination and async handling yourself.