Experiencing a 500 error when querying Notion database with certain parameters

I’m running into problems using the Notion API to pull data from my database. Requests for 50 or fewer items work correctly, but I get errors in these cases:

  • If I set the page_size to 51 or more, I get a 500 error.
  • Using filters leads to the same error.
  • Sorting parameters cause the request to fail too.
  • Trying to paginate with next_cursor doesn’t work.
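Roughly, the options I’m passing look like this (property names and the database ID are placeholders, and `buildQuery` is just a helper I wrote to show the variants I tried against `notion.databases.query`):

```javascript
// Sketch of the request variants that fail; property names are made up.
// The returned object is what gets passed to notion.databases.query(...).
function buildQuery(databaseId, { pageSize, withFilter, withSorts, cursor } = {}) {
  const query = { database_id: databaseId };
  if (pageSize) query.page_size = pageSize;   // 500 error once this exceeds 50
  if (withFilter) {
    query.filter = { property: "Status", select: { equals: "Done" } }; // 500 error
  }
  if (withSorts) {
    query.sorts = [{ property: "Name", direction: "ascending" }];      // 500 error
  }
  if (cursor) query.start_cursor = cursor;    // 500 error when paginating
  return query;
}
```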

Here is the error message I receive:

@notionhq/client warn: request fail {
  code: 'internal_server_error',
  message: 'Unexpected error occurred.'
}

APIResponseError: Unexpected error occurred.
    at buildRequestError (/path/to/project/node_modules/@notionhq/client/build/src/errors.js:162:16)
    at Client.request (/path/to/project/node_modules/@notionhq/client/build/src/Client.js:347:54)
    at async loadData (/path/to/project/app.js:23:20) {
  code: 'internal_server_error',
  status: 500,
  headers: Headers {
    [Symbol(map)]: [Object: null prototype] {
      date: [ 'Fri, 23 Jun 2023 03:55:04 GMT' ],
      'content-type': [ 'application/json; charset=utf-8' ],
      connection: [ 'close' ],
      server: [ 'cloudflare' ]
    }
  }
}

I would like to figure out how to fetch more than 50 items from the database. Has anyone else faced a similar problem?

Hey, that 500 error might be caused by some corrupted data in your database. Try pulling smaller batches, like 20 or 30 records, and see if that works. Once you locate the bad entries, you can fix them. Good luck!
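A rough sketch of that small-batch scan, if it helps - `queryFn` here is whatever wraps the real `notion.databases.query` call (injected so the logic stands on its own), and `findBadBatch` just reports which window of rows blows up:

```javascript
// Walk the database in small pages and report the first page that errors,
// to narrow down where the bad entries live. `queryFn` is an async function
// that takes Notion query options and returns the query response.
async function findBadBatch(queryFn, databaseId, pageSize = 20) {
  let cursor = undefined;
  let pageIndex = 0;
  while (true) {
    try {
      const res = await queryFn({
        database_id: databaseId,
        page_size: pageSize,
        ...(cursor ? { start_cursor: cursor } : {}),
      });
      if (!res.has_more) return { badPage: null, pagesScanned: pageIndex + 1 };
      cursor = res.next_cursor;
      pageIndex += 1;
    } catch (err) {
      // This window of `pageSize` rows contains whatever breaks the query.
      return { badPage: pageIndex, pagesScanned: pageIndex + 1 };
    }
  }
}
```

Call it as `findBadBatch((opts) => notion.databases.query(opts), databaseId)` and the `badPage` index tells you which slice of 20 records to inspect by hand.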

Check your database permissions first - this caught me off guard with similar errors. Your integration token might have read access but lack permissions for specific properties or page types. Go to the database settings and make sure the integration has full access to everything you’re querying. Also check for formula properties that reference external databases - these caused internal server errors for me when paginating past the first batch. I fixed most of my 500 errors by excluding complex computed fields from the initial queries and fetching them separately. The API docs don’t mention this limitation clearly, but it’s definitely real.
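If memory serves, `databases.query` accepts a `filter_properties` array of property IDs (not names), and `pages.properties.retrieve` can pull a single heavy property afterwards. A rough sketch of that two-phase fetch - all IDs below are placeholders:

```javascript
// Phase 1: query only lightweight properties, skipping formulas/rollups.
// `notion` is an @notionhq/client Client instance; property IDs are placeholders.
async function fetchLight(notion, databaseId, lightPropertyIds) {
  return notion.databases.query({
    database_id: databaseId,
    filter_properties: lightPropertyIds, // leave computed fields out of this list
    page_size: 50,
  });
}

// Phase 2: fetch an expensive property for one page, only when needed.
async function fetchHeavyProperty(notion, pageId, propertyId) {
  return notion.pages.properties.retrieve({
    page_id: pageId,
    property_id: propertyId,
  });
}
```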

Had the same issue a few months ago - turned out to be a timeout on Notion’s side, not my data. My database got pretty big and complex queries were maxing out their processing limits. Fixed it with a retry mechanism using exponential backoff. Same request would often work on the second or third try. Check if you’ve got rich text properties or relations that are making your queries too heavy. I switched to fetching basic properties first, then making separate calls for the heavy stuff when I actually need it.
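The retry wrapper I ended up with looks roughly like this - it’s generic, so you can wrap any call, e.g. `withRetry(() => notion.databases.query({...}))`. Retry counts and delays are just the values that worked for me:

```javascript
// Retry an async call with exponential backoff, only for 5xx-style errors.
// Waits baseMs * 2^attempt between tries; non-server errors are rethrown
// immediately so bad requests don't get retried pointlessly.
async function withRetry(fn, { retries = 3, baseMs = 500 } = {}) {
  let lastErr;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      const status = err.status ?? 0;
      if (status < 500 || attempt === retries) throw err;
      await new Promise((resolve) => setTimeout(resolve, baseMs * 2 ** attempt));
    }
  }
  throw lastErr;
}
```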

This exact scenario is why I ditched API wrestling and went with automation platforms.

Notion’s API throws these random 500 errors constantly, especially when you need reliable data pulls. Half the time it’s not your code or bad data - their servers are just having a bad day.

I handle all my Notion stuff through Latenode now instead of building custom retry logic or debugging timeouts. It’s got built-in error handling for API hiccups and handles pagination automatically - no more cursor headaches or batch size math.

Best part? Set up workflows that grab your data, transform it however you want, and push it anywhere. Done with rate limits and mystery 500 errors.

I used to waste hours debugging this crap. Now I build the workflow once and forget about it.

Check it out: https://latenode.com