How to handle pagination when querying Notion database API?

I’m working with the Notion API to fetch data from a database that has more than 100 records. I’m using a POST request to query the database and trying to implement pagination using the start_cursor parameter.

curl -X POST 'https://api.notion.com/v1/databases/DATABASE_ID/query' \
  -H 'Authorization: Bearer TOKEN' \
  -H 'Notion-Version: 2021-05-13' \
  -H 'Content-Type: application/json' \
  --data '{
    "filter": {},
    "start_cursor": "CURSOR_VALUE_FROM_PREVIOUS_CALL"
  }' > database_results.json

The problem I’m facing is that even when I include the start_cursor from the previous response, I keep getting the same first page of results instead of the next page. What’s the correct way to structure this request to get paginated results from a Notion database?

Check that you're sending the exact same headers and parameters on every call. I hit this same issue while building a migration tool - I wasn't consistent with the Notion-Version header, which made the API behave as if each request was starting pagination from scratch. Also double-check that your database permissions didn't change between calls; an auth problem partway through can produce confusing results. As a sanity check, make your first request without any cursor, then immediately use the next_cursor from that response for the second call. That tells you whether it's a cursor problem or something else in your request setup.
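One concrete failure mode worth ruling out: sending a `start_cursor` key with an empty or null value on the first call. A minimal Python sketch of building the request body so the key is only present when you actually have a cursor (the `build_query_body` helper name is mine for illustration, not part of any Notion SDK):

```python
import json

def build_query_body(cursor=None, page_size=100):
    """Build the JSON body for a Notion database query.

    Omits start_cursor entirely on the first call -- including the
    key with an empty or null value is a common way to keep getting
    page 1 back. (Illustrative helper, not a Notion SDK function.)
    """
    body = {"page_size": page_size}
    if cursor:  # only include the key when we have a real cursor
        body["start_cursor"] = cursor
    return body

# First call: no start_cursor key at all.
first = build_query_body()
# Second call: pass next_cursor exactly as returned by the API.
second = build_query_body(cursor="cursor-from-previous-response")

print(json.dumps(first))
print(json.dumps(second))
```

You'd then POST each body to the query endpoint with your usual headers; the point is just that the first and second request bodies differ only in the cursor.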

I've hit this pagination nightmare tons of times. The problem? You're manually handling cursor logic, which is easy to get wrong.

Notion’s pagination is a pain - you loop through responses, grab the next_cursor, and deal with edge cases like empty results or broken cursors. Then there’s rate limits and API failures on top of that.

Skip debugging curl commands. I automated the whole thing with a workflow that handles the pagination loop - grabs the first page, checks has_more, pulls the cursor, keeps going until it gets everything. No copying cursors by hand, no missing pages.
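The loop described above is also straightforward to write yourself. A minimal sketch with the HTTP call abstracted behind a `fetch_page` callable (which you'd implement with your HTTP library of choice - the stub below just stands in for real API responses):

```python
def fetch_all(fetch_page):
    """Collect every result from a paginated Notion-style query.

    fetch_page(cursor) must return a dict shaped like Notion's
    query response: {"results": [...], "has_more": bool,
    "next_cursor": str or None}.
    """
    results = []
    cursor = None
    while True:
        page = fetch_page(cursor)
        results.extend(page["results"])
        if not page["has_more"] or not page.get("next_cursor"):
            break
        cursor = page["next_cursor"]  # feed it back unchanged
    return results

# Stub standing in for a real HTTP call, to show the flow:
pages = {
    None: {"results": [1, 2], "has_more": True, "next_cursor": "c1"},
    "c1": {"results": [3], "has_more": False, "next_cursor": None},
}
print(fetch_all(lambda cursor: pages[cursor]))  # → [1, 2, 3]
```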

It also retries when the API acts up and outputs however you want - JSON, CSV, or pushes data wherever you need it.

Saves me hours every time I pull large datasets from Notion. Way better than manual debugging.

Try adding an explicit page_size parameter to your request - Notion defaults to 100, and being explicit rules out one variable. Also double-check you're using the exact next_cursor value from the JSON response, not a value that got mangled while copying from terminal output.

same thing happened to me last week. check your cursor value - make sure it doesn’t have extra quotes or whitespace. also verify the response actually shows has_more: true before you try paginating. sometimes the database is just smaller than you think.

This happens when your cursor gets corrupted. I dealt with this all the time until I figured out I was accidentally changing the cursor value between requests. Copy the cursor exactly from the next_cursor field - don't trim or modify it at all. Make sure your previous response actually contains a next_cursor field and has_more is true; if has_more is false, you're done - there are no more pages.

What helped me debug this was logging the request body and response so I could see exactly which cursor values were going back and forth. Also, if you change your filter or sort parameters mid-pagination, Notion resets everything and you'll get weird results.
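That logging approach can be sketched like this - a hypothetical wrapper (`paginate_with_checks` is my own name, not a library function) that prints every cursor it sends and receives, and aborts if the API hands back a cursor it already served, which is exactly the "same first page again" symptom:

```python
def paginate_with_checks(fetch_page, max_pages=1000):
    """Paginate while logging cursors and flagging loops.

    fetch_page(cursor) should return a Notion-style response dict:
    {"results": [...], "has_more": bool, "next_cursor": str|None}.
    Raises RuntimeError if a cursor repeats (pagination is looping)
    or if max_pages is exceeded.
    """
    seen = set()
    cursor = None
    results = []
    for _ in range(max_pages):
        print(f"requesting with start_cursor={cursor!r}")
        page = fetch_page(cursor)
        results.extend(page["results"])
        nxt = page.get("next_cursor")
        print(f"got {len(page['results'])} results, next_cursor={nxt!r}")
        if not page["has_more"]:
            return results
        if nxt in seen:
            raise RuntimeError(f"cursor {nxt!r} repeated -- pagination is looping")
        seen.add(nxt)
        cursor = nxt
    raise RuntimeError("hit max_pages without finishing; aborting")
```

If this raises on the second page, the cursor you're sending isn't being honored, and the problem is in the request (headers, filter, or a mangled cursor) rather than in the loop itself.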

Had this exact problem building a content sync tool. The fix was passing the cursor exactly as the string from the next_cursor field - no extra JSON escaping, and not the surrounding response object. Also check that your database ID is right. I wasted hours debugging pagination because I was hitting a test database with barely any records. Another thing - filters can interact with pagination, so try without the filter first. In my experience, Notion's API sometimes responds to a malformed request by returning the first page instead of erroring out.