Notion Database Query Fails After 300 Records - Request Body Too Large

I’m having trouble fetching all records from my Notion database through their API. My database has over 300 entries, but I keep hitting a wall when trying to get them all.

I know Notion limits each request to 100 records, so I wrote code to handle pagination using cursors. It works fine for the first 3 pages (300 records total), but then I get this error:

{'errorId': '94d72e18-3501-4976-8c61-1c66177045d3',
 'name': 'PayloadTooLargeError',
 'message': 'Request body too large.'}

Here’s my pagination function:

import json
import requests

def fetch_all_pages(response):
    api_url = f"https://api.notion.com/v1/databases/{db_id}/query"
    cursor = response['next_cursor']
    
    try:
        while response['has_more']:
            response['start_cursor'] = cursor
            payload = json.dumps(response)
            
            # Request next batch of 100 items
            next_response = requests.post(
                api_url, headers=request_headers, data=payload).json()
            
            response["results"] += next_response["results"]
            cursor = next_response['next_cursor']
            if cursor is None:
                break
    except:
        pass
    return response

Is there actually a 300-record limit in Notion’s API? Has anyone found a way around this issue?

yeah, totally get it. notion’s pagination can be a pain. just remember to send only the start_cursor in each request body - that’s what fixes the payload error. been there too. keep it simple and stick to the basics!

It’s not a 300-record limit - you’re building your request payload wrong. You’re taking the original response object and sending it back as the request body, accumulated results and all, so the payload grows with every pagination call until it exceeds Notion’s request-size limit. Build a fresh request body each time containing only the cursor: payload = json.dumps({'start_cursor': cursor}). I hit this exact issue last month and fixing the payload structure solved it right away. The API handles far more than 300 records when you paginate properly.

Found your problem - you’re sending the entire response object back to the API for pagination, which includes all the accumulated results. That’s why your request body keeps growing until it hits Notion’s size limits. For pagination, you only need the cursor. Just do payload = json.dumps({'start_cursor': cursor}) instead of modifying the whole response object. I made this exact mistake when I started with Notion’s API - was way overthinking it. Their docs aren’t super clear on this, but once you strip the request down to just the cursor, you can pull thousands of records no problem.
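To make the fix concrete, here’s a minimal sketch of the corrected loop. It assumes `db_id` and `request_headers` are set up as in your question; the `paginate` helper and `fetch_all_results` names are mine, and splitting the loop from the HTTP call just makes the cursor handling easy to see (and to test without the network):

```python
import json
import requests

def paginate(fetch_page):
    """Collect results from all pages, sending ONLY a cursor in each body."""
    results = []
    body = {}  # first request: empty body, no cursor
    while True:
        page = fetch_page(body)
        results.extend(page["results"])
        if not page.get("has_more"):
            return results
        # Fresh body with just the cursor -- never the previous response
        body = {"start_cursor": page["next_cursor"]}

def fetch_all_results(db_id, request_headers):
    api_url = f"https://api.notion.com/v1/databases/{db_id}/query"
    def fetch_page(body):
        return requests.post(
            api_url, headers=request_headers, data=json.dumps(body)).json()
    return paginate(fetch_page)
```

Because each request body is rebuilt from scratch, it never grows past a single cursor string, no matter how many pages you pull.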