I’m working with Puppeteer in my Node.js project and running into timeout issues when loading heavy pages. The default 30-second limit isn’t enough for some of the larger files I need to process.
Here’s the timeout error I keep getting:
TimeoutError: Navigation Timeout Exceeded: 30000ms exceeded
    at Promise.then (/app/node_modules/puppeteer/lib/NavigatorWatcher.js:74:21)
    at <anonymous>
  name: 'TimeoutError'
My current navigation code looks like this:
await page.goto(baseUrl + pageId, {waitUntil: 'domcontentloaded'});
What’s the best way to increase this timeout or handle it gracefully when pages take longer to load?
The timeout parameter helps, but don’t just crank up the numbers. I combine a reasonable timeout increase with proper error handling:
try {
  await page.goto(baseUrl + pageId, {
    waitUntil: 'domcontentloaded',
    timeout: 45000
  });
} catch (error) {
  if (error.name === 'TimeoutError') {
    // Log the problematic URL and retry with a longer timeout
    console.log(`Slow page detected: ${baseUrl + pageId}`);
    await page.goto(baseUrl + pageId, {
      waitUntil: 'domcontentloaded',
      timeout: 120000
    });
  } else {
    throw error;
  }
}
You won’t wait 2 minutes on every page by default, but you’ll handle the genuinely slow ones gracefully. This catches about 95% of cases without the performance hit of always using maximum timeouts.
You can bump up the timeout as mentioned above, but manually handling heavy page loads gets old fast.
I’ve hit this same issue - some pages load in 10 seconds, others take 2 minutes. Instead of guessing timeouts, I built automation that monitors page performance and adjusts timeouts dynamically based on page size and content type.
It pulls page metadata first, estimates load time, then sets timeouts automatically. Also handles retries with exponential backoff when pages still fail.
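For reference, here’s a minimal sketch of that pattern without any external service. The estimateTimeout helper, the size-based formula, and the HEAD request are my own illustrative assumptions (it also assumes Node 18+ for global fetch), not exact numbers to copy:

// Sketch: estimate a navigation timeout from the page's reported size,
// then retry with exponential backoff on TimeoutError.
async function estimateTimeout(url) {
  try {
    const head = await fetch(url, { method: 'HEAD' });
    const bytes = Number(head.headers.get('content-length')) || 0;
    // Baseline 30s, plus ~15s per megabyte the server reports (illustrative)
    return 30000 + Math.ceil(bytes / 1_000_000) * 15000;
  } catch {
    return 30000; // HEAD failed; fall back to Puppeteer's default budget
  }
}

async function gotoWithBackoff(page, url, retries = 3) {
  let timeout = await estimateTimeout(url);
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      return await page.goto(url, { waitUntil: 'domcontentloaded', timeout });
    } catch (error) {
      if (error.name !== 'TimeoutError' || attempt === retries) throw error;
      timeout *= 2; // exponential backoff: double the budget each retry
      console.log(`Retry ${attempt} for ${url} with ${timeout}ms timeout`);
    }
  }
}

Then your navigation becomes a one-liner: await gotoWithBackoff(page, baseUrl + pageId);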
For your current code, set timeout to 0 to disable it completely:
await page.goto(baseUrl + pageId, {waitUntil: 'domcontentloaded', timeout: 0});
But the automated approach saves way more headaches long term. Set it up once and forget about timeout issues.
Check out Latenode for building this kind of smart page loading automation: https://latenode.com
Just add {waitUntil: 'domcontentloaded', timeout: 60000} to your goto options. Worked great for me on heavy pages!
I’ve encountered similar timeout challenges when scraping e-commerce sites heavy with images and scripts. While extending the timeout is a quick fix, it means waiting unnecessarily on fast pages too. A better solution is to adjust the wait condition. Instead of domcontentloaded, use networkidle2, which considers navigation finished once there have been no more than two network connections for at least 500ms:
await page.goto(baseUrl + pageId, {
  waitUntil: 'networkidle2',
  timeout: 90000
});
This way, you can handle heavy content efficiently without waiting for every single resource. For persistently problematic pages, wrap your navigation in a try-catch with a fallback that reverts to domcontentloaded or blocks unnecessary requests using page.setRequestInterception(), as sketched below.
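Here’s a rough sketch of that fallback, assuming a Puppeteer page object and the baseUrl/pageId variables from the question; the list of blocked resource types is my own choice and worth tuning for your pages:

// Block heavy resource types so navigation settles faster
await page.setRequestInterception(true);
page.on('request', (request) => {
  if (['image', 'media', 'font', 'stylesheet'].includes(request.resourceType())) {
    request.abort();
  } else {
    request.continue();
  }
});

try {
  await page.goto(baseUrl + pageId, { waitUntil: 'networkidle2', timeout: 90000 });
} catch (error) {
  if (error.name !== 'TimeoutError') throw error;
  // Fall back to the weaker wait condition for stubborn pages
  await page.goto(baseUrl + pageId, { waitUntil: 'domcontentloaded', timeout: 90000 });
}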