I’m working on a project where I need to take a full-page screenshot of a website using Puppeteer. The tricky part is making sure all the images are loaded before capturing the screenshot. I’ve tried a few approaches but haven’t had much luck.
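Roughly, what I've been doing looks like this (simplified; the fixed 3-second pause is arbitrary):

```javascript
// Load the page, pause for a fixed time, then take a full-page screenshot.
async function capture(url, outPath) {
  const puppeteer = require('puppeteer'); // assumes puppeteer is installed
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'load' });
  await page.waitForTimeout(3000); // fixed delay -- images may still be loading
  await page.screenshot({ path: outPath, fullPage: true });
  await browser.close();
}
```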
This doesn’t seem to work consistently. Sometimes images are still loading when the screenshot is taken. Any ideas on how to improve this or a better approach to ensure all images are loaded before taking the screenshot?
I’ve faced similar challenges with Puppeteer screenshots. Here’s an approach that’s worked well for me:
Instead of a fixed delay with page.waitForTimeout(), try page.waitForNetworkIdle(). It resolves once the network has gone quiet, which usually means all resources have finished loading.
You can also increase the overall timeout and tune the idleTime window (how long the network must stay quiet before it counts as idle):
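For example (option names per Puppeteer's waitForNetworkIdle API; the values are just a starting point):

```javascript
// 500 ms of network silence counts as idle; allow up to 60 s overall
// instead of the 30 s default before giving up.
const idleOptions = { idleTime: 500, timeout: 60000 };

async function captureAfterIdle(page, outPath) {
  await page.waitForNetworkIdle(idleOptions);
  await page.screenshot({ path: outPath, fullPage: true });
}
```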
As someone who’s worked extensively with Puppeteer for web scraping and automation, I’ve found that waiting for network idle alone often isn’t enough to guarantee every image has fully loaded: lazy-loaded images may not even start fetching until they scroll into view. In my experience, combining network idle with a custom completeness check works best. I first navigate to the page and scroll to the bottom to trigger any lazy-loading, then run a function that repeatedly checks every image (including those in same-origin iframes) until they are all complete or a timeout occurs. This technique has reliably produced full-page screenshots even on complex, image-heavy pages.
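A sketch of that combination (the helper names are mine; the frame loop treats cross-origin iframes it can't inspect as done, since their images aren't reachable anyway):

```javascript
// Scroll to the bottom in steps so lazy-loaded images start fetching.
async function autoScroll(page) {
  await page.evaluate(async () => {
    await new Promise((resolve) => {
      let scrolled = 0;
      const step = 400;
      const timer = setInterval(() => {
        window.scrollBy(0, step);
        scrolled += step;
        if (scrolled >= document.body.scrollHeight) {
          clearInterval(timer);
          resolve();
        }
      }, 100);
    });
  });
}

// Poll every frame (main page plus same-origin iframes) until all
// images report complete, or the deadline passes.
async function waitForAllImages(page, timeoutMs = 30000) {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const results = await Promise.all(
      page.frames().map((frame) =>
        frame
          .evaluate(() =>
            Array.from(document.images).every(
              (img) => img.complete && img.naturalHeight > 0
            )
          )
          .catch(() => true) // cross-origin frame: nothing we can check
      )
    );
    if (results.every(Boolean)) return;
    await new Promise((r) => setTimeout(r, 250));
  }
  throw new Error('Timed out waiting for images to load');
}
```

After page.waitForNetworkIdle(), call autoScroll(page), then waitForAllImages(page), and only then take the screenshot.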
hey mate, i’ve got a trick that might help. try using page.waitForSelector('img') to wait for the first image to show up, then check that every image on the page is complete. somethin like this:
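rough sketch (i'm using page.waitForFunction here, which is the same idea as polling with page.evaluate but built in; tweak the timeout to taste):

```javascript
async function waitForImages(page) {
  // wait for at least one <img> to appear in the DOM
  await page.waitForSelector('img');
  // then poll inside the page until every image reports itself loaded
  await page.waitForFunction(
    () =>
      Array.from(document.images).every(
        (img) => img.complete && img.naturalHeight > 0
      ),
    { timeout: 30000 }
  );
}
```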