How to capture a full-page screenshot with Puppeteer after all images are loaded?

I’m working on a project where I need to take a full-page screenshot of a website using Puppeteer. The tricky part is making sure all the images are loaded before capturing the screenshot. I’ve tried a few approaches but haven’t had much luck.

Here’s what I’ve attempted so far:

const puppeteer = require('puppeteer');

const browser = await puppeteer.launch();
const page = await browser.newPage();
await page.setViewport({ width: 1920, height: 1080 });
await page.goto('https://example.com');

// Scroll to bottom
await page.evaluate(() => window.scrollTo(0, document.body.scrollHeight));

// Wait a bit
await page.waitForTimeout(2000);

// Wait for every <img> to finish loading (or error out)
await page.evaluate(() => {
  const imgs = document.querySelectorAll('img');
  return Promise.all(Array.from(imgs).map(img => {
    if (img.complete) return Promise.resolve();
    return new Promise(resolve => {
      img.onload = resolve;
      img.onerror = resolve; // count broken images as "done" so we don't hang
    });
  }));
});

await page.screenshot({path: 'fullpage.png', fullPage: true});
await browser.close();

This doesn’t seem to work consistently. Sometimes images are still loading when the screenshot is taken. Any ideas on how to improve this or a better approach to ensure all images are loaded before taking the screenshot?

I’ve faced similar challenges with Puppeteer screenshots. Here’s an approach that’s worked well for me:

Instead of page.waitForTimeout(), try page.waitForNetworkIdle(). It resolves once the network has been quiet for a configurable idle period, which is a much stronger signal that resources have finished loading than a fixed delay.

You can also make the navigation itself wait for network idle with waitUntil: 'networkidle0' (no in-flight requests for at least 500 ms) and a generous timeout:

await page.goto('https://example.com', {
  waitUntil: 'networkidle0',
  timeout: 60000
});

Additionally, consider using page.evaluate() to check if all images are actually loaded:

await page.evaluate(() => {
  return new Promise((resolve) => {
    // Poll every 500 ms until every <img> reports complete.
    // Note: there is no deadline here, so a stuck image will hang this forever.
    const interval = setInterval(() => {
      const images = document.getElementsByTagName('img');
      if (Array.from(images).every((i) => i.complete)) {
        clearInterval(interval);
        resolve();
      }
    }, 500);
  });
});

This should give more reliable results for full-page screenshots with all images loaded.
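One caveat with the every((i) => i.complete) check: complete is also true for images that failed to load. A stricter predicate can additionally require naturalWidth > 0, which is 0 for a broken image (a sketch; the imageIsLoaded/allImagesLoaded names are mine, and note an img with no src also reports naturalWidth 0):

```javascript
// Stricter "loaded" check: `complete` alone is true even for a broken image,
// but a broken image decodes to naturalWidth 0.
function imageIsLoaded(img) {
  return img.complete && img.naturalWidth > 0;
}

// True only when every image in the collection has actually rendered pixels.
function allImagesLoaded(images) {
  return Array.from(images).every(imageIsLoaded);
}
```

Inside page.evaluate you would call allImagesLoaded(document.images) in place of the bare complete check.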

As someone who’s worked extensively with Puppeteer for web scraping and automation, I have found that simply waiting for network idle often isn’t enough to ensure every image has fully loaded. In my experience, combining network idle with a custom check for image completion works best. I first navigate to the page and scroll to trigger any lazy‐loading of images, then use a function that repeatedly checks every image—including those in iframes—until they are all complete or a timeout occurs. This technique has reliably produced full-page screenshots even on complex, image-heavy pages.
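The "repeatedly check until complete or timeout" part of that approach can be sketched as a small polling helper with an explicit deadline (the waitForImages name, the getImages parameter, and the timeout defaults are illustrative, not from the answer; getImages abstracts document.images so the logic can run outside a browser):

```javascript
// Poll until every image reports complete, or reject once the deadline passes.
// Inside page.evaluate you would use () => document.images as getImages.
function waitForImages(getImages, { timeoutMs = 15000, intervalMs = 250 } = {}) {
  return new Promise((resolve, reject) => {
    const deadline = Date.now() + timeoutMs;
    const check = () => {
      if (Array.from(getImages()).every(img => img.complete)) return resolve(true);
      if (Date.now() > deadline) return reject(new Error('images did not finish loading in time'));
      setTimeout(check, intervalMs);
    };
    check();
  });
}
```

In Puppeteer you would run this same logic inside await page.evaluate(...) after the lazy-load scroll, so a permanently stuck image fails the wait with an error instead of hanging the script.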

hey mate, i’ve got a trick that might help. try using page.waitForSelector('img') to wait for the first image to show up, then use page.evaluate to check that all images are complete. somethin like this:

await page.waitForSelector('img');
await page.evaluate(() => {
  return new Promise(r => {
    const check = () => {
      if ([...document.images].every(i => i.complete)) r();
      else setTimeout(check, 1000);
    };
    check();
  });
});

hope this helps!