Large Video File Causes Puppeteer Browser to Crash During Processing

I’m working with Puppeteer version 24.6.0 on macOS and running into a frustrating issue. When I upload a large video file (around 650MB) through my automated script, everything seems to work during the upload phase. However, once the file starts processing on the website, the browser controlled by Puppeteer just crashes completely.

The weird thing is that the same upload works fine when I do it manually in regular Chrome. The crash only happens under Puppeteer automation.

I keep getting this error message: Page crashed!

What I’ve already tried:

  • Set the memory limit higher using --max-old-space-size=4096
  • Switched to headed mode to watch what happens, but no visible errors show up before it crashes
  • Tested with smaller files (they work fine)

My setup:

  • Puppeteer: 24.6.0
  • File size: 650MB video
  • OS: macOS

Has anyone dealt with similar crashes when processing large files through Puppeteer? I’m wondering if there are specific Chrome flags or configurations that might help with memory management during file processing.
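For context, the upload step in my script boils down to something like this (the URL, selector, and file path are placeholders, not the real values):

```javascript
// Simplified repro of the upload flow; crash happens after the upload,
// while the site is processing the file.
async function uploadVideo() {
  // require() deferred so this snippet parses even without puppeteer installed
  const puppeteer = require('puppeteer');
  const browser = await puppeteer.launch({ headless: false });
  const page = await browser.newPage();

  // Puppeteer emits 'error' when the page crashes ("Page crashed!")
  page.on('error', (err) => console.error('page crashed:', err));

  await page.goto('https://example.com/upload'); // placeholder URL
  const input = await page.$('input[type="file"]'); // placeholder selector
  await input.uploadFile('/path/to/video.mp4'); // ~650MB file

  await browser.close();
}
// uploadVideo();
```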

Had this exact crash with 500MB+ files. Try bumping the page timeout with page.setDefaultTimeout(300000) or higher - the browser isn't actually crashing, it's just timing out during processing. Also throw in the --disable-backgrounding-occluded-windows flag, since it stops Chrome from throttling the background tabs that Puppeteer uses. Fixed it for me after weeks of pulling my hair out.
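In code, that suggestion looks roughly like this (the timeout value and flag are taken straight from the comment above; tune the timeout for your file sizes):

```javascript
// Keep Chrome from throttling backgrounded/occluded tabs
const throttleArgs = ['--disable-backgrounding-occluded-windows'];

async function launchWithLongTimeout() {
  // require() deferred so the snippet parses without the package installed
  const puppeteer = require('puppeteer');
  const browser = await puppeteer.launch({ args: throttleArgs });
  const page = await browser.newPage();
  page.setDefaultTimeout(300000); // 5 minutes; go higher for very large files
  return { browser, page };
}
// launchWithLongTimeout();
```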

This is Chrome’s process isolation acting up with large file uploads in Puppeteer. Chrome handles memory differently when you’re automating vs. regular browsing sessions. I hit the same issue and fixed it by adding these flags to my launch arguments: --disable-features=VizDisplayCompositor, --disable-background-media-processing, and --memory-pressure-off. The first two stop Chrome from processing media in separate processes that blow past memory limits. The third one turns off memory pressure notifications that trigger garbage collection too early during big file operations. Basically, Puppeteer’s sandboxed environment has tighter memory constraints than normal Chrome, so it crashes when handling large files.
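For reference, the launch call with those three flags would look roughly like this (flag names are copied verbatim from the answer above; whether each one is still honored depends on your Chrome build):

```javascript
// Flags suggested in this answer; availability varies by Chrome version
const isolationArgs = [
  '--disable-features=VizDisplayCompositor', // per the answer: avoid separate-process media handling
  '--disable-background-media-processing',   // per the answer: same goal as above
  '--memory-pressure-off',                   // per the answer: suppress early memory-pressure GC
];

async function launchWithIsolationArgs() {
  // require() deferred so the snippet parses without puppeteer installed
  const puppeteer = require('puppeteer');
  return puppeteer.launch({ headless: false, args: isolationArgs });
}
// launchWithIsolationArgs().then((browser) => { /* ... */ });
```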

I hit the same issue scraping sites with large media files. Chrome’s headless mode handles memory differently than regular browsing. Add --disable-dev-shm-usage and --disable-web-security to your launch options. Bump up your timeout values too - large files take way longer to process, so Puppeteer thinks the page crashed when it’s just working. I also used --no-sandbox which helped, but don’t use that in production. The main difference is how Puppeteer manages background processes and memory during heavy operations compared to manual Chrome.
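Concretely, that setup looks something like the sketch below. Two caveats worth noting: --disable-dev-shm-usage works around a small /dev/shm, which is mainly a Linux concern and may be a no-op on macOS, and --disable-web-security should only ever be used for local testing:

```javascript
const heavyMediaArgs = [
  '--disable-dev-shm-usage', // avoid small /dev/shm (mainly relevant on Linux)
  '--disable-web-security',  // risky: local testing only
  // '--no-sandbox',         // helped locally, but don't use in production
];

async function launchForHeavyMedia() {
  // require() deferred so the snippet parses without puppeteer installed
  const puppeteer = require('puppeteer');
  const browser = await puppeteer.launch({ args: heavyMediaArgs });
  const page = await browser.newPage();
  page.setDefaultTimeout(600000);           // 10 min: slow processing isn't a crash
  page.setDefaultNavigationTimeout(600000); // same for long page loads
  return { browser, page };
}
// launchForHeavyMedia();
```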