I have a WordPress site where I need to show PDF documents using Google’s document viewer. The behavior is strange: small files under 100 KB work fine, but bigger files always fail with a timeout message.
When I try to open larger PDFs through the viewer URL like http://docs.google.com/viewer?url=http://mysite.com/documents/largefile.pdf, I get this error:
Sorry, it took too long to find the document at the original source. Please try again later.
You can also try to download the original document by clicking here.
The files are publicly accessible if you go directly to the URL, so it’s not a permission issue. I’m wondering if this is a Google Docs limitation, a server configuration problem, or something else entirely.
Has anyone dealt with this before? What’s the best way to make all PDF files work with Google’s viewer regardless of their file size?
I’ve encountered this specific issue while building document viewers for various clients. Generally it’s the Google viewer timing out while trying to fetch your PDF, not the file size itself, that causes the problem. The likely culprit is your server throttling requests or imposing bandwidth limits, so the file is delivered too slowly to Google’s fetcher. When this happens, your server logs will usually show Google’s bot making the request but getting a delayed response. Many hosting providers cap bandwidth per request or apply strict rate limiting, which can disrupt how external services like Google’s viewer function.

What worked for me was keeping the Google viewer as the default but adding a failure-detection mechanism that automatically falls back to an embedded viewer like PDF.js. That way users still get their previews without relying solely on Google’s service.
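A minimal sketch of that fallback approach, assuming your page embeds the viewer in an iframe. The PDF.js viewer path (`/pdfjs/web/viewer.html`), the element id, and the 10-second threshold are all illustrative assumptions, not fixed values:

```javascript
// Sketch: embed Google's viewer, fall back to PDF.js if it doesn't load in time.
// The PDF.js path and the timeout below are assumptions; adjust for your setup.

function googleViewerUrl(pdfUrl) {
  // Google's viewer takes the target document as a URL-encoded `url` parameter.
  return 'https://docs.google.com/viewer?embedded=true&url=' +
         encodeURIComponent(pdfUrl);
}

function pdfJsViewerUrl(pdfUrl) {
  // Hypothetical self-hosted PDF.js viewer; point this at your own install.
  return '/pdfjs/web/viewer.html?file=' + encodeURIComponent(pdfUrl);
}

function embedWithFallback(iframe, pdfUrl, timeoutMs = 10000) {
  let loaded = false;
  iframe.addEventListener('load', () => { loaded = true; });
  iframe.src = googleViewerUrl(pdfUrl);
  // If the Google viewer hasn't fired `load` within the budget, swap in PDF.js.
  setTimeout(() => {
    if (!loaded) iframe.src = pdfJsViewerUrl(pdfUrl);
  }, timeoutMs);
}

// Browser usage (guarded so the file also parses outside the DOM):
if (typeof document !== 'undefined') {
  embedWithFallback(document.querySelector('#pdf-frame'),
                    'http://mysite.com/documents/largefile.pdf');
}
```

One caveat: the iframe `load` event fires even when Google’s viewer renders its own error page, and the cross-origin frame contents can’t be inspected from your page. So this catches network-level stalls, not the “took too long” message itself, which is another argument for serving PDF.js outright on files you know are slow.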
Yeah, dealt with this exact issue last year. My hosting provider was auto-compressing PDFs, which made Google’s viewer hang forever. Check if your server is doing any PDF compression or processing on the fly - that’ll cause delays even on small files.
Google Docs viewer has an undocumented size limit of around 25 MB for PDFs, but your timeouts on files just over 100 KB point to server response time, not Google’s limits. I’ve hit the same problem hosting PDFs on slow shared servers: Google’s fetcher needs the file delivered quickly, and if your server is too slow it will time out regardless of file size.

Test your PDF URLs with a speed tool to check response times, then either move the PDFs to a CDN or a faster host, or add PDF.js as a backup viewer for files that fail in Google’s viewer - that gives you more control and cuts the dependency on Google’s service.
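If you’d rather script the response-time check than use an online speed tool, here’s a rough sketch using the built-in `fetch` in Node 18+. The 10-second budget is an arbitrary assumption, not a documented Google limit:

```javascript
// Rough response-time check for a PDF URL (Node 18+, built-in fetch).
// The 10 s timeout is an assumed budget, not a documented Google threshold.

async function timeFetch(url, timeoutMs = 10000) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  const start = Date.now();
  try {
    // HEAD avoids downloading the whole file; some servers may reject it,
    // in which case switch to a normal GET.
    const res = await fetch(url, { method: 'HEAD', signal: controller.signal });
    return { ok: res.ok, status: res.status, ms: Date.now() - start };
  } catch (err) {
    // Aborted (too slow) or unreachable.
    return { ok: false, status: 0, ms: Date.now() - start };
  } finally {
    clearTimeout(timer);
  }
}

// Example:
// timeFetch('http://mysite.com/documents/largefile.pdf')
//   .then(r => console.log(r.ok ? `served in ${r.ms} ms` : 'slow or unreachable'));
```

If the measured time is more than a couple of seconds for a sub-megabyte file, the hosting side is the thing to fix before blaming the viewer.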