I’m working with a web form that lets users upload files to a shared Google Drive folder. The setup uses a Java service account with editor rights to handle the uploads through the Google Drive API.
Here’s my problem: when I delete files or folders directly in the Google Drive web interface as the folder owner, they disappear from my view completely (not even in trash). But when my service account queries the API for file listings, those deleted items still show up in the results.
I’m wondering if this is related to permissions between the owner account and service account? Maybe the service account has its own view of what exists? I’ve looked through the API documentation but can’t find any method to force a refresh of the service account’s data cache. Do I need to programmatically delete these files through the API instead of using the web interface?
This happens because a service account is a full Drive identity in its own right, with its own storage and its own view of permissions - it isn’t a lens onto the owner’s account. When you delete files through the web interface as the owner, that deletion affects your view, but the service account can still see (and, if it did the uploading, may still own) those files, so they keep appearing in its API listings. There’s no data cache to refresh; the two accounts genuinely see different things.

I ran into this exact problem building a document management system. Users would delete files manually, but our backend service account kept trying to process them anyway.

Here’s what worked for me: make your files.list() queries filter out trashed items rather than relying on bare metadata, and build in error handling for files that show up in a listing but return 404 when you actually fetch them. You could also set up a webhook or a periodic sync to validate file accessibility before processing. That stops your app from trying to work with files that users already removed through the web interface.
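A minimal sketch of that listing-plus-error-handling approach, assuming the Drive v3 Java client (google-api-services-drive); `FOLDER_ID` and `processFile` are placeholders for your own values and logic:

```java
import com.google.api.client.googleapis.json.GoogleJsonResponseException;
import com.google.api.services.drive.Drive;
import com.google.api.services.drive.model.File;
import com.google.api.services.drive.model.FileList;
import java.io.IOException;

public class ActiveFileLister {
    // Placeholder for your shared folder's ID.
    private static final String FOLDER_ID = "FOLDER_ID";

    static void processActiveFiles(Drive drive) throws IOException {
        FileList listing = drive.files().list()
                // Exclude anything that has been trashed.
                .setQ("'" + FOLDER_ID + "' in parents and trashed = false")
                .setFields("files(id, name, trashed)")
                .execute();

        for (File file : listing.getFiles()) {
            try {
                processFile(drive, file); // your own processing logic
            } catch (GoogleJsonResponseException e) {
                if (e.getStatusCode() == 404) {
                    continue; // listed a moment ago, gone now: skip it
                }
                throw e;
            }
        }
    }

    static void processFile(Drive drive, File file) throws IOException {
        // download, parse, etc.
    }
}
```

The try/catch matters because a listing is only a snapshot - a file can vanish between the list call and your first operation on it.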
This happens because Google Drive treats service account permissions differently than regular user permissions. When you delete files through the web interface, the service account might still see metadata or cached references.
I hit this exact problem building an automated document system. The service account kept seeing files users had deleted, which caused major confusion.
What fixed it for me was automating the entire file lifecycle. Don’t mix manual deletions with API operations - handle everything through the API consistently.
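If you do route deletions through the API as well, the Drive v3 Java client keeps it short (a sketch, assuming an authorized `Drive` client and a `fileId` you already hold; note that files.delete permanently removes a file the service account owns, skipping the trash):

```java
// Permanently delete (does not go to the trash):
drive.files().delete(fileId).execute();

// Or trash instead, which is recoverable:
com.google.api.services.drive.model.File patch =
        new com.google.api.services.drive.model.File().setTrashed(true);
drive.files().update(fileId, patch).execute();
```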
With Latenode, you can build workflows that monitor your Drive folder and sync deletions properly. Set up triggers that watch for file changes and keep both views consistent. The platform handles all the API calls and permission headaches automatically.
I built a workflow that runs every few hours to sync what users see with what the service account sees. No more phantom files.
Latenode makes Google Drive automation straightforward without writing custom Java code for all the edge cases.
Yeah, Google’s API can look like it has a stale-cache bug here, but it usually isn’t caching - the service account genuinely still has access to those files after you delete them manually. Try adding supportsAllDrives=true and includeItemsFromAllDrives=true to your API calls if the folder lives on a shared drive. Also double-check whether your files are trashed or permanently deleted: a trashed file still shows up in listings unless you filter it out with trashed = false.
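In the Java client both flags are setters on the list request (a sketch; `folderId` is a placeholder, and the two shared-drive flags are harmless no-ops for My Drive folders):

```java
FileList result = drive.files().list()
        .setQ("'" + folderId + "' in parents and trashed = false")
        // Needed to see content that lives on shared drives.
        .setSupportsAllDrives(true)
        .setIncludeItemsFromAllDrives(true)
        .setFields("files(id, name, trashed)")
        .execute();
```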
I experienced a similar issue recently. The discrepancy arises because deletions made through the web interface do not propagate to the service account’s view - when a file is removed from the owner’s side, the service account may still hold a valid reference to it. To resolve this, I added a check in my Java code that retrieves file info with files.get() before performing any operations; a 404 error confirms the file is really gone. Additionally, including q="trashed = false" in your files.list() requests effectively filters out trashed files without modifying the existing architecture. This approach streamlined my workflow significantly.
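That existence check wraps neatly into a small helper (a sketch, assuming the Drive v3 Java client; a 404 from files.get() means the service account can no longer see the file at all, while the trashed flag catches soft deletes):

```java
import com.google.api.client.googleapis.json.GoogleJsonResponseException;
import com.google.api.services.drive.Drive;
import java.io.IOException;

public final class FileLiveness {
    /** Returns true only if the file still exists and is not in the trash. */
    static boolean isLive(Drive drive, String fileId) throws IOException {
        try {
            return !Boolean.TRUE.equals(
                    drive.files().get(fileId)
                            .setFields("id, trashed")
                            .execute()
                            .getTrashed());
        } catch (GoogleJsonResponseException e) {
            if (e.getStatusCode() == 404) {
                return false; // fully deleted, or no longer shared with us
            }
            throw e;
        }
    }
}
```

Calling this before each operation costs one extra metadata request per file but eliminates the phantom-file errors entirely.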
You’re hitting Google’s permission isolation between service accounts and user accounts. Service accounts live in their own bubble and don’t sync with web interface changes.
I hit this same nightmare building a file processing pipeline. Users deleted files thinking they were cleaning up, but our service kept trying to process phantom files for days.
Don’t fight Google’s API quirks - automate the sync instead. Manual deletions and API operations don’t play nice together.
I fixed this with automated workflows that handle the entire file lifecycle. When files get deleted through the web interface, automation detects the change and updates the service account’s view.
With Latenode, you can build a workflow that regularly checks file accessibility and removes stale references from your service account’s perspective. It handles API polling and cleanup without writing complex Java code for these edge cases.
Run the workflow on a schedule to verify file existence and clean up phantom references. Way cleaner than patching this in your application code.