Service account upload to shared Google Drive folder capped at 16GB

I’m having trouble with my Rust program that uses google-drive3 to upload large files to my Google Drive. I’ve got a 10TB Google One plan, but when I try to upload 100GB files using a service account, the upload fails after a few hours with a “user storage quota exceeded” error, even though I have plenty of space.

The API reports a maxUploadSize far larger than my files, yet in practice there’s a ~16GB cap when authenticating as the service account. The same uploads succeed with interactive auth, but that’s not practical for my automated use case.

I’ve tried sharing the target folder with the service account’s email address and looked into domain-wide delegation, but neither helped. I also can’t find any relevant quota in the GCP console to raise.

My code uses oauth2::read_service_account_key, builds an authenticator, creates a DriveHub, and then tries to upload with hub.files().create(). I’m using .supports_all_drives(true) and other recommended settings.
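For reference, the relevant part of my code looks roughly like this (key path, folder id, and file names are placeholders; the exact builder calls depend on your google-drive3 / hyper-rustls versions):

```rust
use google_drive3::{api::File, oauth2, hyper, hyper_rustls, DriveHub};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load the service account key and build the authenticator.
    let key = oauth2::read_service_account_key("sa-key.json").await?;
    let auth = oauth2::ServiceAccountAuthenticator::builder(key)
        .build()
        .await?;

    let hub = DriveHub::new(
        hyper::Client::builder().build(
            hyper_rustls::HttpsConnectorBuilder::new()
                .with_native_roots()
                .https_only()
                .enable_http1()
                .build(),
        ),
        auth,
    );

    // File metadata: name plus the shared folder as parent.
    let meta = File {
        name: Some("backup-100gb.bin".into()),
        parents: Some(vec!["FOLDER_ID".into()]),
        ..Default::default()
    };

    let src = std::fs::File::open("backup-100gb.bin")?;
    let (_resp, created) = hub
        .files()
        .create(meta)
        .supports_all_drives(true)
        .upload_resumable(src, "application/octet-stream".parse()?)
        .await?;

    println!("uploaded file id: {:?}", created.id);
    Ok(())
}
```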

How can I get around this 16GB limit with a service account and use all my Google One storage? Any ideas?

I’ve faced a similar issue when working on a large-scale backup solution for a client. The 16GB limit with service accounts can be frustrating, especially when you have ample storage available.

One workaround I found effective was to split large files into chunks under 16GB, upload each chunk as a separate Drive file, and reassemble them on download. Combined with resumable uploads, this keeps every individual upload under the cap and also makes long-running transfers far more robust.
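The splitting itself needs nothing beyond the standard library; here is a minimal sketch (the `.partNNN` naming scheme and 1 MiB copy buffer are arbitrary choices):

```rust
use std::fs::File;
use std::io::{self, Read, Write};

/// Split `path` into numbered parts of at most `chunk_size` bytes
/// ("data.bin.part000", "data.bin.part001", ...) and return the part names.
fn split_file(path: &str, chunk_size: u64) -> io::Result<Vec<String>> {
    let mut src = File::open(path)?;
    let mut parts = Vec::new();
    let mut buf = vec![0u8; 1 << 20]; // copy in 1 MiB blocks
    for idx in 0.. {
        let part_name = format!("{path}.part{idx:03}");
        let mut out: Option<File> = None; // created lazily: no empty trailing part
        let mut written = 0u64;
        while written < chunk_size {
            let want = buf.len().min((chunk_size - written) as usize);
            let n = src.read(&mut buf[..want])?;
            if n == 0 {
                break;
            }
            if out.is_none() {
                out = Some(File::create(&part_name)?);
            }
            out.as_mut().unwrap().write_all(&buf[..n])?;
            written += n as u64;
        }
        if out.is_none() {
            break; // source exhausted exactly at a part boundary
        }
        parts.push(part_name);
        if written < chunk_size {
            break; // short final part
        }
    }
    Ok(parts)
}

fn main() -> io::Result<()> {
    // Demo: split a generated 2.5 MB file into 1 MB parts.
    std::fs::write("demo.bin", vec![1u8; 2_500_000])?;
    let parts = split_file("demo.bin", 1_000_000)?;
    println!("created {} parts: {:?}", parts.len(), parts);
    Ok(())
}
```

For real 100GB files you’d bump `chunk_size` to something just under 16GB; the logic is the same.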

For implementation, look at the resumable upload protocol in the Drive API (uploadType=resumable). It sends a file in discrete chunks and lets you resume an interrupted session. You’ll need to extend your Rust code to split the files and drive a resumable session for each part.
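In google-drive3 this maps to the `upload_resumable` variant of the create call. A hedged sketch of the per-part loop, assuming a `DriveHub` built the same way as in the question (folder id and MIME type are placeholders, and each part becomes its own Drive file):

```rust
use google_drive3::api::File;
use google_drive3::hyper::client::HttpConnector;
use google_drive3::hyper_rustls::HttpsConnector;
use google_drive3::DriveHub;

// Upload each pre-split part as a separate Drive file via the resumable
// protocol; the crate manages the resumable session for each call.
async fn upload_parts(
    hub: &DriveHub<HttpsConnector<HttpConnector>>,
    folder_id: &str,
    parts: &[String],
) -> Result<(), Box<dyn std::error::Error>> {
    for part in parts {
        let meta = File {
            name: Some(part.clone()),
            parents: Some(vec![folder_id.to_string()]),
            ..Default::default()
        };
        let src = std::fs::File::open(part)?;
        let (_resp, created) = hub
            .files()
            .create(meta)
            .supports_all_drives(true)
            .upload_resumable(src, "application/octet-stream".parse()?)
            .await?;
        println!("{part} -> {:?}", created.id);
    }
    Ok(())
}
```

If a part fails, you can retry just that part rather than the whole 100GB transfer.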

Another option, though more complex, is to stage the files in a Google Cloud Storage bucket first. Be aware there is no server-side GCS-to-Drive transfer, so your application would still stream each object out of GCS and into Drive, but GCS has its own, far more generous upload limits and gives you a durable staging area if a Drive upload fails partway through.

These solutions require more code changes, but they’ve worked well in my experience for handling large file uploads with service accounts.

Having worked extensively with Google Drive APIs, I can suggest an alternative approach. The likely root cause here is that files a service account creates in your My Drive folder are owned by the service account, which has its own small storage quota (around 15GB), so your Google One storage never comes into play. Instead of a service account, consider a hybrid authentication method: authenticate interactively once to obtain initial tokens, then use the refresh token for subsequent automated uploads. This keeps the storage and upload limits of your user account while still allowing fully automated runs.

To implement this, modify your Rust program to run the OAuth 2.0 installed-app flow once and store the resulting refresh token securely. On subsequent runs, the refresh token is exchanged for new access tokens; yup-oauth2’s persisted token cache handles that exchange for you.
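A sketch of that flow with the same crates the question uses (file names are placeholders; the first run opens an interactive consent step, later runs reuse the cached refresh token silently):

```rust
use google_drive3::{oauth2, hyper, hyper_rustls, DriveHub};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // OAuth client credentials for an "installed app", downloaded from the
    // GCP console (file name is a placeholder).
    let secret = oauth2::read_application_secret("client_secret.json").await?;

    // First run: interactive consent; the refresh token is cached in
    // tokens.json. Later runs load it from disk with no user interaction.
    let auth = oauth2::InstalledFlowAuthenticator::builder(
        secret,
        oauth2::InstalledFlowReturnMethod::HTTPRedirect,
    )
    .persist_tokens_to_disk("tokens.json")
    .build()
    .await?;

    let hub = DriveHub::new(
        hyper::Client::builder().build(
            hyper_rustls::HttpsConnectorBuilder::new()
                .with_native_roots()
                .https_only()
                .enable_http1()
                .build(),
        ),
        auth,
    );

    // Requests now run as *your* account, so uploads draw on your Google One
    // storage instead of the service account's own quota.
    let about = hub.about().get().param("fields", "storageQuota").doit().await?;
    println!("{:?}", about.1.storage_quota);
    Ok(())
}
```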

This approach has worked well for me in similar high-volume upload scenarios. It balances the need for automation with the higher quotas of user accounts. Remember to implement proper error handling and token refresh logic to ensure smooth operation over time.

Hey, I’ve run into this too. What worked for me was resumable uploads: split your files into smaller chunks (say 10GB each) and upload them one by one. It’s a bit more work in your code, but it gets around that annoying 16GB limit. Just make sure you handle connection drops during the upload. Good luck!