Storing files in Google Drive from a Node.js backend service

I built a backend service and deployed it on Heroku. The issue is that every time I deploy new code changes, all uploaded files disappear from the server. I think this happens because Heroku's filesystem is ephemeral, so nothing written to disk survives a deploy.

I want to save images directly to Google Drive instead of storing them locally on the server. I looked at the Google Drive documentation but I’m having trouble finding clear examples for Node.js integration.

Does anyone have experience with Google Drive file uploads from a Node.js application? I would appreciate any code examples or tutorial links that show how to implement this properly.

Try using Google Drive as a backup rather than your primary storage. I hit the same deployment problems with Heroku and found that mixing cloud services works way better for reliability: set your Node.js app to upload files to both Google Drive and something like AWS S3 or Cloudinary at the same time. You'll get redundancy plus faster access, since Drive's API can slow to a crawl during peak hours.

For the Google Drive integration, don't forget to handle token refresh properly - access tokens expire, and I've watched apps crash in production because devs skipped the refresh logic. Also throw in an upload queue like Bull or Agenda if you're expecting heavy traffic; Google Drive's quotas are tight and will throttle your app during spikes.
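
A rough sketch of the token-refresh and queueing side with googleapis and Bull - the env var names, queue name, and base64 job payload are just placeholders for the example, not anything official:

```js
// Sketch: OAuth2 client that refreshes access tokens automatically, with
// uploads pushed through a Bull queue so quota errors retry instead of
// crashing requests. All env var names here are placeholders.
const { google } = require('googleapis');
const { Readable } = require('stream');
const Queue = require('bull');

const oauth2Client = new google.auth.OAuth2(
  process.env.GOOGLE_CLIENT_ID,
  process.env.GOOGLE_CLIENT_SECRET,
  process.env.GOOGLE_REDIRECT_URI
);
// With a refresh token set, googleapis fetches new access tokens as they expire.
oauth2Client.setCredentials({ refresh_token: process.env.GOOGLE_REFRESH_TOKEN });

const drive = google.drive({ version: 'v3', auth: oauth2Client });

// Queue uploads so rate-limit errors back off and retry.
const uploadQueue = new Queue('drive-uploads', process.env.REDIS_URL);

uploadQueue.process(async (job) => {
  const { name, mimeType, base64 } = job.data;
  const res = await drive.files.create({
    requestBody: { name },
    media: { mimeType, body: Readable.from(Buffer.from(base64, 'base64')) },
    fields: 'id',
  });
  return res.data.id;
});

// From your upload route: retry up to 5 times with exponential backoff.
// uploadQueue.add({ name, mimeType, base64 }, { attempts: 5, backoff: { type: 'exponential', delay: 2000 } });
```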

Google Drive API setup is a nightmare. Service accounts, JSON files scattered everywhere - been there.

I switched to Latenode for this. Build a workflow that takes file uploads and dumps them straight into Drive. No authentication hassles.

Best part? Set up auto folder sorting, file naming rules, and upload notifications. Your Heroku app just hits the Latenode webhook and you’re done.

Did this for our image pipeline. Instead of wrestling with Google API credentials and rate limits in Node.js, Latenode handles the Drive mess. My backend just does what it’s supposed to do.

Want file validation? Resizing? Multi-cloud backup? Add it without touching your main code.

You're correct about the limitations of Heroku's filesystem; I've encountered similar issues after my deployments. To upload images to Google Drive, I used the googleapis npm package, which worked seamlessly for my needs.

First, create a service account in the Google Cloud Console and download its credentials as a JSON file. Then you can use the drive.files.create method for multipart uploads. Instead of keeping the JSON file in your project, store the service account credentials as environment variables on Heroku.

Be mindful of Google Drive's rate limits; some retry logic will help if your application sees high traffic. Lastly, remember that every uploaded file is owned by the service account, so adjust sharing permissions accordingly to allow user access.
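
A minimal sketch of that flow, assuming a GOOGLE_SERVICE_ACCOUNT_JSON env var holding the credentials and a DRIVE_FOLDER_ID for the target folder (both names are placeholders I made up):

```js
// Sketch: service-account auth pulled from a Heroku env var, a multipart
// upload via drive.files.create, and a permission so the file isn't only
// readable by the service account. Env var names are placeholders.
const { google } = require('googleapis');
const { Readable } = require('stream');

const auth = new google.auth.GoogleAuth({
  credentials: JSON.parse(process.env.GOOGLE_SERVICE_ACCOUNT_JSON),
  scopes: ['https://www.googleapis.com/auth/drive.file'],
});
const drive = google.drive({ version: 'v3', auth });

async function uploadImage(buffer, name, mimeType) {
  const { data } = await drive.files.create({
    requestBody: { name, parents: [process.env.DRIVE_FOLDER_ID] },
    media: { mimeType, body: Readable.from(buffer) },
    fields: 'id, webViewLink',
  });

  // Uploaded files belong to the service account, so grant read access
  // (here: anyone with the link) before handing the URL to users.
  await drive.permissions.create({
    fileId: data.id,
    requestBody: { role: 'reader', type: 'anyone' },
  });

  return data.webViewLink;
}
```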

You’re overcomplicating this. All those googleapis packages and service account setups just create technical debt.

I hit the same Heroku file persistence issue a few months back. Instead of cramming Google Drive logic into my Node.js app, I used a simple Latenode automation.

My backend sends a POST request to Latenode with the file data. The workflow grabs it, uploads to Drive, organizes folders by date, and returns the public link. No Google API mess in my main app.
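
On the backend that's just a multipart POST to the workflow's webhook. The webhook URL env var and the response field below are made up for the example; Latenode gives you the real URL, and whatever the workflow returns is up to you:

```js
// Sketch: hand the file to the workflow instead of calling the Drive API
// directly. The webhook URL and the `link` response field are hypothetical.
const axios = require('axios');
const FormData = require('form-data');

async function sendToWorkflow(buffer, filename, mimeType) {
  const form = new FormData();
  form.append('file', buffer, { filename, contentType: mimeType });

  const { data } = await axios.post(
    process.env.LATENODE_WEBHOOK_URL, // placeholder env var
    form,
    { headers: form.getHeaders() }
  );
  return data.link; // whatever field your workflow replies with
}
```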

Best part? When I need to change upload logic or add new cloud storage, I update the Latenode workflow without touching deployed code. No more Heroku redeploys just to fix file handling.

Latenode also handles all the retry logic and error cases Google Drive throws at you. Keeps my app clean and focused on business logic.

Use multer-google-storage instead of the raw googleapis package - much cleaner, and it handles multipart uploads automatically. One caveat: it stores files in a Google Cloud Storage bucket rather than in Drive itself. You'll still need a service account, but setup is way simpler - just pass your credentials JSON as an env variable and configure the bucket. Saved me hours vs fighting with the Drive API directly.

Had this exact issue last year when I moved off local storage. Here's what saved me: stream your files directly to Google Drive instead of loading them into memory first - Heroku's RAM limits will kill you otherwise. With the googleapis package you can pass a readable stream as the media body of drive.files.create, and it gets piped straight to Drive. Works great for large images without crashing.

Two things that'll bite you: Drive uploads can fail halfway through and leave broken files, so add solid error handling. And set up a cleanup job to delete orphaned Drive files when you remove database records - learned that one the hard way after hitting storage limits.
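
Rough sketch of both points, assuming busboy 1.x for parsing the incoming request and an already-authenticated googleapis drive client (handleUpload and deleteDriveFile are names I made up, not library functions):

```js
// Sketch: pipe an incoming upload straight to Drive without buffering it in
// dyno RAM, plus the delete call to run when the matching DB record goes away.
const busboy = require('busboy');

function handleUpload(req, res, drive) {
  const bb = busboy({ headers: req.headers });

  bb.on('file', (field, fileStream, info) => {
    drive.files
      .create({
        requestBody: { name: info.filename },
        // The readable stream as media.body is piped to Drive, so the
        // full file never sits in memory.
        media: { mimeType: info.mimeType, body: fileStream },
        fields: 'id',
      })
      .then(({ data }) => res.json({ fileId: data.id }))
      .catch((err) => res.status(500).json({ error: err.message }));
  });

  req.pipe(bb);
}

// Run this from whatever job deletes database records, so Drive doesn't
// fill up with orphaned files.
async function deleteDriveFile(drive, fileId) {
  await drive.files.delete({ fileId });
}
```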