Automating PDF file uploads to Google Drive

Hey everyone, I’m trying to figure out how to upload a bunch of PDF files to Google Drive automatically. I’ve got about 50GB worth of ebooks spread across folders and subfolders.

I’m wondering if there’s a way to do this with Python or PHP. The tricky part is that I need to be able to stop and start the process multiple times. My computer isn’t always on, so I’d like the script to remember where it left off and pick up from there when I restart it.

Has anyone done something like this before? Any tips on how to make it work smoothly? I’m not super tech-savvy, so a simple solution would be awesome. Thanks in advance for any help!

I’ve actually tackled a similar project recently, and I found that using Python with the Google Drive API was the way to go. Here’s what worked for me:

First, enable the Drive API in a Google Cloud project and download your OAuth client credentials. Then use the google-auth and google-auth-oauthlib libraries to handle authentication, with google-api-python-client for the actual Drive calls.
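Here’s a minimal sketch of the auth setup, assuming you’ve already created the Cloud project, enabled the Drive API, and saved the OAuth client file as credentials.json (both file names here are just examples):

```python
import os.path

from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/drive.file"]

def get_drive_service():
    creds = None
    # token.json caches your tokens after the first run, so you
    # don't have to re-authorize every time the script restarts.
    if os.path.exists("token.json"):
        creds = Credentials.from_authorized_user_file("token.json", SCOPES)
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                "credentials.json", SCOPES)
            creds = flow.run_local_server(port=0)
        with open("token.json", "w") as token:
            token.write(creds.to_json())
    return build("drive", "v3", credentials=creds)
```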

For the upload process, I created a script that walks through the directory structure, keeping track of what’s been uploaded in a separate file. This way, when you restart, it can check that file and skip already processed items.
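The walk-and-skip logic looked roughly like this. The log file name is arbitrary, and upload_file() is a placeholder for the actual upload routine (a resumable version is sketched below):

```python
import os

LOG_PATH = "uploaded.log"

def load_uploaded():
    """Read the set of paths that earlier runs already finished."""
    if not os.path.exists(LOG_PATH):
        return set()
    with open(LOG_PATH, encoding="utf-8") as f:
        return {line.strip() for line in f}

def upload_all(root_dir, service):
    uploaded = load_uploaded()
    for dirpath, _dirnames, filenames in os.walk(root_dir):
        for name in filenames:
            if not name.lower().endswith(".pdf"):
                continue
            path = os.path.join(dirpath, name)
            if path in uploaded:
                continue  # finished in a previous run, skip it
            upload_file(service, path)
            # Record success immediately, so a crash loses at most one file.
            with open(LOG_PATH, "a", encoding="utf-8") as f:
                f.write(path + "\n")
```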

To handle large files, I used the Drive API’s resumable uploads, sending each file in chunks. This not only helps with big PDFs but also makes it easier to recover if the connection drops.
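Here’s roughly how that part works with MediaFileUpload from google-api-python-client; the 5 MB chunk size and the optional folder_id parent are just the choices I made:

```python
import os

from googleapiclient.http import MediaFileUpload

def upload_file(service, path, folder_id=None):
    metadata = {"name": os.path.basename(path)}
    if folder_id:
        metadata["parents"] = [folder_id]
    media = MediaFileUpload(
        path,
        mimetype="application/pdf",
        resumable=True,
        chunksize=5 * 1024 * 1024,  # 5 MB; must be a multiple of 256 KB
    )
    request = service.files().create(
        body=metadata, media_body=media, fields="id")
    response = None
    while response is None:
        # next_chunk() sends one chunk and returns (status, response);
        # response stays None until the final chunk is accepted.
        status, response = request.next_chunk()
        if status:
            print(f"{path}: {int(status.progress() * 100)}%")
    return response["id"]
```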

One tip: use a try-except block around your upload code to catch and log any errors without crashing the entire process.
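Something along these lines, inside the loop from before, with upload_file() being the sketch above:

```python
import logging

logging.basicConfig(filename="errors.log", level=logging.ERROR)

try:
    upload_file(service, path)
except Exception:
    # Log the full traceback and move on to the next file.
    logging.exception("Failed to upload %s", path)
```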

It took some trial and error, but once set up, it ran smoothly. Hope this helps point you in the right direction!

For automating PDF uploads to Google Drive, I’d recommend using Python with the google-api-python-client library. It’s straightforward to set up and, together with google-auth-oauthlib, handles authentication cleanly.

To manage the stop-start nature of your uploads, implement a simple checkpoint system. Create a text file that logs the paths of successfully uploaded files. When restarting, read this file to skip already processed items.
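A couple of helper functions are all the checkpoint really needs; uploaded.txt is just an example name:

```python
import os

CHECKPOINT = "uploaded.txt"

def already_uploaded():
    """Return the set of paths recorded by previous runs."""
    if not os.path.exists(CHECKPOINT):
        return set()
    with open(CHECKPOINT, encoding="utf-8") as f:
        return {line.strip() for line in f}

def mark_uploaded(path):
    """Append a path the moment its upload succeeds."""
    with open(CHECKPOINT, "a", encoding="utf-8") as f:
        f.write(path + "\n")
```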

Consider using multiprocessing to speed up the upload process, especially given your large dataset. Uploads are network-bound, so running a few in parallel can cut the total time considerably.
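A rough sketch with a process pool, assuming you have get_drive_service() and upload_file() helpers like the ones in the earlier reply. Each worker builds its own Drive client, since the client objects shouldn’t be shared across processes:

```python
import os
from multiprocessing import Pool

_service = None

def init_worker():
    # Build one Drive client per worker process.
    global _service
    _service = get_drive_service()

def worker(path):
    try:
        upload_file(_service, path)
        return path, True
    except Exception:
        return path, False

if __name__ == "__main__":
    done = already_uploaded()
    pending = [
        os.path.join(d, name)
        for d, _, files in os.walk("ebooks")
        for name in files
        if name.lower().endswith(".pdf")
        and os.path.join(d, name) not in done
    ]
    with Pool(processes=4, initializer=init_worker) as pool:
        for path, ok in pool.imap_unordered(worker, pending):
            if ok:
                mark_uploaded(path)
```

One caveat: run the script once single-threaded first so token.json exists; otherwise every worker will try to open a browser for authorization at the same time.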

One crucial tip: implement rate limiting so you don’t hit the Drive API’s per-user quotas. Google’s recommended pattern is exponential backoff: retry after a growing delay whenever the API returns a rate-limit error, so your script slows down instead of getting blocked mid-upload.
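Something like this wrapper around whatever upload routine you already have:

```python
import random
import time

from googleapiclient.errors import HttpError

def upload_with_backoff(service, path, max_retries=5):
    for attempt in range(max_retries):
        try:
            return upload_file(service, path)
        except HttpError as err:
            # 403 ("rate limit exceeded") and 429 are the quota errors;
            # wait 1s, 2s, 4s, ... plus jitter before retrying.
            if err.resp.status in (403, 429):
                time.sleep(2 ** attempt + random.random())
            else:
                raise
    raise RuntimeError(f"Gave up on {path} after {max_retries} retries")
```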

Lastly, for ease of use, create a simple command-line interface. This allows you to start, pause, and resume uploads effortlessly, making the whole process more user-friendly.
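A bare-bones argparse front end might look like this; find_pending() and run_uploads() are hypothetical stand-ins for the checkpoint and upload logic described above:

```python
import argparse

def main():
    parser = argparse.ArgumentParser(
        description="Upload PDFs to Google Drive, resumably.")
    parser.add_argument("root", help="folder to scan for PDFs")
    parser.add_argument("--workers", type=int, default=1,
                        help="number of parallel upload processes")
    parser.add_argument("--dry-run", action="store_true",
                        help="list pending files without uploading")
    args = parser.parse_args()

    pending = find_pending(args.root)  # hypothetical: paths not yet checkpointed
    if args.dry_run:
        for path in pending:
            print(path)
        return
    run_uploads(pending, workers=args.workers)  # hypothetical upload driver

if __name__ == "__main__":
    main()
```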

hey, i’ve done that before. you can use python and the google drive api to scan folders and log uploads. upon restart, check the log to skip done files. try chunked uploads for big pdfs so interruptions are less problematic. good luck!