Locally syncing a public Google Drive folder for offline access

I need to sync a public, dynamically updated Google Drive folder for offline access. How can I mirror the main folder and automatically include any new daily subfolders?

I have successfully used a combination of the Google Drive API and a custom Python script to achieve a similar outcome. The script runs as a scheduled task every few hours: it queries the folder's contents and downloads any subfolders it hasn't seen before. This approach allows a high degree of customization and can adapt to changes in the folder structure. It requires some initial setup and familiarity with Python, but it provides a robust solution for keeping a local copy of a dynamically updated public folder.
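A minimal sketch of that approach, using only the standard library against the Drive v3 REST API (an API key is enough for "anyone with the link" content). `PUBLIC_FOLDER_ID` and `YOUR_API_KEY` are placeholders you'd substitute with your own values:

```python
import json
import os
import urllib.parse
import urllib.request

API = "https://www.googleapis.com/drive/v3/files"
FOLDER_MIME = "application/vnd.google-apps.folder"

def list_children(folder_id, api_key):
    """Return all files and subfolders directly inside a public folder."""
    files, page_token = [], None
    while True:
        params = {
            "q": f"'{folder_id}' in parents and trashed = false",
            "fields": "nextPageToken,files(id,name,mimeType)",
            "key": api_key,
        }
        if page_token:
            params["pageToken"] = page_token
        with urllib.request.urlopen(f"{API}?{urllib.parse.urlencode(params)}") as resp:
            data = json.load(resp)
        files += data.get("files", [])
        page_token = data.get("nextPageToken")
        if not page_token:
            return files

def missing_names(remote_names, local_dir):
    """Names present remotely but not yet mirrored locally."""
    local = set(os.listdir(local_dir)) if os.path.isdir(local_dir) else set()
    return sorted(set(remote_names) - local)

def mirror(folder_id, local_dir, api_key):
    """Recursively download anything new, picking up new daily subfolders."""
    os.makedirs(local_dir, exist_ok=True)
    children = list_children(folder_id, api_key)
    wanted = set(missing_names([c["name"] for c in children], local_dir))
    for child in children:
        path = os.path.join(local_dir, child["name"])
        if child["mimeType"] == FOLDER_MIME:
            mirror(child["id"], path, api_key)  # recurse into subfolders
        elif child["name"] in wanted:
            url = f"{API}/{child['id']}?alt=media&key={api_key}"
            urllib.request.urlretrieve(url, path)

if __name__ == "__main__":
    # Placeholder IDs -- substitute your own folder ID and API key.
    mirror("PUBLIC_FOLDER_ID", "mirror", "YOUR_API_KEY")
```

Run it from cron or Task Scheduler every few hours; existing files are skipped, so each run only fetches what's new. Note this sketch skips files whose names already exist locally, so it won't pick up in-place edits to existing files.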

hey, try using rclone with a cron job to check for new subfolders. it's worked fine for me for keeping an offline sync updated. hope that helps ya!
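Roughly like this, assuming a Drive remote named `gdrive` already set up via `rclone config` (the remote name, folder ID, and local path are placeholders):

```shell
# Skip gracefully on machines without rclone installed.
command -v rclone >/dev/null || { echo "rclone not installed"; exit 0; }

# One-off sync of the public folder into a local mirror; rclone sync
# picks up any new daily subfolders automatically.
rclone sync --drive-root-folder-id PUBLIC_FOLDER_ID gdrive: "$HOME/drive-mirror"

# Crontab entry (crontab -e) to re-run the sync every 3 hours:
# 0 */3 * * * rclone sync --drive-root-folder-id PUBLIC_FOLDER_ID gdrive: $HOME/drive-mirror
```

`--drive-root-folder-id` points the remote at the shared folder's ID (the long string in its URL), so you don't need the folder in your own Drive.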