Hey everyone! I’m pretty new to automation tools and hoping someone can help me out here.
I’m building an Airtable database to track business documents like quotes, orders, and bills. These files are stored in shared Google Drive folders with a specific structure.
What I want to do: Automatically monitor Google Drive folders and sync file information to Airtable records.
My folder setup: Main Drive “Quotes” → Year folders (like 2025) → Category folders (Services OR Planning) → Individual client folders
Airtable columns I need: Quote ID, Document Name, Client Name, Year, Category, File Link, Google File ID, File Type, Created Date, Updated Date
Problems I’m running into:
The file watcher only works when I specify exact paths like /2025/Services, but I need it to monitor everything from the top level.
When I try to watch all files, I get an error saying it needs a trigger, even though I set up the date-modified trigger.
I can’t figure out how to pull the Client/Year/Category info from the folder path automatically.
I tried using a search function with parent-folder queries but keep getting errors.
Has anyone built something similar? Looking for any tips or existing templates that might help with this Google Drive to Airtable workflow.
Make.com (formerly Integromat) is perfect for this. Their Google Drive module watches folders recursively out of the box - just point it at your parent folder and it’ll monitor everything below it. It also splits folder paths into separate fields automatically, so you won’t need regex or custom parsing like the other solutions here.
Been running a similar setup for our client docs since early last year. That recursive folder monitoring issue is super common - most automation tools just can’t handle deeply nested Drive structures.

Here’s what actually worked: I ditched real-time monitoring and went with Google Drive API polling plus a simple database cache. Run a script every 30 minutes that grabs files modified since the last check - way more reliable. The trick is building a local map of all your folder IDs to their full paths during initial setup.

For extracting client/year/category data, I parse the folder hierarchy by walking up from each file using the parents field. Don’t try reconstructing paths from folder names alone - those change all the time.

One thing that’ll save you major headaches: always use the Drive file ID as your primary key in Airtable, never the filename. Filenames change constantly but IDs stay consistent, which prevents duplicate records when clients rename their quotes three times before finalizing (and they always do).
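A rough sketch of that folder-map idea. In real use you’d build the cache from the Drive API (listing folders and their parents once at startup); here it’s a hard-coded stand-in, and all the IDs and names are made up. The field names mirror Drive v3 file metadata (`id`, `name`, `parents`).

```python
# Cached map built once at startup: folder_id -> (name, parent_id).
# None marks the top-level "Quotes" folder. All values here are illustrative.
FOLDER_CACHE = {
    "fld_quotes": ("Quotes", None),
    "fld_2025": ("2025", "fld_quotes"),
    "fld_services": ("Services", "fld_2025"),
    "fld_acme": ("Acme Corp", "fld_services"),
}

def resolve_path(folder_id, cache):
    """Walk up the parents chain to rebuild the full folder path."""
    parts = []
    while folder_id is not None:
        name, parent = cache[folder_id]
        parts.append(name)
        folder_id = parent
    return "/".join(reversed(parts))

# A Drive file's metadata carries a "parents" list; its first entry
# is the folder the file lives in.
file_meta = {"id": "file_123", "name": "Q-1042.pdf", "parents": ["fld_acme"]}
print(resolve_path(file_meta["parents"][0], FOLDER_CACHE))
# -> Quotes/2025/Services/Acme Corp
```

The win is that renaming a folder only invalidates one cache entry instead of every stored path string.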
Built something similar for invoice tracking about six months back. Skip the polling - Google Drive’s push notifications are way more reliable than watchers or scheduled runs. Just register a webhook endpoint and Google pings you whenever files change in your monitored folders. You get file metadata in the notification, and you can pull the folder hierarchy from the file’s parent chain. For parsing folder structure, I went simple - stored the full path as a string and used indexOf to find year and category segments. File moves are the annoying part since they show up as both delete and create events. Ended up building an hourly reconciliation process to catch anything that slipped through.
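That reconciliation pass can be a plain set comparison. A hypothetical sketch, assuming you can fetch the current Drive files and your existing Airtable records as dicts keyed by file ID (the shapes and names here are illustrative, not any library’s API):

```python
def reconcile(drive_files, airtable_records):
    """drive_files / airtable_records: dicts keyed by Drive file ID,
    each value holding at least a "path" string.
    Returns IDs to create, update (path changed, e.g. a move), and delete."""
    drive_ids = set(drive_files)
    stored_ids = set(airtable_records)
    to_create = drive_ids - stored_ids
    to_delete = stored_ids - drive_ids
    to_update = {fid for fid in drive_ids & stored_ids
                 if drive_files[fid]["path"] != airtable_records[fid]["path"]}
    return to_create, to_update, to_delete
```

Because moves surface as a path change on an unchanged file ID, this catches them even when the delete/create webhook pair was mishandled.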
Went through this exact same thing two years ago automating our project docs.
Ditch the file watchers - they’re garbage with deep folders and Google Drive’s API limits will screw you over.
What actually works:
Hit Google Drive API’s files.list endpoint with a query on modified time. One gotcha: an “in parents” filter only matches a folder’s direct children, not descendants, so for a nested tree it’s easier to query by modifiedTime alone and filter the results against your folder map. Run it every 15 minutes with cron.
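A sketch of what building that polling query might look like. The timestamp is the RFC 3339 form Drive’s query language expects for modifiedTime comparisons; the function name and cadence are made up for illustration:

```python
from datetime import datetime, timezone

def build_query(last_check: datetime) -> str:
    """Drive v3 files.list 'q' string: everything modified since the last run,
    excluding trashed files. Recursion is handled separately via the folder map."""
    ts = last_check.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S")
    return f"modifiedTime > '{ts}' and trashed = false"

print(build_query(datetime(2025, 1, 10, 12, 0, 0, tzinfo=timezone.utc)))
# -> modifiedTime > '2025-01-10T12:00:00' and trashed = false
```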
For folder paths, grab the parents array from each file and walk backwards through the structure. Store folder ID to name mappings so you’re not constantly hitting the API for parent lookups.
Extract client/year/category with regex on the full path string after you rebuild it. Something like /(?<year>\d{4})/(?<category>Services|Planning)/(?<client>[^/]+)/ works great.
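One wrinkle if you do this in Python: named groups are spelled `(?P<name>...)` there, versus `(?<name>...)` in JavaScript. A quick sketch against a made-up path:

```python
import re

# Same pattern as above, in Python's named-group syntax.
PATH_RE = re.compile(r"/(?P<year>\d{4})/(?P<category>Services|Planning)/(?P<client>[^/]+)/")

m = PATH_RE.search("/Quotes/2025/Services/Acme Corp/Q-1042.pdf")
print(m.group("year"), m.group("category"), m.group("client"))
# -> 2025 Services Acme Corp
```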
Built this exact system for our compliance docs - it’s been rock solid for 18 months. Handles 500 files daily across 12 nested levels no problem.
One heads up - Google Drive sometimes sends duplicate events for the same file mod. Keep a hash table of file ID + modified timestamp to filter dupes before writing to Airtable.
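That dedupe filter can be as small as a set keyed on (file ID, modified timestamp). A minimal sketch, with made-up IDs and timestamps:

```python
def make_deduper():
    """Returns a predicate that is True only the first time a given
    (file_id, modified_time) pair is seen."""
    seen = set()

    def is_new(file_id, modified_time):
        key = (file_id, modified_time)
        if key in seen:
            return False  # duplicate event - skip the Airtable write
        seen.add(key)
        return True

    return is_new

is_new = make_deduper()
print(is_new("file_123", "2025-01-10T12:00:00Z"))  # first event
print(is_new("file_123", "2025-01-10T12:00:00Z"))  # duplicate
print(is_new("file_123", "2025-01-10T12:05:00Z"))  # genuinely new revision
# -> True, False, True
```

For a long-running process you’d want to persist or prune the set, but the idea is the same.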
Honestly, all these custom API solutions are way more work than needed. Why build and maintain code when automation platforms handle this out of the box?
I set up this exact workflow last month for our contract management system. Same nested folders, same Airtable sync.
The trick? Use a platform that actually gets folder hierarchies instead of fighting basic triggers. You want something that monitors folders recursively and extracts path data without writing regex patterns.
For parsing your folder structure, you need a tool that splits file paths automatically and maps those segments to Airtable fields. Most platforms choke on this, but some handle it natively.
That file watching problem happens because you’re probably using tools that treat each folder as a separate trigger. You need one automation that monitors the parent folder and processes all changes regardless of depth.
I got our system running in 30 minutes with zero custom code. Pulls all the metadata you listed, handles file moves properly, and hasn’t missed a single document in two months.
Latenode handles exactly this with their Google Drive integration. The folder monitoring actually works with nested structures and path parsing is built in.
you’re overcomplicating this. had the same problem and switched to zapier instead of building custom watchers. works much better with nested folders and grabs metadata automatically. costs around $20/month but saves all the api headaches
Had the same problem when I built my document tracker last year. Google Apps Script was the game changer - way better than external automation tools. Just set up a script that runs on a timer, scans your drive structure, and uses split() to pull year/category/client data from folder paths.

For tracking file changes, I store the last-modified timestamp in a separate sheet and compare it each run to catch new or updated files. The script writes directly to Airtable through their API, which cuts out all that trigger headache you’re dealing with. Takes about an hour to set up, handles nested folders like a champ, and doesn’t cost anything to run.
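If you want to prototype that logic before moving it into Apps Script (which is JavaScript, using DriveApp and a spreadsheet for state), the split() parsing and timestamp comparison boil down to something like this Python stand-in; all names here are illustrative:

```python
def parse_path(path):
    """Pull year/category/client from a .../<year>/<category>/<client>/<file> path."""
    parts = path.strip("/").split("/")
    return parts[-4], parts[-3], parts[-2]

def find_changed(current, last_seen):
    """current and last_seen map file ID -> modified-timestamp string
    (this run vs. the prior run). Returns IDs that are new or changed."""
    return {fid for fid, ts in current.items() if last_seen.get(fid) != ts}

print(parse_path("Quotes/2025/Services/Acme Corp/Q-1042.pdf"))
# -> ('2025', 'Services', 'Acme Corp')
```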