How to use relative file paths with mounted Google Drive in Colab notebooks

I’m working on a Colab notebook that needs to be shared with other people. The problem is that I’m using absolute paths to access files in my mounted Google Drive, which means the code breaks when others run it from different folder locations.

Right now I have to write full paths like /content/drive/MyDrive/Projects/data_analysis/files/datasets/csv_files but I want to use relative paths like ./files/datasets/csv_files since my notebook sits inside the /content/drive/MyDrive/Projects/data_analysis folder.

I thought about using os.chdir() or maybe pathlib to handle this better, but I’m not sure how to dynamically find the notebook’s current location without hardcoding another absolute path.

Has anyone figured out a clean way to make Colab notebooks portable when working with Google Drive files? I need something that works regardless of where users place the notebook in their own Drive folders.

Any suggestions would be really helpful!

I tried __file__ to get the notebook location, but Colab doesn’t handle it well. Here’s what works for me: I put a config cell at the top with PROJECT_ROOT = '/content/drive/MyDrive/Projects/data_analysis', then use os.path.join(PROJECT_ROOT, 'files/datasets/csv_files') throughout the code. When teammates need to adapt it, they just change one line instead of hunting down every hardcoded path.
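A minimal sketch of that config-cell pattern, using the project path from the question (the `project_path` helper is just a convenience I added, not part of the original answer):

```python
import os

# The ONE line teammates edit to match their own Drive layout.
PROJECT_ROOT = '/content/drive/MyDrive/Projects/data_analysis'

def project_path(*parts):
    """Build an absolute path under PROJECT_ROOT."""
    return os.path.join(PROJECT_ROOT, *parts)

# Everywhere else in the notebook, paths are built from the root:
csv_dir = project_path('files', 'datasets', 'csv_files')
```

Passing the components separately to `os.path.join` (instead of a pre-joined `'files/datasets/csv_files'` string) keeps the separators consistent, but either form works.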

Had the exact same problem on ML projects with my team. Here’s what worked: use glob patterns with a search function instead of hardcoding paths. I search for file patterns from the drive root - like glob.glob('/content/drive/MyDrive/**/my_notebook.ipynb', recursive=True) - then grab the parent directory and build relative paths from there. Works great when teammates organize their Drive differently or bury projects in subfolders. Just search for landmarks like unique filenames instead of assuming everyone uses the same folder structure. Takes extra setup but kills all the path headaches.
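A rough version of that landmark search, wrapped in a function so the search root and landmark name aren't hardcoded (the function name and signature are my own, not from the original post):

```python
import glob
import os

def find_project_root(landmark, search_root='/content/drive/MyDrive'):
    """Recursively search for a uniquely named landmark file and
    return its parent directory as the project root."""
    pattern = os.path.join(search_root, '**', landmark)
    matches = glob.glob(pattern, recursive=True)
    if not matches:
        raise FileNotFoundError(f'{landmark} not found under {search_root}')
    return os.path.dirname(matches[0])

# In Colab, after drive.mount('/content/drive'):
# root = find_project_root('my_notebook.ipynb')
# csv_dir = os.path.join(root, 'files', 'datasets', 'csv_files')
```

One caveat: recursive globbing over a large Drive can be slow, so it helps to narrow `search_root` when you know roughly where projects live.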

I always set up my working directory right after mounting Drive. First I check where I am with os.getcwd(), mount Drive, then jump straight to my project folder using os.chdir('/content/drive/MyDrive/Projects/data_analysis'). From there I can use relative paths like ‘files/datasets/csv_files’ without any dot-slash mess. The trick is doing this directory change in the first cell after mounting. When teammates run the notebook, they just update that one path to match their setup. I throw in a comment explaining which folder the notebook needs to run from. This works great across different projects. People just need to keep the same folder structure relative to the notebook location, but that’s easy since we’re sharing the whole project folder anyway.
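The mount-then-chdir setup described above might look like this as a first cell (the `enter_project` helper is my addition; the Colab mount lines are commented out because `google.colab` only exists inside a Colab runtime):

```python
import os

def enter_project(project_root):
    """Change the working directory to the project folder so that
    relative paths like 'files/datasets/csv_files' resolve there."""
    os.chdir(project_root)
    return os.getcwd()

# In Colab, run this first:
# from google.colab import drive
# drive.mount('/content/drive')
#
# Teammates update this single path to match their setup:
# enter_project('/content/drive/MyDrive/Projects/data_analysis')
#
# From then on, plain relative paths just work:
# df = pd.read_csv('files/datasets/csv_files/example.csv')
```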

File path issues in Colab are just the start of your problems with shared notebooks. Every time someone runs your code, you’re hoping their Drive setup matches yours exactly.

I ditched this headache entirely and switched to Latenode for file processing. Instead of passing around broken notebooks, I build automated workflows that handle everything.

Here’s my process: create a Latenode workflow that connects to Google Drive, processes files automatically, and dumps results in a shared folder. Team members don’t touch code - they just drop data files in and get results back.

Set up different scenarios for different analyses. Each handles its own file structure and logic. New team members connect their Drive account and it works immediately.

File watchers are the real game changer. Drop a dataset in Drive and analysis runs automatically. No more broken imports or path errors.

Been fighting this exact headache for years. The path issue gets worse when you’re working with teams across different projects.

I ditched Colab’s path limitations and moved everything to Latenode. You can build automated workflows that handle file processing without caring where people store their files.

I built a workflow that grabs files from Google Drive using triggers, processes them, and dumps results back to specific folders. Team members just drop files in a shared Drive folder and everything runs automatically. No broken paths, no code changes.

Best part? You can create templates that work for anyone. New person joins the project, they connect their Google Drive to the scenario and it works immediately. Zero path fixing.

You can also set up file watching triggers that auto-run your analysis when new datasets hit Drive. Way cleaner than passing notebooks around with broken paths.

Yeah, this is a super common problem with shared Colab notebooks. Here’s what works for me - use pathlib.Path.cwd() to grab your current directory, then walk up through the parent folders until you hit your project folder. I drop a .project_root file in my main project directory so the script knows when it’s found the right spot. Once you’ve got that root location, just build all your paths relative to it. This works great because Colab keeps the same directory structure, and your teammates will have identical folder layouts in their Drive. They just need to run the notebook from somewhere inside the project structure - the code does the rest automatically. Way better than hardcoding paths or making people manually change directories.
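A sketch of that marker-file walk-up, assuming an empty `.project_root` file sits in the project folder (function name and signature are mine, not from the original post):

```python
from pathlib import Path

def find_root(marker='.project_root', start=None):
    """Walk up from the current directory (or `start`) until a folder
    containing the marker file is found; that folder is the project root."""
    current = Path(start or Path.cwd()).resolve()
    for folder in [current, *current.parents]:
        if (folder / marker).exists():
            return folder
    raise FileNotFoundError(f'no {marker} found above {current}')

# Usage: drop an empty .project_root file in the project folder, then
# root = find_root()
# csv_dir = root / 'files' / 'datasets' / 'csv_files'
```

Because the search only walks upward, the notebook can live anywhere inside the project tree and still find the root, which is exactly the portability the question asks for.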