Fetching multiple CSV files from Google Drive using PHP for database import

I have a large amount of data spread across multiple Google Sheets. When I tried to merge everything into one sheet for analysis, I ran into Google’s size limits.

To work around this, I created an Apps Script that exports all my data into about 30 separate CSV files. Now I need to pull these CSV files from Google Drive and get them into my MySQL database using PHP.

What’s the best approach for downloading all these CSV files from Google Drive to my server so my PHP script can process them into the database?

I found some code that might help but I’m having trouble adapting it:

function fetchFile($httpClient, $fileUrl, $exportType = null) {
  // Pull the OAuth token off the (old GData-style) client and send it as a
  // Bearer header on a plain file_get_contents() request.
  $authToken = $httpClient->getAuthClient()->getToken();
  $requestOptions = array(
    'http' => array(
      'method' => 'GET',
      'header' => "GData-Version: 3.0\r\n".
                  "Authorization: Bearer $authToken\r\n"
    )
  );
  // Append the export format only when one was actually requested.
  if ($exportType != null) {
    $fileUrl = $fileUrl . "&format=$exportType";
  }
  return file_get_contents($fileUrl, false, stream_context_create($requestOptions));
}

// Setup Google Drive client
$driveClient = new Google_Service_Drive($client);
$fileList = $driveClient->files->listFiles();
$downloadUrl = $fileList->items[0]->exportLinks['text/csv'];
$csvData = fetchFile($client, $downloadUrl, 'csv');
echo 'File data: ' . $csvData;

Any help would be awesome!

Hey! You’re using the old v2 API with exportLinks - that’s deprecated now. Switch to the v3 Drive API and use the export method instead. Try $driveService->files->export($fileId, 'text/csv') for grabbing CSV files.
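Something like this is the general shape, assuming you’re on the current google/apiclient library, $client is already authorized with a Drive scope, and $fileId is one of your files (those names are just placeholders):

$driveService = new Google_Service_Drive($client);

// For a native Google Sheet, export() converts it to CSV on the fly.
// 'alt' => 'media' tells the client to return the file contents directly.
$response = $driveService->files->export($fileId, 'text/csv', array('alt' => 'media'));
$csvData = (string) $response->getBody();

// If your Apps Script already saved plain .csv files to Drive, use get()
// with 'alt' => 'media' instead of export():
// $response = $driveService->files->get($fileId, array('alt' => 'media'));

file_put_contents('/tmp/export.csv', $csvData);

Note that export() only works on Google-native formats; regular CSV files stored on Drive go through get() with alt=media.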

I dealt with something similar last year. Don’t process files one by one - you’ll hit API rate limits. Instead, batch your requests in chunks of 5-10 files with small delays between each batch. Download everything to temp storage first, then process sequentially into your database. This prevents memory issues with large datasets and gives you better error handling when files fail. Don’t forget to refresh your auth token - processing 30 files takes time and your token can expire partway through.
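Rough sketch of what I mean (untested; assumes Drive API v3 via google/apiclient, an authorized $client, and that $fileIds holds your ~30 file IDs - all placeholder names):

$driveService = new Google_Service_Drive($client);
$batches = array_chunk($fileIds, 5);   // 5-10 file IDs per batch

foreach ($batches as $batch) {
  // Refresh the access token if it's about to expire mid-run.
  if ($client->isAccessTokenExpired()) {
    $client->fetchAccessTokenWithRefreshToken($client->getRefreshToken());
  }

  foreach ($batch as $fileId) {
    // The exported files are plain CSVs, so get() with alt=media returns the raw contents.
    $response = $driveService->files->get($fileId, array('alt' => 'media'));
    file_put_contents("/tmp/csv_import/{$fileId}.csv", (string) $response->getBody());
  }

  sleep(2);   // small pause between batches to stay under rate limits
}

// Once everything is on disk, load each file into MySQL one at a time,
// e.g. with LOAD DATA LOCAL INFILE or prepared row inserts.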

That code’s definitely outdated and won’t work with the current Google Drive API. I dealt with this exact problem about 6 months ago when we migrated our reporting system. What worked best was using Drive API v3 with proper file filtering to grab only CSV files: query the Drive API with mimeType='text/csv' to get just your exported files instead of pulling everything.

For downloads, add solid error handling - network timeouts are super common with multiple large files. I saved each CSV with a timestamp prefix to avoid naming conflicts and built a simple queue to track which files processed successfully vs. which downloads failed. The thing that saved me the most headaches was validating each downloaded CSV before trying to insert into the database - corrupted downloads happen way more than you’d think.
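Here’s roughly what that looked like, trimmed down (take it as a sketch, not production code - assumes google/apiclient with an authorized $client, and /tmp/csv_import is just an example path):

$driveService = new Google_Service_Drive($client);

// Only list the exported CSV files, not everything in Drive.
$result = $driveService->files->listFiles(array(
  'q'      => "mimeType='text/csv' and trashed=false",
  'fields' => 'files(id, name)',
));

foreach ($result->getFiles() as $file) {
  try {
    $response  = $driveService->files->get($file->getId(), array('alt' => 'media'));
    $localPath = '/tmp/csv_import/' . date('Ymd_His') . '_' . $file->getName();
    file_put_contents($localPath, (string) $response->getBody());

    // Validate before touching the database: the file must parse as CSV
    // and every row needs the same column count as the header.
    $rows = array_map('str_getcsv', file($localPath, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));
    if (empty($rows) || count(array_unique(array_map('count', $rows))) !== 1) {
      error_log('Skipping malformed CSV: ' . $file->getName());
      continue;
    }

    // ...insert $rows into MySQL here (e.g. PDO prepared statements)...
  } catch (Exception $e) {
    error_log('Download failed for ' . $file->getName() . ': ' . $e->getMessage());
  }
}

With only ~30 files one page of results is enough; for more you’d loop on nextPageToken.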