C# - How to download Google Drive files in chunks for poor network conditions?

I’m working on a project where the network is really bad. My app needs to download Google Drive files for users. It works fine with small files but keeps failing on bigger ones, around 9 MB. I’ve tried to download in smaller chunks but can’t figure out how to do it with the Google Drive API.

I’ve looked at the docs and tried this code:

var request = DriveService.Files.Export(fileId, exportMimeType);
var message = request.CreateRequest();
var client = request.Service.HttpClient;

client.DefaultRequestHeaders.Range = 
    message.Headers.Range = 
    new RangeHeaderValue(0, 1000);

var response = await client.SendAsync(message);

if (response.IsSuccessStatusCode)
{
    using (var fs = File.Create(localFilePath))
    {
        await response.Content.CopyToAsync(fs);
    }
}

But it’s not working as expected: the downloaded file is still full-size, or the request fails outright for the big file. I also tried setting ExportRequest.MediaDownloader.ChunkSize, but it only seems to affect how often progress updates fire.

Any ideas on how to download these files in smaller pieces? Thanks for your help!

Hey Bob, have you tried using the resumable download feature? It lets you download in chunks and resume if interrupted. Check out the ‘Download Files’ section in the Google Drive API docs. You might need to implement some retry logic too for your bad network. Good luck!
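To make that concrete: if you’re on the Google.Apis.Drive.v3 client library, the downloader already fetches media in chunks under the hood. A minimal sketch, assuming `service` is an already-authorized DriveService and `fileId`/`localFilePath` are placeholders you supply:

```csharp
// Sketch only — assumes an authorized DriveService; fileId and
// localFilePath are placeholders, not real values.
var request = service.Files.Get(fileId);
request.MediaDownloader.ChunkSize = 256 * 1024; // request ~256 KB per HTTP call

request.MediaDownloader.ProgressChanged += progress =>
    Console.WriteLine($"{progress.Status}: {progress.BytesDownloaded} bytes");

using (var fs = File.Create(localFilePath))
{
    var result = await request.DownloadAsync(fs);
    if (result.Status != Google.Apis.Download.DownloadStatus.Completed)
        Console.WriteLine($"Download failed: {result.Exception?.Message}");
}
```

Note this uses Files.Get (a binary download) rather than Files.Export; whether export responses honor smaller chunks the same way is worth verifying against the docs for your case.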

As someone who’s dealt with finicky networks, I feel your pain, Bob. Have you considered using the HTTP client’s GetStreamAsync method combined with a custom stream? This approach lets you control the chunk size and implement retry logic.

Here’s a rough idea:

const int chunkSize = 1024 * 1024; // 1MB read buffer
using var httpClient = new HttpClient();

// downloadUrl is a placeholder — for Drive it must be an authorized
// download URL (e.g. with an OAuth bearer token on the request).
using var stream = await httpClient.GetStreamAsync(downloadUrl);
using var fileStream = File.Create(localFilePath);

byte[] buffer = new byte[chunkSize];
int bytesRead;
// ReadAsync may return fewer bytes than requested; loop until 0 (end of stream)
while ((bytesRead = await stream.ReadAsync(buffer, 0, buffer.Length)) > 0)
{
    await fileStream.WriteAsync(buffer, 0, bytesRead);
}

This method gives you more control over the download process. You can wrap it in a retry mechanism to handle network hiccups. Adjust the chunk size based on your network conditions. Remember to handle exceptions and implement proper disposal of resources. Good luck with your project!
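For the retry mechanism, here’s one possible shape — a hypothetical helper I’m sketching myself, not part of any library — that retries an async action with exponential backoff on transient network errors:

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

// Hypothetical helper (the name and signature are my own invention):
// retries an async action with exponential backoff on transient failures.
static async Task RetryAsync(Func<Task> action, int maxAttempts = 3)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            await action();
            return;
        }
        catch (Exception ex) when (ex is HttpRequestException or IOException)
        {
            if (attempt >= maxAttempts) throw;
            // back off 1s, 2s, 4s, ... before the next attempt
            await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt - 1)));
        }
    }
}
```

You’d wrap each piece of the transfer in it, e.g. `await RetryAsync(() => DownloadChunkAsync(start, end));`, where `DownloadChunkAsync` stands in for whatever does the actual transfer.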

I’ve encountered similar issues with unreliable networks and found that a more controlled download process can really help. One approach is to fetch the file metadata to determine its total size and then download smaller parts sequentially using the ‘Range’ header. By doing so, you get a chunked download that can better handle interruptions. Error handling and retry logic are essential in this process, allowing you to recover from failures without restarting the entire download. Adjusting the chunk size based on network conditions also plays a crucial role in building a robust solution.
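A minimal sketch of that sequential Range-header approach, with placeholder values — you’d take `totalSize` from the file’s metadata (the files.get "size" field), and the HttpClient would need your OAuth bearer token attached. Note this assumes the URL honors Range requests, which binary downloads generally do but export endpoints may not:

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;

// All of these are placeholders, not real values:
var downloadUrl = "https://example.com/file"; // authorized Drive download URL
var localFilePath = "download.bin";
long totalSize = 9_000_000;                   // from the file's metadata
const long chunkSize = 512 * 1024;            // 512 KB; tune for your network

using var httpClient = new HttpClient();      // attach OAuth credentials here
using var output = File.Create(localFilePath);

for (long start = 0; start < totalSize; start += chunkSize)
{
    long end = Math.Min(start + chunkSize, totalSize) - 1; // inclusive end

    using var msg = new HttpRequestMessage(HttpMethod.Get, downloadUrl);
    msg.Headers.Range = new RangeHeaderValue(start, end);

    using var response = await httpClient.SendAsync(msg);
    response.EnsureSuccessStatusCode();       // expect 206 Partial Content
    await response.Content.CopyToAsync(output);
}
```

Each iteration is a small, independent request, so wrapping the body of the loop in retry logic lets a failure cost you one chunk instead of the whole 9 MB file.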