Can BigQuery serve as a data cache for Google Sheets add-on formulas?

I’m working on a Google Sheets add-on with custom formulas that pull data from an API. The problem is we’re hitting the UrlFetch quota a lot. I’m thinking about using BigQuery as a data cache to help with this.

Here’s what I want to do:

  1. When a formula runs, check BigQuery first
  2. If the data isn’t there, get it from the API
  3. Store the API result in BigQuery for next time

I tried setting this up, but I’m running into permission issues. When I test a BigQuery query in a custom function, I get an error about missing authentication. When I try from the sidebar, it says I don’t have permission to create BigQuery jobs.
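
For reference, here's a simplified version of the custom function I'm attempting (project, dataset, and table names are placeholders):

```javascript
/**
 * Returns a cached value from BigQuery if present, otherwise falls back to the API.
 * Uses the BigQuery advanced service; all names below are placeholders.
 *
 * @customfunction
 */
function MYDATA(key) {
  var projectId = 'my-cache-project';
  var request = {
    query: 'SELECT payload FROM `my-cache-project.cache.api_results` WHERE key = @key LIMIT 1',
    useLegacySql: false,
    parameterMode: 'NAMED',
    queryParameters: [{
      name: 'key',
      parameterType: { type: 'STRING' },
      parameterValue: { value: key }
    }]
  };

  // This call is where the "missing authentication" error shows up when run as
  // a custom function, and the jobs-permission error when run from the sidebar.
  var result = BigQuery.Jobs.query(request, projectId);
  if (result.rows && result.rows.length) {
    return result.rows[0].f[0].v;
  }

  // Cache miss: hit the API directly (this is what eats the UrlFetch quota).
  var payload = UrlFetchApp.fetch('https://api.example.com/data?key=' + encodeURIComponent(key))
      .getContentText();
  // (Write-back to BigQuery omitted here for brevity.)
  return payload;
}
```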

Is there a way to let all my add-on users access the BigQuery project without setting up individual permissions? Or am I barking up the wrong tree here?

Any advice on how to cache data for custom formulas would be super helpful. Thanks!

Hey John, I’ve used CacheService for similar stuff. It works great for smaller datasets, and there’s no need to mess with BigQuery permissions. Just remember it’s temporary storage (6 hours max per entry). If you need longer-lived storage, maybe look into Firebase? It’s easier to set up than BigQuery, IMO. Good luck!
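
Here's roughly the pattern I mean (the API URL is a placeholder):

```javascript
// Minimal sketch of the CacheService approach. Values are limited to roughly
// 100 KB per key and expire after at most 6 hours (21600 seconds).
function getCachedData(key) {
  var cache = CacheService.getScriptCache();
  var hit = cache.get(key);
  if (hit !== null) {
    return hit; // cache hit: no UrlFetch call needed
  }
  // Cache miss: fetch from the API (placeholder URL) and store the result.
  var value = UrlFetchApp.fetch('https://api.example.com/data?key=' + encodeURIComponent(key))
      .getContentText();
  cache.put(key, value, 21600); // 21600 seconds is the maximum expiration
  return value;
}
```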

I’ve dealt with similar API quota issues in my Sheets add-ons before. While BigQuery is powerful, it can be overkill for caching. Have you considered using Cloud Datastore? It’s more lightweight and easier to set up for this kind of task.

Here’s what worked for me (rough sketch after the list):

  1. Enable Cloud Datastore in your Google Cloud project.
  2. In your custom formula, first check Datastore for the data.
  3. If it’s not there, fetch from the API and store in Datastore.
  4. Set an appropriate TTL on the Datastore entries.
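
To make steps 2 and 3 concrete, here's a rough sketch using the Datastore REST API from Apps Script. The project ID, the `ApiCache` kind, and the API URL are placeholders, and I'm using the user's OAuth token (which needs the `https://www.googleapis.com/auth/datastore` scope in the manifest) to keep it short; a service-account token would slot into the same header:

```javascript
// Placeholders: swap in your own Cloud project ID.
var PROJECT_ID = 'my-cache-project';
var DATASTORE_URL = 'https://datastore.googleapis.com/v1/projects/' + PROJECT_ID;

// Small helper for authenticated calls to the Datastore REST API.
function datastoreRequest_(method, payload) {
  var response = UrlFetchApp.fetch(DATASTORE_URL + ':' + method, {
    method: 'post',
    contentType: 'application/json',
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() },
    payload: JSON.stringify(payload)
  });
  return JSON.parse(response.getContentText());
}

function getCachedOrFetch(key) {
  // Step 2: look up the cache entity by key.
  var lookup = datastoreRequest_('lookup', {
    keys: [{ path: [{ kind: 'ApiCache', name: key }] }]
  });
  if (lookup.found && lookup.found.length) {
    return lookup.found[0].entity.properties.payload.stringValue;
  }

  // Step 3a: cache miss, fetch from the API (placeholder URL).
  var value = UrlFetchApp.fetch('https://api.example.com/data?key=' + encodeURIComponent(key))
      .getContentText();

  // Step 3b: upsert the result so the next call is a cache hit. A TTL policy
  // on cachedAt (or a manual age check) can cover step 4.
  datastoreRequest_('commit', {
    mode: 'NON_TRANSACTIONAL',
    mutations: [{
      upsert: {
        key: { path: [{ kind: 'ApiCache', name: key }] },
        properties: {
          payload: { stringValue: value, excludeFromIndexes: true },
          cachedAt: { timestampValue: new Date().toISOString() }
        }
      }
    }]
  });
  return value;
}
```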

This approach solved my quota issues without the complexity of BigQuery. You’ll need to set up a service account, but it’s straightforward. The best part? It scales well as your user base grows.

Remember to implement some local caching in your add-on too. It can significantly reduce the number of external calls you need to make. Hope this helps!

I’ve tackled similar challenges before, and here’s what worked for me: instead of BigQuery, consider using Google Cloud Storage (GCS) as your caching layer. It’s more straightforward to set up, and permissions are easier to manage.

Here’s a quick overview (sketch after the list):

  1. Create a GCS bucket for your cache.
  2. When a formula runs, check GCS first.
  3. If not found, fetch from the API and store the result in GCS.
  4. Set appropriate expiration policies on the bucket.
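
Here's a minimal sketch of that flow, assuming a bucket named `my-addon-cache` and using `ScriptApp.getOAuthToken()` (with the `https://www.googleapis.com/auth/devstorage.read_write` scope in the manifest) to keep it short; in a published add-on you'd likely swap in a service-account token so individual users don't need GCS access. The API URL is a placeholder:

```javascript
var BUCKET = 'my-addon-cache'; // placeholder bucket name

// Read a cached object from GCS; returns null on a cache miss.
function gcsGet_(objectName) {
  var url = 'https://storage.googleapis.com/storage/v1/b/' + BUCKET +
      '/o/' + encodeURIComponent(objectName) + '?alt=media';
  var response = UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() },
    muteHttpExceptions: true // a 404 just means the object isn't cached yet
  });
  return response.getResponseCode() === 200 ? response.getContentText() : null;
}

// Write (or overwrite) a cached object in GCS.
function gcsPut_(objectName, contents) {
  var url = 'https://storage.googleapis.com/upload/storage/v1/b/' + BUCKET +
      '/o?uploadType=media&name=' + encodeURIComponent(objectName);
  UrlFetchApp.fetch(url, {
    method: 'post',
    contentType: 'application/json',
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() },
    payload: contents
  });
}

function getCachedOrFetch(key) {
  var cached = gcsGet_(key + '.json');
  if (cached !== null) {
    return cached;
  }
  // Cache miss: fetch from the API (placeholder URL) and store the result.
  var value = UrlFetchApp.fetch('https://api.example.com/data?key=' + encodeURIComponent(key))
      .getContentText();
  gcsPut_(key + '.json', value); // bucket lifecycle rules handle expiration (step 4)
  return value;
}
```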

This approach sidesteps the BigQuery permission headaches. You’ll need to set up a service account with GCS access for your add-on; that gives all your users access indirectly, without individual permissions. Implement proper error handling, and consider a local cache in the add-on to further reduce API calls. Good luck with your project!