Making API calls from Google Sheets to retrieve email addresses

I’m working on a project where I need to get email addresses automatically in Google Sheets. Here’s what I have so far:

My current setup:

  • Column A contains contact names
  • Column B has company domains
  • Column C should show the email addresses (this is what I need help with)

What I’m trying to do:
I want to use an email finder API that requires a POST request with name and domain parameters. The API returns JSON data with either the email address or an error message.

API requirements:

  • Endpoint needs POST method
  • Parameters: contact_name, company_domain, auth_token
  • Returns JSON response

What I’ve tried:

  1. Used Zapier webhooks - works but can’t write results back to the sheet
  2. Tried some Google Apps Script examples but couldn’t figure out how to structure the POST request properly
  3. Looked into importJSON but it seems to be for GET requests only

My question:
How can I create a Google Apps Script function that reads the name and domain from my sheet, makes a POST request to the API, and writes the email result back to column C? I’m not experienced with coding, so a simple example would be really helpful.

Is there a way to do this automatically for each new row that gets added?

Sure! You can create an Apps Script custom function that uses UrlFetchApp.fetch() to make the POST request, passing the name and domain in the payload. Then call it in your sheet as =getEmail(A2, B2). To run it automatically when new rows are added, set up an installable trigger.
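
A minimal sketch of such a custom function, assuming a hypothetical endpoint `https://api.example.com/find-email` and a JSON response shaped like `{"email": "..."}` or `{"error": "..."}` — check your provider's docs for the real URL and response format:

```javascript
// Hypothetical sketch: custom function, usable in a cell as =getEmail(A2, B2).
// The endpoint URL and response shape are assumptions; the parameter names
// (contact_name, company_domain, auth_token) come from the API requirements above.
function getEmail(name, domain) {
  var payload = {
    contact_name: name,
    company_domain: domain,
    auth_token: 'YOUR_API_TOKEN' // better: store this in Script Properties
  };
  var options = {
    method: 'post',
    contentType: 'application/json',
    payload: JSON.stringify(payload),
    muteHttpExceptions: true // return error responses instead of throwing
  };
  try {
    var response = UrlFetchApp.fetch('https://api.example.com/find-email', options);
    var data = JSON.parse(response.getContentText());
    return data.email || data.error || 'No result';
  } catch (e) {
    return 'Error: ' + e.message;
  }
}
```

Custom functions are allowed to use URL Fetch, so this works directly from a cell; just be aware each cell recalculation re-runs the call.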

Google Apps Script with batch processing is your best bet. Don’t make individual API calls for each row - collect all the name-domain pairs first, then process them in batches. This avoids timeouts and saves your API quota.

Add a menu item to run the script manually, or use time-driven triggers instead of onEdit. API calls are slow and onEdit will choke.
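
A sketch of both options — the menu item and the time-driven trigger. The function name `findEmailsBatch` is a placeholder for whatever batch routine you write:

```javascript
// Adds a custom menu so the batch can be run manually from the sheet.
// 'findEmailsBatch' is a hypothetical name for your batch function.
function onOpen() {
  SpreadsheetApp.getUi()
    .createMenu('Email Finder')
    .addItem('Process all rows', 'findEmailsBatch')
    .addToUi();
}

// One-time setup: run this once to create an hourly time-driven trigger.
function createHourlyTrigger() {
  ScriptApp.newTrigger('findEmailsBatch')
    .timeBased()
    .everyHours(1)
    .create();
}
```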

For the POST request, stringify your JSON payload properly and don’t forget the content-type header. Add a status column to track which rows you’ve processed - especially with large datasets. You can pick up where you left off after failures without starting over.

One thing to note: UrlFetchApp.fetch() is synchronous, so there's no async handling to worry about in Apps Script. The real constraint is the script execution time limit, which is another reason batching with UrlFetchApp.fetchAll() (which issues requests in parallel) beats looping over fetch().
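
Here's a sketch of that batch approach, putting the pieces together: stringified JSON payload with a content-type header, parallel requests via UrlFetchApp.fetchAll(), and a status flag. The endpoint URL and response shape are assumptions, and I'm assuming the status column is D:

```javascript
// Hypothetical batch sketch: reads names/domains from A:B, sends all pending
// requests in one parallel UrlFetchApp.fetchAll() call, then writes emails to
// column C and a status flag to column D so reruns skip finished rows.
// Endpoint, response shape, and the column-D status convention are assumptions.
function findEmailsBatch() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var lastRow = sheet.getLastRow();
  if (lastRow < 2) return; // nothing but the header row
  var rows = sheet.getRange(2, 1, lastRow - 1, 4).getValues(); // A:D, skip header

  var requests = [];
  var targets = []; // indexes into `rows` that still need processing
  rows.forEach(function (row, i) {
    if (row[0] && row[1] && row[3] !== 'done') { // skip empty or finished rows
      targets.push(i);
      requests.push({
        url: 'https://api.example.com/find-email',
        method: 'post',
        contentType: 'application/json',
        payload: JSON.stringify({
          contact_name: row[0],
          company_domain: row[1],
          auth_token: 'YOUR_API_TOKEN'
        }),
        muteHttpExceptions: true
      });
    }
  });
  if (!requests.length) return;

  var responses = UrlFetchApp.fetchAll(requests); // parallel, one round trip
  responses.forEach(function (resp, j) {
    var i = targets[j];
    try {
      var data = JSON.parse(resp.getContentText());
      rows[i][2] = data.email || data.error || 'No result';
      rows[i][3] = 'done';
    } catch (e) {
      rows[i][2] = 'Error: ' + e.message;
      rows[i][3] = 'failed'; // rerun the batch later to retry failed rows
    }
  });
  sheet.getRange(2, 1, rows.length, 4).setValues(rows); // one bulk write
}
```

Reading and writing the whole range in one go (getValues/setValues) is also much faster than touching cells one at a time.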

I had the same need last year and built a custom Google Apps Script function. The trick is using UrlFetchApp.fetch() with the right payload formatting. Here's what worked for me:

Make a function that takes name and domain as parameters, builds the POST request with 'application/json' headers, and drops your auth token in the payload. Parse the JSON response and grab just the email field. You can call this function straight from your spreadsheet cells.

For auto-processing new rows, I added an installable onEdit trigger (note that a simple onEdit trigger can't call UrlFetchApp, so you have to install the trigger explicitly) that watches columns A and B for new data, then fills column C automatically. This handled about 200 API calls daily without hitting rate limits.

The trickiest part was handling API errors - definitely add try-catch blocks so failed API calls don't break your entire sheet.
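
A sketch of that installable edit handler. `onRowEdit` and `getEmail` are hypothetical names — `getEmail` stands in for whatever function you write to do the POST request — and the handler must be installed via Triggers > Add Trigger (or ScriptApp.newTrigger(...).forSpreadsheet(...).onEdit().create()) rather than relying on the simple onEdit hook:

```javascript
// Hypothetical sketch of an installable edit handler: when a row gets a name
// in column A and a domain in column B, it fills column C with the API result.
// `getEmail(name, domain)` is a hypothetical helper that performs the POST.
function onRowEdit(e) {
  var range = e.range;
  var sheet = range.getSheet();
  var row = range.getRow();
  if (row < 2 || range.getColumn() > 2) return; // only react to edits in A or B

  var name = sheet.getRange(row, 1).getValue();
  var domain = sheet.getRange(row, 2).getValue();
  var email = sheet.getRange(row, 3).getValue();
  if (name && domain && !email) { // both inputs present, result not yet filled
    sheet.getRange(row, 3).setValue(getEmail(name, domain));
  }
}
```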