I’m still learning about working with APIs and JSON data. I managed to get a single record working with this approach:
public static async Task SubmitSingleEntry()
{
    using (var client = new HttpClient())
    {
        using (var apiRequest = new HttpRequestMessage(new HttpMethod("POST"), "https://api.airtable.com/v0/BASEID/TableName"))
        {
            apiRequest.Headers.TryAddWithoutValidation("Authorization", "Bearer TOKEN");
            apiRequest.Content = new StringContent("{\"records\": [{ \"fields\": { \"Link\": \"https://example.com/feed.rss\" }}]}");
            apiRequest.Content.Headers.ContentType = MediaTypeHeaderValue.Parse("application/json");
            var result = await client.SendAsync(apiRequest);
        }
    }
}
But now I need to handle multiple rows at once. I built these classes to structure the data:
using System.Collections.Generic;
using System.Text.Json.Serialization;

namespace MyApp
{
    public class FeedSubmissionModel
    {
        public class Properties
        {
            [JsonPropertyName("Link")]
            [JsonInclude]
            public string Link { get; set; }
        }

        public class Entry
        {
            [JsonPropertyName("fields")]
            [JsonInclude]
            public Properties fields { get; set; }
        }

        public class Container
        {
            [JsonPropertyName("records")]
            [JsonInclude]
            public List<Entry> records { get; set; }
        }
    }
}
Your classes are close to what bulk operations need. One caveat: keep the “id” field out of your JSON when creating new records - only send it for updates, since Airtable generates record IDs itself.
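If you later add an “id” property to Entry for updates, one way to keep it out of create payloads is to serialize it only when it has a value. A minimal sketch, assuming .NET 5+ for JsonIgnoreCondition; the Id property is my addition, not part of your classes:

public class Entry
{
    // Hypothetical addition: only populated for update calls.
    // WhenWritingNull keeps "id" out of the JSON when creating records.
    [JsonPropertyName("id")]
    [JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
    public string Id { get; set; }

    [JsonPropertyName("fields")]
    [JsonInclude]
    public Properties fields { get; set; }
}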
Here’s my approach for bulk submissions:
public static async Task<bool> BulkSubmitFeeds(List<string> feedUrls)
{
    // Airtable accepts at most 10 records per request, so chunk into batches of 10
    var batches = feedUrls.Select((url, index) => new { url, index })
                          .GroupBy(x => x.index / 10)
                          .Select(g => g.Select(x => x.url).ToList());

    foreach (var batch in batches)
    {
        var payload = new FeedSubmissionModel.Container
        {
            records = batch.Select(url => new FeedSubmissionModel.Entry
            {
                fields = new FeedSubmissionModel.Properties { Link = url }
            }).ToList()
        };

        var json = JsonSerializer.Serialize(payload);

        // SendBatch wraps the HTTP POST - sketch below
        var success = await SendBatch(json);
        if (!success) return false;

        // Rate limiting - Airtable gets cranky
        await Task.Delay(250);
    }
    return true;
}
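SendBatch isn't defined above; here's a minimal sketch of what it could look like, modeled on the single-record request from the question. The shared static HttpClient is my choice, to avoid socket exhaustion when sending many batches:

private static readonly HttpClient client = new HttpClient();

private static async Task<bool> SendBatch(string json)
{
    using (var apiRequest = new HttpRequestMessage(HttpMethod.Post, "https://api.airtable.com/v0/BASEID/TableName"))
    {
        apiRequest.Headers.TryAddWithoutValidation("Authorization", "Bearer TOKEN");
        apiRequest.Content = new StringContent(json);
        apiRequest.Content.Headers.ContentType = MediaTypeHeaderValue.Parse("application/json");

        var response = await client.SendAsync(apiRequest);
        return response.IsSuccessStatusCode;
    }
}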
Chunk your data first - exceeding the 10-record limit gets the entire request rejected, not just the extra rows. The delay between batches saved me from rate limiting during heavy imports.
Strip out null or empty URLs before building your payload. Airtable validation errors are a pain to debug.
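For example, a quick cleanup pass before chunking - the Uri.IsWellFormedUriString check is optional extra strictness:

var cleanUrls = feedUrls
    .Where(url => !string.IsNullOrWhiteSpace(url))
    .Where(url => Uri.IsWellFormedUriString(url, UriKind.Absolute))
    .Distinct()
    .ToList();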
Good points in the previous answer, but here are some gotchas I hit when building something similar.

First, handle the Airtable response correctly - it returns the IDs of the created records, which you'll probably need later. Don't include the “id” field when creating new records; Airtable generates IDs automatically, and it's only for updates.

Watch out for error handling too. Airtable can reject individual records in a batch when validation fails, so you've got to parse the response carefully. Add proper exception handling around SendAsync and check the status code - don't assume it worked.

Last tip: if you're processing large datasets, add delays between batches. Airtable caps you at 5 requests per second per base, so rapid batches get throttled fast.
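To make that concrete, here's a minimal sketch of sending a batch and reading back the created IDs. The response classes mirror Airtable's documented create response (a “records” array where each object carries an “id”); the class and method names are mine:

public class CreatedRecord
{
    [JsonPropertyName("id")]
    public string Id { get; set; }
}

public class CreateResponse
{
    [JsonPropertyName("records")]
    public List<CreatedRecord> Records { get; set; }
}

private static async Task<List<string>> SendBatchAndReadIds(HttpClient client, string json)
{
    using (var apiRequest = new HttpRequestMessage(HttpMethod.Post, "https://api.airtable.com/v0/BASEID/TableName"))
    {
        apiRequest.Headers.TryAddWithoutValidation("Authorization", "Bearer TOKEN");
        apiRequest.Content = new StringContent(json);
        apiRequest.Content.Headers.ContentType = MediaTypeHeaderValue.Parse("application/json");

        try
        {
            var response = await client.SendAsync(apiRequest);
            if (!response.IsSuccessStatusCode)
            {
                // A 422 here usually means one of the records failed validation
                return null;
            }

            var body = await response.Content.ReadAsStringAsync();
            var parsed = JsonSerializer.Deserialize<CreateResponse>(body);
            return parsed.Records.Select(r => r.Id).ToList();
        }
        catch (HttpRequestException)
        {
            // Network-level failure; the caller decides whether to retry
            return null;
        }
    }
}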
You’re manually building JSON strings instead of using proper serialization. Your class structure looks fine - you just need to populate the objects and let JsonSerializer build the payload.
Here’s how to handle multiple records:
public static async Task SubmitMultipleEntries(List<string> links)
{
    var container = new FeedSubmissionModel.Container
    {
        records = new List<FeedSubmissionModel.Entry>()
    };

    foreach (string link in links)
    {
        container.records.Add(new FeedSubmissionModel.Entry
        {
            fields = new FeedSubmissionModel.Properties { Link = link }
        });
    }

    string jsonPayload = JsonSerializer.Serialize(container);

    using (var client = new HttpClient())
    {
        using (var apiRequest = new HttpRequestMessage(new HttpMethod("POST"), "https://api.airtable.com/v0/BASEID/TableName"))
        {
            apiRequest.Headers.TryAddWithoutValidation("Authorization", "Bearer TOKEN");
            apiRequest.Content = new StringContent(jsonPayload);
            apiRequest.Content.Headers.ContentType = MediaTypeHeaderValue.Parse("application/json");
            var result = await client.SendAsync(apiRequest);
        }
    }
}
Just remember Airtable caps batch requests at 10 records, so you’ll need to chunk bigger datasets.
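On .NET 6 or later, Enumerable.Chunk handles the splitting; a quick sketch reusing the method above (the 250 ms delay mirrors the earlier answer's rate-limit tip):

foreach (var batch in links.Chunk(10))
{
    await SubmitMultipleEntries(batch.ToList());
    await Task.Delay(250); // stay under Airtable's 5 requests/second cap
}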