I have basic experience with REST APIs and JSON but am still figuring things out. Currently I can send one record at a time to Airtable using their API, but I need to submit multiple records in a single request.
Here’s my working code for single record submission:
public static async Task SubmitSingleEntry()
{
    using (var client = new HttpClient())
    {
        using (var req = new HttpRequestMessage(HttpMethod.Post, "https://api.airtable.com/v0/BASEID/Content"))
        {
            req.Headers.TryAddWithoutValidation("Authorization", "Bearer TOKEN");
            req.Content = new StringContent("{\"records\": [{ \"fields\": { \"Link\": \"https://feeds.example.com/podcast/feed.xml\" }}]}");
            req.Content.Headers.ContentType = MediaTypeHeaderValue.Parse("application/json");
            var result = await client.SendAsync(req);
        }
    }
}
I created these classes to handle the data structure:
using System.Collections.Generic;
using System.Text.Json.Serialization;

namespace MyApp
{
    public class ContentModel
    {
        public class DataFields
        {
            [JsonPropertyName("Link")]
            public string Link { get; set; }
        }

        public class DataRecord
        {
            [JsonPropertyName("fields")]
            public DataFields fields { get; set; }
        }

        public class BatchData
        {
            [JsonPropertyName("records")]
            public List<DataRecord> records { get; set; }
        }
    }
}
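For what it's worth, these classes already serialize to the exact shape the hand-written JSON string uses. A quick check, assuming the ContentModel classes above are compiled in the same project:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

class SerializationCheck
{
    static void Main()
    {
        // Build one record, mirroring the hand-written payload in the question
        var batch = new MyApp.ContentModel.BatchData
        {
            records = new List<MyApp.ContentModel.DataRecord>
            {
                new MyApp.ContentModel.DataRecord
                {
                    fields = new MyApp.ContentModel.DataFields
                    {
                        Link = "https://feeds.example.com/podcast/feed.xml"
                    }
                }
            }
        };

        // Prints: {"records":[{"fields":{"Link":"https://feeds.example.com/podcast/feed.xml"}}]}
        Console.WriteLine(JsonSerializer.Serialize(batch));
    }
}
```

The JsonPropertyName attributes are what guarantee the property names match what Airtable expects regardless of how you name the C# members.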
Hit this exact problem last week! Make sure you add using System.Text.Json; at the top for JsonSerializer, and using System.Linq; if you use Select - otherwise it won't compile. Airtable's rate limits can bite you too, so throw in a short delay between batch requests when you're pushing through tons of data.
Your model looks solid. The tricky part is wiring it all together with proper serialization.
I’ve done similar bulk operations with various APIs - here’s how to make it work:
public static async Task SubmitMultipleEntries(List<string> urls)
{
    var batchData = new ContentModel.BatchData
    {
        records = urls.Select(url => new ContentModel.DataRecord
        {
            fields = new ContentModel.DataFields { Link = url }
        }).ToList()
    };

    string jsonPayload = JsonSerializer.Serialize(batchData);

    using (var client = new HttpClient())
    {
        using (var req = new HttpRequestMessage(HttpMethod.Post, "https://api.airtable.com/v0/BASEID/Content"))
        {
            req.Headers.TryAddWithoutValidation("Authorization", "Bearer TOKEN");
            req.Content = new StringContent(jsonPayload);
            req.Content.Headers.ContentType = MediaTypeHeaderValue.Parse("application/json");
            var result = await client.SendAsync(req);
        }
    }
}
Call it like this:
var urls = new List<string>
{
    "https://feeds.example.com/podcast1/feed.xml",
    "https://feeds.example.com/podcast2/feed.xml"
};
await SubmitMultipleEntries(urls);
Watch out though - Airtable caps batch requests at 10 records max. Got more URLs? You’ll need to chunk them.
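For the chunking, something like this works; ChunkBy is a helper I'm sketching here, not part of any library:

```csharp
using System;
using System.Collections.Generic;

public static class BatchHelper
{
    // Split a list into chunks of at most `size` items (10 for Airtable's batch cap)
    public static IEnumerable<List<T>> ChunkBy<T>(List<T> items, int size)
    {
        for (int i = 0; i < items.Count; i += size)
        {
            yield return items.GetRange(i, Math.Min(size, items.Count - i));
        }
    }
}
```

Then loop over the chunks, with a short delay between requests to stay under the rate limit:

```csharp
foreach (var chunk in BatchHelper.ChunkBy(urls, 10))
{
    await SubmitMultipleEntries(chunk);
    await Task.Delay(250); // breathing room between batches
}
```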
Performance matters big time with large datasets. Don't create a new HttpClient for each batch - that's a rookie mistake. Set it up once and reuse it throughout the whole operation. HttpClient is built for reuse, and spinning up a new instance per request can exhaust your socket connections. Also, implement exponential backoff when you hit rate limits: Airtable responds with HTTP 429 when you exceed its request rate, which is your signal to wait before retrying. I learned this the hard way on a migration project with thousands of records to process. The JsonSerializer approach works great, just make sure you dispose your HttpClient properly or use dependency injection to handle the lifecycle.
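To make the reuse-plus-backoff idea concrete, here's a rough sketch; the BASEID/TOKEN placeholders come from the question, and the retry cap and delay curve are arbitrary choices you'd tune:

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class AirtableClient
{
    // One shared HttpClient for the whole run, so sockets get reused
    private static readonly HttpClient Client = new HttpClient();

    static AirtableClient()
    {
        Client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "TOKEN");
    }

    public static async Task<HttpResponseMessage> PostBatchAsync(string json, int maxRetries = 5)
    {
        for (int attempt = 0; ; attempt++)
        {
            var response = await Client.PostAsync(
                "https://api.airtable.com/v0/BASEID/Content",
                new StringContent(json, Encoding.UTF8, "application/json"));

            // HTTP 429 = rate limited; back off exponentially (1s, 2s, 4s, ...) and retry
            if (response.StatusCode != (HttpStatusCode)429 || attempt >= maxRetries)
            {
                return response;
            }

            await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt)));
        }
    }
}
```

Note there's no using block around the shared client - it's intentionally kept alive for the life of the process, which is exactly the lifecycle DI containers manage for you.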
I’ve been using Airtable’s API for a while and batch request errors really threw me at first. When you send multiple records and one fails, the whole batch doesn’t crash - Airtable processes what it can and tells you which records had issues in the response. You need to parse that response to catch partial failures. Also, wrap your JsonSerializer.Serialize call in try-catch because bad data will crash it. The 10 record limit is right, but your chunking logic needs error recovery too. If chunk 3 out of 10 fails, just retry that chunk instead of restarting everything.
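One way to inspect the response for errors - the "error" property name here is based on the error shape I've seen Airtable return, so treat it as an assumption and check against the actual bodies you get back:

```csharp
using System;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

public static class ResponseChecker
{
    // Success bodies contain a "records" array; failures carry an "error" object
    public static async Task<bool> CheckAsync(HttpResponseMessage response)
    {
        string body = await response.Content.ReadAsStringAsync();

        using (var doc = JsonDocument.Parse(body))
        {
            if (doc.RootElement.TryGetProperty("error", out var error))
            {
                // Log the error payload so you know which chunk to retry
                Console.WriteLine($"Airtable error: {error}");
                return false;
            }
        }

        return response.IsSuccessStatusCode;
    }
}
```

If CheckAsync returns false for a chunk, requeue just that chunk rather than starting the whole run over.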