I need help with copying data from one column to another column in my Airtable base. My table has over 10,000 rows and I want to process them in groups of 50 records at a time using the Scripting block.
My current script has issues. It only processes one record when I use await, but without await it stops after copying about 15 records. Here’s what I’m working with:
```javascript
let mainTable = base.getTable('Companies');
let gridView = mainTable.getView('Main View');
let queryResult = await gridView.selectRecordsAsync();
let allRecords = queryResult.records;

processRecordsInBatches(allRecords);

async function processRecordsInBatches(recordList) {
    let counter = 0;
    while (counter < recordList.length) {
        const currentBatch = recordList.slice(counter, counter + 50);
        for (let item of currentBatch) {
            let originalData = item.getCellValue('CompanyName');
            await mainTable.updateRecordAsync(item, { 'BackupName': originalData });
        }
        counter += 50;
    }
}
```
What’s causing this behavior and how can I fix it to process all records properly?
You’re hitting Airtable’s rate limits. Your script updates records one at a time in a loop, which fires off too many API calls and gets throttled around 15 requests. I hit this same issue last year during a data migration.

You need to batch your updates at the API level, not just process them in batches. Swap out your inner loop for mainTable.updateRecordsAsync() and feed it an array of updates. Here’s the fix:

```javascript
async function processRecordsInBatches(recordList) {
    let counter = 0;
    while (counter < recordList.length) {
        const currentBatch = recordList.slice(counter, counter + 50);
        const updates = currentBatch.map(item => ({
            id: item.id,
            fields: { 'BackupName': item.getCellValue('CompanyName') }
        }));
        await mainTable.updateRecordsAsync(updates);
        counter += 50;
    }
}
```

Now you’re making one API call per 50 records instead of 50 separate calls. Way more efficient and won’t trigger the rate limits.
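One thing worth noting: updateRecordsAsync accepts at most 50 records per call, so a 50-record batch is already the maximum. A minimal sketch of the call site, using the same table and view names from the question:

```javascript
// Same table/view names as in the question.
let mainTable = base.getTable('Companies');
let queryResult = await mainTable.getView('Main View').selectRecordsAsync();

// Wait for every batch to finish before the script ends.
await processRecordsInBatches(queryResult.records);
```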
Had the same weird issue doing bulk updates last month. Your table/view setup might be the problem - tons of computed fields or lookups can slow things down massively. Switch to a simpler view with just the fields your script needs. Also, Airtable’s scripting block hits memory limits around 10k records, so break it into smaller chunks if batching doesn’t work.
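If the view is heavy, you can also limit the query itself to just the fields the script reads; a small sketch, assuming the field names from the question:

```javascript
// Load only the field the script reads; fields you only write don't need to be loaded.
let queryResult = await gridView.selectRecordsAsync({
    fields: ['CompanyName'],
});
```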
This happens because Airtable’s scripting environment struggles with async operations in loops. When you use await like this, the script can time out or lose context halfway through, especially with large datasets. I’ve hit the same issue migrating legacy data across multiple bases.

Jack81’s right about using updateRecordsAsync, but you need error handling and progress logging too. Wrap your batch processing in try-catch blocks and add console.log statements to track what’s happening. Also throw in a small delay between batches: await new Promise(resolve => setTimeout(resolve, 100)). This gives the scripting environment breathing room. I’ve used it on tables with 15k+ records and it prevents timeouts. The key is treating each batch as one atomic operation instead of updating records individually.
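A sketch of what that could look like, layered on Jack81’s batched version (the 100 ms pause and the logging format are arbitrary choices, not Airtable requirements):

```javascript
async function processRecordsInBatches(recordList) {
    let counter = 0;
    while (counter < recordList.length) {
        const currentBatch = recordList.slice(counter, counter + 50);
        const updates = currentBatch.map(item => ({
            id: item.id,
            fields: { 'BackupName': item.getCellValue('CompanyName') },
        }));

        try {
            // One API call per batch of up to 50 records.
            await mainTable.updateRecordsAsync(updates);
        } catch (error) {
            // Surface which batch failed so it can be retried or inspected.
            console.log(`Batch starting at record ${counter} failed: ${error}`);
            throw error;
        }

        counter += 50;
        console.log(`Updated ${Math.min(counter, recordList.length)} of ${recordList.length} records`);

        // Short pause between batches to give the environment breathing room.
        await new Promise(resolve => setTimeout(resolve, 100));
    }
}
```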
Your script structure is the problem. You’re calling processRecordsInBatches(allRecords) without await, so the function starts but your main script doesn’t wait for it to finish.
I hit this exact issue copying product data in our inventory system. The script would just die while the async function was still running.
Change this:

```javascript
processRecordsInBatches(allRecords);
```

To this:

```javascript
await processRecordsInBatches(allRecords);
```
That fixes the execution flow, but you’ll still hit rate limits with individual updates like others mentioned.
For the full fix, combine the await with batch updates:
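Something along these lines (a sketch combining both fixes, with the same table and field names as the question; the batching mirrors Jack81’s answer):

```javascript
let mainTable = base.getTable('Companies');
let gridView = mainTable.getView('Main View');
let queryResult = await gridView.selectRecordsAsync();

// Await the whole run so the script doesn't end while updates are still pending.
await processRecordsInBatches(queryResult.records);

async function processRecordsInBatches(recordList) {
    let counter = 0;
    while (counter < recordList.length) {
        const currentBatch = recordList.slice(counter, counter + 50);
        const updates = currentBatch.map(item => ({
            id: item.id,
            fields: { 'BackupName': item.getCellValue('CompanyName') },
        }));

        // updateRecordsAsync takes at most 50 records per call.
        await mainTable.updateRecordsAsync(updates);
        counter += 50;
    }
}
```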