I need help with a script to copy data from one field to another in my Airtable. The table has over 10,000 records and I want to process them in batches of 50.
Here’s what I’ve tried:
const myTable = base.getTable('Shops');
const gridView = myTable.getView('Main view');
const queryResult = await gridView.selectRecordsAsync();
const allRecords = queryResult.records;
batchUpdateFields(allRecords);
async function batchUpdateFields(allRecords) {
    for (let i = 0; i < allRecords.length; i += 50) {
        const batch = allRecords.slice(i, i + 50);
        for (const item of batch) {
            const originalData = item.getCellValue('ShopName');
            await myTable.updateRecordAsync(item, { 'ShopLogo': originalData });
        }
    }
}
The script only updates the first record, or stops after 15 records if I remove the await. What am I doing wrong, and how can I make it process all the records?
Your approach is on the right track, but there are two problems. First, you call batchUpdateFields(allRecords) without awaiting it, so the script reaches the end of the top-level code and terminates before the updates finish — that’s why only the first record changes. Second, updating records one at a time is slow; updateRecordsAsync accepts up to 50 records per call, which matches your batch size exactly. Here’s a modified version that should work better:
let table = base.getTable('Shops');
let query = await table.selectRecordsAsync({fields: ['ShopName', 'ShopLogo']});
let records = query.records;
for (let i = 0; i < records.length; i += 50) {
    let updates = records.slice(i, i + 50).map(record => ({
        id: record.id,
        fields: {
            'ShopLogo': record.getCellValue('ShopName')
        }
    }));
    await table.updateRecordsAsync(updates);
}
This fetches the whole table once (selectRecordsAsync loads all records in one go, so there’s no need to paginate), then walks through the results 50 at a time, awaiting each updateRecordsAsync call before starting the next batch. Remember to test this on a small subset first before running it on your entire dataset.
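The batching itself is plain array arithmetic and can be tested outside Airtable. Here’s a minimal sketch of the slice-into-batches logic in ordinary JavaScript — the chunk helper and the fake record ids are my own illustration; 50 is Airtable’s per-call limit for updateRecordsAsync:

```javascript
// Split an array into consecutive chunks of at most `size` elements.
function chunk(items, size) {
    const batches = [];
    for (let i = 0; i < items.length; i += size) {
        batches.push(items.slice(i, i + size));
    }
    return batches;
}

// 10,050 fake record ids, batched the same way the script batches updates.
const ids = Array.from({ length: 10050 }, (_, i) => `rec${i}`);
const batches = chunk(ids, 50);
console.log(batches.length);      // → 201
console.log(batches[200].length); // → 50 (10,050 divides evenly into 201 batches)
```

For a table of 10,000+ records this means roughly 200 API calls instead of 10,000, which is where the speedup comes from.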
I’ve tackled similar challenges with large datasets in Airtable before. One key thing I learned is to lean on the updateRecordsAsync method for batch processing — it takes up to 50 records per call and is significantly more efficient than updating records individually.
Here’s an approach that worked well for me:
let table = base.getTable('Shops');
let query = await table.selectRecordsAsync();
let records = query.records;
for (let i = 0; i < records.length; i += 50) {
    let updates = records.slice(i, i + 50).map(record => ({
        id: record.id,
        fields: { 'ShopLogo': record.getCellValue('ShopName') }
    }));
    await table.updateRecordsAsync(updates);
    console.log(`Updated ${Math.min(i + 50, records.length)} of ${records.length} records`);
}
This loads all the records up front, then updates them 50 at a time, awaiting each batch before starting the next, with a progress line so you can watch it work. I’ve found this much faster and more reliable for large datasets than per-record updates.
Remember to test on a small subset first and consider adding error handling for robustness. Good luck with your project!
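On the error-handling point: one common pattern is retrying a failed batch once before giving up, so a transient network hiccup doesn’t abort a long run. Here’s a sketch using a hypothetical in-memory stand-in for the update call — flakyUpdateRecordsAsync and updateWithRetry are illustrative names of mine, not Airtable’s API:

```javascript
// Hypothetical stand-in for table.updateRecordsAsync: fails the first
// time it sees a given batch, then succeeds on the retry.
const seen = new Set();
async function flakyUpdateRecordsAsync(updates) {
    const key = updates[0].id;
    if (!seen.has(key)) {
        seen.add(key);
        throw new Error(`transient failure on batch starting at ${key}`);
    }
}

// Retry a batch up to `attempts` times; rethrow if it still fails.
async function updateWithRetry(updates, attempts = 2) {
    for (let attempt = 1; attempt <= attempts; attempt++) {
        try {
            await flakyUpdateRecordsAsync(updates);
            return true;
        } catch (err) {
            if (attempt === attempts) throw err; // out of retries
        }
    }
}

(async () => {
    const batch = [{ id: 'rec0', fields: { ShopLogo: 'Shop 0' } }];
    const ok = await updateWithRetry(batch);
    console.log(ok); // → true (first call fails, the retry succeeds)
})();
```

In a real script you would swap flakyUpdateRecordsAsync for table.updateRecordsAsync(updates) and keep the retry wrapper as-is.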
hey there! i’ve dealt with similar issues before. instead of updating records one by one, call updateRecordsAsync() on each batch of 50 — it’s way faster. one gotcha: selectRecordsAsync loads the whole table in one go, so there’s no cursor or offset to manage; just slice the result. quick example:
let table = base.getTable('Shops');
let query = await table.selectRecordsAsync();
for (let i = 0; i < query.records.length; i += 50) {
    let updates = query.records.slice(i, i + 50)
        .map(r => ({id: r.id, fields: {'ShopLogo': r.getCellValue('ShopName')}}));
    await table.updateRecordsAsync(updates);
}
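One last sanity check worth doing before a 10,000-record run: drive the batching loop against a fake submitter that only records batch sizes, to confirm nothing ever exceeds the 50-record limit. This is plain JavaScript with no Airtable API; processAll is an illustrative name of mine:

```javascript
// Walk records 50 at a time, recording each batch size instead of
// calling a real API; the push stands in for table.updateRecordsAsync.
async function processAll(records) {
    const batchSizes = [];
    for (let i = 0; i < records.length; i += 50) {
        const batch = records.slice(i, i + 50);
        if (batch.length > 50) throw new Error('over the 50-record limit');
        batchSizes.push(batch.length);
    }
    return batchSizes;
}

(async () => {
    const records = Array.from({ length: 123 }, (_, i) => ({ id: `rec${i}` }));
    console.log(await processAll(records)); // → [ 50, 50, 23 ]
})();
```

Every batch is full except the last, which picks up the remainder — exactly the shape you want before pointing the loop at real data.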