Need help with Scripting App: Bulk field copying in large Airtable dataset

I’m stuck with a scripting task in Airtable. I want to copy data from one field to another in the same table. The catch is, I’ve got over 10,000 records to deal with!

I tried using the Scripting App to do this in batches of 50, but something’s not right. Here’s what I’ve come up with:

const dataSheet = base.getTable('Customers');
const mainView = dataSheet.getView('Default view');
const queryResult = await mainView.selectRecordsAsync();
const allData = queryResult.records;

processBulkData(allData);

async function processBulkData(data) {
  let index = 0;
  while (index < data.length) {
    const chunk = data.slice(index, index + 50);
    for (let item of chunk) {
      let originalInfo = item.getCellValue('CustomerName');
      await dataSheet.updateRecordAsync(item, { 'CustomerCode': originalInfo });
    }
    index += 50;
  }
}

The problem is, it only copies the first record. If I take out the ‘await’, it manages to copy about 15 records before stopping. Any ideas what I’m doing wrong? Help would be much appreciated!

hey sofiag, i've run into similar issues before. try the batch method, updateRecordsAsync, instead of calling updateRecordAsync once per record. it's way faster for large datasets. inside your loop, something like:

let updates = chunk.map(item => ({
  id: item.id,
  fields: {'CustomerCode': item.getCellValue('CustomerName')}
}));
await dataSheet.updateRecordsAsync(updates);

just keep each chunk at 50 records or fewer, since that's the per-call limit. should handle your 10k records no prob. lmk if it works!

I've dealt with similar large-scale data operations in Airtable before. Your approach is close, but two things are tripping it up: processBulkData(allData) is never awaited, so the script can finish before your updates complete, and updating 10,000 records one at a time is slow even when it does run. Instead of updating records individually, use the batch method updateRecordsAsync, which accepts up to 50 records per call. That drastically reduces the number of API calls and speeds up the process.

Here’s a tweaked version of your script that should work better:

const dataSheet = base.getTable('Customers');
const mainView = dataSheet.getView('Default view');
const queryResult = await mainView.selectRecordsAsync();
const allData = queryResult.records;

async function processBulkData(data) {
  // updateRecordsAsync accepts at most 50 records per call, so work in 50-record chunks
  for (let i = 0; i < data.length; i += 50) {
    const chunk = data.slice(i, i + 50);
    const updates = chunk.map(record => ({
      id: record.id,
      fields: { 'CustomerCode': record.getCellValue('CustomerName') }
    }));
    await dataSheet.updateRecordsAsync(updates);
  }
}

// awaiting the call keeps the script running until every batch has been written
await processBulkData(allData);

This should handle your 10,000 records much more efficiently. The key differences are batching the writes with updateRecordsAsync instead of calling updateRecordAsync once per record, and awaiting the top-level call so the script doesn't end before the updates finish. Give it a try and let us know how it goes.
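If you want some visibility while it runs (10,000 records is roughly 200 batches), here's an optional variation of the same loop. It's just a sketch that assumes the same table and field names as above and that both fields are plain text; it skips records whose CustomerCode already matches and prints progress with output.text:

async function processBulkDataWithProgress(data) {
  let written = 0;
  for (let i = 0; i < data.length; i += 50) {
    const updates = data
      .slice(i, i + 50)
      // only touch records where the target field doesn't already hold the source value
      .filter(record => record.getCellValue('CustomerCode') !== record.getCellValue('CustomerName'))
      .map(record => ({
        id: record.id,
        fields: { 'CustomerCode': record.getCellValue('CustomerName') }
      }));
    if (updates.length > 0) {
      await dataSheet.updateRecordsAsync(updates);
      written += updates.length;
    }
    output.text(`Processed ${Math.min(i + 50, data.length)} of ${data.length} records (${written} updated)`);
  }
}

await processBulkDataWithProgress(allData);

This also makes reruns cheap, since records that were already copied are filtered out before the write.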
