Apps Script timeout when importing large database results to spreadsheet

I’m having trouble with my Google Apps Script timing out when I try to import data from a MySQL database into a Google Sheet. The database connection and query execution work fine and finish in under 10 seconds, but writing the roughly 12,000 result rows to the spreadsheet is extremely slow, and the script hits the execution time limit before it completes.

var dbHost = 'server.address.com';
var dbPort = 3306;
var database = 'my_database';
var user = 'db_user';
var pass = 'db_password';
var connectionString = 'jdbc:mysql://' + dbHost + ':' + dbPort + '/' + database;

function importDatabaseData() {
  var connection = Jdbc.getConnection(connectionString, user, pass);
  var statement = connection.createStatement();
  var queryResult = statement.executeQuery('SELECT * FROM my_table');
  var columnInfo = queryResult.getMetaData();
  var totalColumns = columnInfo.getColumnCount();
  var activeSheet = SpreadsheetApp.getActiveSpreadsheet();
  var targetSheet = activeSheet.getSheetByName('data_import');
  targetSheet.clear();
  var rowData = [];
  
  for (var i = 0; i < totalColumns; i++) {
    rowData.push(columnInfo.getColumnName(i + 1));
  }
  
  targetSheet.appendRow(rowData);
  
  while (queryResult.next()) {
    rowData = [];
    for (var i = 0; i < totalColumns; i++) {
      rowData.push(queryResult.getString(i + 1));
    }
    targetSheet.appendRow(rowData);
  }
  
  queryResult.close();
  statement.close();
  targetSheet.autoResizeColumns(1, totalColumns);
}

What’s the best way to optimize this for better performance?

Here’s another trick: use batch operations with getRange().setValues(), but also split your import across multiple triggers. Set up a time-driven trigger that runs every few minutes and processes the chunks one by one, and store your progress in PropertiesService so you can pick up where you left off if it times out again.
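
Roughly what that resumable pattern could look like. The importNextChunk function name, the importOffset property key, the 2000-row chunk size, and the LIMIT/OFFSET paging are all illustrative choices, not something from the original script; it also reuses the connection variables from the question:

function setupImportTrigger() {
  // Time-driven trigger that calls the chunk processor every 5 minutes.
  ScriptApp.newTrigger('importNextChunk')
    .timeBased()
    .everyMinutes(5)
    .create();
}

function importNextChunk() {
  var props = PropertiesService.getScriptProperties();
  var offset = Number(props.getProperty('importOffset') || 0); // rows already imported
  var CHUNK_SIZE = 2000; // illustrative batch size

  var connection = Jdbc.getConnection(connectionString, user, pass);
  var statement = connection.createStatement();
  // LIMIT/OFFSET paging is one simple way to fetch a slice of the table.
  var queryResult = statement.executeQuery(
    'SELECT * FROM my_table LIMIT ' + CHUNK_SIZE + ' OFFSET ' + offset);
  var totalColumns = queryResult.getMetaData().getColumnCount();

  var rows = [];
  while (queryResult.next()) {
    var rowData = [];
    for (var i = 0; i < totalColumns; i++) {
      rowData.push(queryResult.getString(i + 1));
    }
    rows.push(rowData);
  }
  queryResult.close();
  statement.close();
  connection.close();

  // Assumes a container-bound script; use SpreadsheetApp.openById() otherwise.
  // Also assumes the sheet was cleared before the first chunk ran.
  var targetSheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('data_import');

  if (rows.length > 0) {
    targetSheet.getRange(offset + 1, 1, rows.length, totalColumns).setValues(rows);
    props.setProperty('importOffset', String(offset + rows.length));
  } else {
    // Nothing left to fetch: clear the saved progress and remove the trigger.
    props.deleteProperty('importOffset');
    ScriptApp.getProjectTriggers().forEach(function(t) {
      if (t.getHandlerFunction() === 'importNextChunk') {
        ScriptApp.deleteTrigger(t);
      }
    });
  }
}

One thing to keep in mind with this sketch: OFFSET paging gets slower as the offset grows, so on very large tables filtering on an indexed key column instead is usually the better choice.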

I’d break the data into smaller chunks - maybe 1000-2000 rows at a time. Use setValues() for each batch, then keep going. This fixes both timeout and memory issues with huge datasets. Also, wrap your database calls in try/catch blocks. Network hiccups will kill your entire script otherwise. Don’t forget to stick connection.close() in a finally block so connections actually get cleaned up when things go sideways.
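
A sketch of that chunked write with the try/catch/finally cleanup, reusing the connection variables from the question. The importInChunks name and the 2000-row chunk size are made up for illustration, and the header row is omitted for brevity:

function importInChunks() {
  var CHUNK_SIZE = 2000; // illustrative batch size
  var connection;
  try {
    connection = Jdbc.getConnection(connectionString, user, pass);
    var statement = connection.createStatement();
    var queryResult = statement.executeQuery('SELECT * FROM my_table');
    var totalColumns = queryResult.getMetaData().getColumnCount();
    var targetSheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('data_import');
    targetSheet.clear();

    var chunk = [];
    var nextRow = 1; // spreadsheet row where the next chunk will be written
    while (queryResult.next()) {
      var rowData = [];
      for (var i = 0; i < totalColumns; i++) {
        rowData.push(queryResult.getString(i + 1));
      }
      chunk.push(rowData);
      if (chunk.length === CHUNK_SIZE) {
        // One write per chunk instead of one per row.
        targetSheet.getRange(nextRow, 1, chunk.length, totalColumns).setValues(chunk);
        nextRow += chunk.length;
        chunk = [];
      }
    }
    if (chunk.length > 0) {
      targetSheet.getRange(nextRow, 1, chunk.length, totalColumns).setValues(chunk);
    }
    queryResult.close();
    statement.close();
  } catch (e) {
    // Network hiccups or JDBC errors surface here instead of silently killing the run.
    Logger.log('Import failed: ' + e);
    throw e;
  } finally {
    // Release the connection even when something goes wrong mid-import.
    if (connection) {
      connection.close();
    }
  }
}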

Your problem is appendRow() inside a loop - each call hits Google’s servers separately, creating massive overhead. I had this exact issue last year with a data import script. Don’t append rows one by one. Build your entire data array in memory first, then write everything at once with setValues(). Change your while loop to collect all the rows into a 2D array, then call targetSheet.getRange(1, 1, dataArray.length, totalColumns).setValues(dataArray) once at the end. This dropped my execution time from timing out at 6 minutes to under 30 seconds for the same data size. The performance difference is huge - one API call instead of thousands.
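
For reference, a minimal sketch of that single-write version, assuming the same connection variables, table, and sheet name as in the question:

function importDatabaseData() {
  var connection = Jdbc.getConnection(connectionString, user, pass);
  var statement = connection.createStatement();
  var queryResult = statement.executeQuery('SELECT * FROM my_table');
  var columnInfo = queryResult.getMetaData();
  var totalColumns = columnInfo.getColumnCount();
  var targetSheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('data_import');
  targetSheet.clear();

  // Collect everything in memory: header row first, then the data rows.
  var dataArray = [];
  var headerRow = [];
  for (var i = 0; i < totalColumns; i++) {
    headerRow.push(columnInfo.getColumnName(i + 1));
  }
  dataArray.push(headerRow);

  while (queryResult.next()) {
    var rowData = [];
    for (var j = 0; j < totalColumns; j++) {
      rowData.push(queryResult.getString(j + 1));
    }
    dataArray.push(rowData);
  }

  queryResult.close();
  statement.close();
  connection.close();

  // One write instead of ~12,000 appendRow() calls.
  targetSheet.getRange(1, 1, dataArray.length, totalColumns).setValues(dataArray);
  targetSheet.autoResizeColumns(1, totalColumns);
}

The thing to watch with this approach is memory: 12,000 rows fits comfortably in a single setValues() call, but for much larger result sets the chunked variants above are the safer option.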