I’m having trouble with my Airtable and Plumsail setup. I’ve got a script that’s supposed to generate a PDF when an item gets approved. It works fine when I run it manually but fails when it runs from an automation trigger.
The error message says I’ve exceeded the limit of 30 table queries per script run. Here’s what I’m trying to do:
- Fetch data from a PO Header table
- Send it to a Plumsail API
- Get a PDF link back
- Attach the PDF to the record
Has anyone run into this before? Any ideas on how to optimize my script or work around this limit? Maybe I need to split it into smaller chunks somehow?
Here’s a simplified version of what my code looks like:
async function main() {
    const orderTable = base.getTable('Purchase Orders');
    const orderId = input.config().OrderId;

    // Pull the PO header data for this order
    const orderData = await fetchOrderDetails(orderId, orderTable);

    // Plumsail responds with a link to the rendered PDF
    const pdfResponse = await sendToPdfGenerator(orderData);
    if (pdfResponse.success) {
        await attachPdfToRecord(pdfResponse.link, orderId, orderTable);
    }
}
// Other helper functions here...
Any help would be much appreciated!
I’ve run into a similar challenge, and restructuring the process was the key. Instead of trying to fetch all the data in one go, I broke the task into smaller segments: a dedicated queue table tracks pending work, and each script run processes only a small subset of records, with runs scheduled more frequently. That kept every execution within the query limit while the queue gradually drained.
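A minimal sketch of that queue pattern, assuming a hypothetical 'PDF Queue' table with a 'Status' single-select field ('Pending' / 'Done'); adapt the names to your base:

// Hypothetical queue table: one row per PDF still to be generated.
const queueTable = base.getTable('PDF Queue');
const queue = await queueTable.selectRecordsAsync({ fields: ['Status'] }); // 1 query
const pending = queue.records
    .filter(r => r.getCellValueAsString('Status') === 'Pending')
    .slice(0, 5); // small subset per run, well under the 30-query limit

for (const record of pending) {
    // ... generate and attach the PDF for this record ...
}

// One batched write marks the whole subset done (counts as a single query)
await queueTable.updateRecordsAsync(
    pending.map(r => ({ id: r.id, fields: { Status: { name: 'Done' } } }))
);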
Caching frequently accessed data also helped by cutting redundant table reads, and so did leaning on the available batch methods. Treating PDF generation as an asynchronous background job, rather than something that must complete the instant a record is approved, let me live within the limits without drastically altering the workflow.
I’ve encountered this issue before, and it can be frustrating. One effective approach is to implement a cursor-based pagination system. This allows you to fetch data in smaller chunks, staying within the query limit. You could modify your fetchOrderDetails function to accept a cursor parameter and return both the data and a next cursor.
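As a rough sketch (the 'Order ID' field name and page size are placeholders, and in the scripting environment the cursor is just the ID of the last record processed):

// Hypothetical: return one page of matching records plus a cursor for the next run.
async function fetchOrderDetails(orderId, table, cursor = null, pageSize = 50) {
    const query = await table.selectRecordsAsync({ fields: ['Order ID'] }); // 1 query
    const rows = query.records.filter(r => r.getCellValueAsString('Order ID') === orderId);

    // Resume just past the record the previous run stopped at
    const start = cursor ? rows.findIndex(r => r.id === cursor) + 1 : 0;
    const page = rows.slice(start, start + pageSize);
    const nextCursor = start + pageSize < rows.length ? page[page.length - 1].id : null;

    return { data: page, nextCursor };
}

Note that selectRecordsAsync still loads the table in a single query; what the cursor buys you is bounding the per-run processing and follow-up writes, with nextCursor persisted between runs (for example, in a field on the PO header).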
Another optimization is to use batch operations where possible. Airtable’s API supports batched reads and writes, which can significantly reduce the number of calls needed.
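For instance, in the scripting environment updateRecordsAsync accepts up to 50 records per call, and each call counts as a single query. A sketch (the shape of updates is whatever your fields need):

// Chunk an arbitrary list of updates into batches of 50.
async function batchUpdate(table, updates) {
    for (let i = 0; i < updates.length; i += 50) {
        await table.updateRecordsAsync(updates.slice(i, i + 50));
    }
}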
Lastly, consider caching frequently accessed data locally. This can cut down on unnecessary queries, especially for static or slowly changing information. Just be mindful of data freshness and implement a sensible cache invalidation strategy.
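Within a single run, caching can be as simple as memoizing the query per table, so repeated reads of the same table cost only one query. A rough sketch:

// Cache selectRecordsAsync results so each table is queried at most once per run.
const queryCache = new Map();

async function getRecordsCached(table) {
    if (!queryCache.has(table.id)) {
        queryCache.set(table.id, await table.selectRecordsAsync());
    }
    return queryCache.get(table.id).records;
}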
Remember, sometimes rethinking your data model or workflow can lead to more efficient solutions than trying to optimize within current constraints.
hey alex, had similar issues. try batching ur queries or using a queue system. also, cache data u use often. might help to break down ur script into smaller functions that run separately. good luck mate!