I’m working with a modified price monitoring script for Google Sheets that’s supposed to automatically fetch Amazon product prices. The idea is to avoid manually checking prices for multiple products.
The issue I’m facing is that the script only processes around 20 products before it stops working completely. I need it to handle way more items than that.
I’ve tried looking through the code myself but I’m pretty new to programming and can’t figure out what’s making it quit early. Does anyone know what typically causes Google Sheets scripts to stop running after processing a certain number of items? Any suggestions on how to fix this limitation would be really appreciated.
The script uses three main functions - one for the main logic, one for handling the data table, and another for processing individual rows. It should loop through all the product URLs I’ve provided but keeps stopping at roughly the same point each time.
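For context, the structure is roughly like this (a simplified sketch from memory, not the exact code, and the names are approximate):

```javascript
function main() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  processTable(sheet);
}

function processTable(sheet) {
  var lastRow = sheet.getLastRow();
  for (var row = 2; row <= lastRow; row++) {          // one product URL per row, header in row 1
    processRow(sheet, row);
  }
}

function processRow(sheet, row) {
  var url = sheet.getRange(row, 1).getValue();        // product URL in column A
  var html = UrlFetchApp.fetch(url).getContentText(); // fetch the Amazon page
  var match = html.match(/\$\d[\d,.]*/);              // crude price extraction
  sheet.getRange(row, 2).setValue(match ? match[0] : 'not found'); // price written to column B
}
```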
You’re hitting Google Apps Script’s 6-minute execution limit. Amazon scraping is slow because of network requests and their rate limiting. I had the same issue building an inventory tracker. Fixed it by breaking the work into chunks and using PropertiesService to track progress. Set up a time trigger to run every few minutes - each run picks up where the last one stopped. Also check if Amazon’s blocking you after 20 requests. They’re aggressive about catching scrapers. I added random delays between requests and rotated user agents, but watch their terms of service.
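Here's a rough sketch of the chunked approach, assuming URLs in column A of a sheet called "Products" and prices written to column B. The function and property names are placeholders, and fetchPrice stands in for whatever per-product scraping logic you already have:

```javascript
var LAST_ROW_KEY = 'lastProcessedRow';
var CHUNK_SIZE = 15;   // rows per run, kept well under the 6-minute limit

function checkPricesChunk() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Products');
  var props = PropertiesService.getScriptProperties();
  var startRow = Number(props.getProperty(LAST_ROW_KEY)) || 2;   // skip header row
  var lastRow = sheet.getLastRow();

  for (var row = startRow; row < startRow + CHUNK_SIZE && row <= lastRow; row++) {
    var url = sheet.getRange(row, 1).getValue();
    if (!url) continue;
    var price = fetchPrice(url);                 // placeholder for your existing scraping logic
    sheet.getRange(row, 2).setValue(price);      // write immediately so progress isn't lost
    Utilities.sleep(1000 + Math.floor(Math.random() * 2000));  // random 1-3s delay between requests
  }

  if (row > lastRow) {
    props.deleteProperty(LAST_ROW_KEY);          // finished: next run starts from the top
  } else {
    props.setProperty(LAST_ROW_KEY, String(row)); // resume here on the next trigger run
  }
}

// One-time setup: run this manually once to create a recurring trigger.
function createTrigger() {
  ScriptApp.newTrigger('checkPricesChunk')
    .timeBased()
    .everyMinutes(10)
    .create();
}
```

Run createTrigger once by hand; after that, each trigger run processes one chunk and the script property remembers where to resume.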
Classic Apps Script timeout - I've fought this for months with my price tracker. Add Utilities.sleep(2000) between product fetches to slow things down and dodge Amazon's detection. Clear out big variables after each loop or you'll hit memory limits fast.
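Roughly what I mean - fetchProductPage is just a placeholder name and the header value is only an example, adapt it to whatever your script already does:

```javascript
function fetchProductPage(url) {
  var response = UrlFetchApp.fetch(url, {
    muteHttpExceptions: true,                  // don't throw on 4xx/5xx, so one bad page doesn't kill the run
    headers: { 'User-Agent': 'Mozilla/5.0' }   // example UA string only
  });
  Utilities.sleep(2000);                       // ~2s pause before the next product is fetched
  return response.getContentText();
}
```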
Google Apps Script's execution-time limit (about six minutes per run) typically kicks in after roughly 20-30 web scraping operations, and Amazon has anti-bot protection that detects repeated requests from the same IP. I ran into this while tracking competitor prices. What worked for me was batch processing with triggers running every 10-15 minutes, writing results directly to cells as each one comes in, and adding random delays between requests. Alternatively, you could try IMPORTXML, which leaves the fetching and its rate limiting to Google, though it isn't suited to more complex scraping tasks.
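If you go the IMPORTXML route, a formula along these lines pulls a value straight into a cell. The cell reference and XPath here are only examples - you'd need to point it at whatever element Amazon currently uses for the price, and Amazon often blocks Google's fetcher entirely:

```
=IMPORTXML(A2, "//span[contains(@class,'a-offscreen')]")
```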