Implementing a Loop with Puppeteer

I’m scraping an element with Puppeteer and want to refresh the data every 15 minutes, but I’m unsure how to implement the loop. Here’s what I have so far:

// Read the current counter value from the page
const casesCount = await page.$eval('.counter-number span', el => el.innerText);

// Type it into the input box and submit
await page.type('[class="input-box"]', casesCount);
await page.keyboard.press('Enter');
Could anyone provide guidance?

An alternative to setInterval is a recursive setTimeout: the next run is only scheduled after the current one finishes, which gives you more precise control over timing and makes error handling easier. Here’s a sample implementation:

async function scrapeAndRefresh() {
  try {
    await page.reload({ waitUntil: 'networkidle0' });
    const casesCount = await page.$eval('.counter-number span', el => el.innerText);
    await page.type('[class="input-box"]', casesCount);
    await page.keyboard.press('Enter');
    console.log('Data refreshed at:', new Date().toLocaleString());
    // Schedule the next execution
    setTimeout(scrapeAndRefresh, 15 * 60 * 1000);
  } catch (error) {
    console.error('An error occurred:', error);
    // Retry after 30 seconds if an error occurs
    setTimeout(scrapeAndRefresh, 30 * 1000);
  }
}

// Start the loop
scrapeAndRefresh();

The function reschedules itself, so each run starts 15 minutes after the previous one completes. If an error occurs, it retries after 30 seconds, allowing recovery from transient issues.
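If failures persist, the fixed 30-second retry keeps hitting the site at the same rate. One way to extend the catch branch is exponential backoff; here’s a sketch (the `retryDelayMs` name and the 15-minute cap are my own choices, not part of the snippet above):

```javascript
// Delay before retry attempt `attempt` (0-based): 30s, 60s, 120s, ...
// capped at the normal 15-minute refresh interval.
function retryDelayMs(attempt) {
  return Math.min(30 * 1000 * 2 ** attempt, 15 * 60 * 1000);
}

// In the catch branch, instead of a fixed 30 seconds:
//   setTimeout(() => scrapeAndRefresh(attempt + 1), retryDelayMs(attempt));
// and reset `attempt` to 0 after a successful run.
```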

Hey SwiftCoder42, you can use setInterval to run your code every 15 minutes. Also, don’t forget to reload the page and add some error handling. Here’s a quick sample:

setInterval(async () => {
  try {
    await page.reload({ waitUntil: 'networkidle0' });
    const casesCount = await page.$eval('.counter-number span', el => el.innerText);
    await page.type('[class="input-box"]', casesCount);
    await page.keyboard.press('Enter');
  } catch (error) {
    console.error('An error occurred:', error);
  }
}, 15 * 60 * 1000);
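One caveat with setInterval: it fires on schedule even if the previous async callback hasn’t finished yet (say, a reload that hangs). A small guard you could wrap the callback in — `noOverlap` is a helper name I made up for this sketch, not a Puppeteer API:

```javascript
// Wrap an async task so a tick that arrives while the previous run is still
// in flight is skipped instead of piling up. Returns true if the task ran,
// false if it was skipped.
function noOverlap(task) {
  let running = false;
  return async (...args) => {
    if (running) return false;
    running = true;
    try {
      await task(...args);
      return true;
    } finally {
      running = false;
    }
  };
}

// Usage with the loop above:
//   const tick = noOverlap(async () => { /* reload + $eval + type */ });
//   setInterval(tick, 15 * 60 * 1000);
```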

Consider checking the network requests for the data you need: sometimes the data can be fetched directly from an API endpoint, which is generally more efficient and reliable than scraping the UI. If you prefer to stick with scraping, wrap the loop body (whether you use setInterval or the recursive version) in proper error handling with log statements for easier debugging. Also, don’t forget to close the browser gracefully when you’re done, to avoid memory leaks and orphaned Chromium processes.
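For the graceful-close part, here’s a hedged sketch; `shutdown` is a helper I’m inventing for illustration, and `browser`/`timer` are assumed to come from puppeteer.launch() and the scheduling code above:

```javascript
// Cancel the pending timer and close the browser so no orphaned Chromium
// process is left behind. Only browser.close() is assumed, so any object
// with an async close() works (handy for testing without a real browser).
async function shutdown(browser, timer) {
  clearTimeout(timer); // in Node, this also clears setInterval handles
  await browser.close();
}

// Wire it to Ctrl+C:
// process.on('SIGINT', () => shutdown(browser, timer).then(() => process.exit(0)));
```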