Why am I receiving an empty array when using Puppeteer for web scraping?

I am attempting to scrape data from a specific website, but Puppeteer returns an empty array after my scraping code runs. Below is my implementation: it navigates through multiple pages to collect names, removing certain elements that contain ads along the way:

const puppeteer = require("puppeteer");
const express = require("express");
const cors = require("cors");
const server = express();
server.use(cors());
let scrapedData = [];
(async () => {
  const browserInstance = await puppeteer.launch({
    headless: false,
    defaultViewport: null,
  });
  const newPage = await browserInstance.newPage();
  for (let pageIndex = 1; pageIndex < 42; pageIndex++) {
    await newPage.goto(`https://naamhinaam.com/baby-girl-names-a?page=${pageIndex}`);
    await newPage.waitForTimeout(3000);
    await newPage.click("#promotionalPopup > div > div > div > button > span");
    await newPage.$eval(
      "div.name-suggestion.mt-1 > div > div:nth-child(22)",
      (element) => element.remove()
    );
    await newPage.$eval(
      "div.name-suggestion.mt-1 > div > div:nth-child(43)",
      (element) => element.remove()
    );
    for (let itemIndex = 3; itemIndex < 54; itemIndex++) {
      let name = "Not Found";
      if (await newPage.$("div.name-suggestion.mt-1 > div > div:nth-child(22)")) {
        continue;
      }
      await newPage.waitForSelector(
        `div.name-suggestion.mt-1 > div > div:nth-child(${itemIndex}) > div.nsg__name_meaning > a`
      );
      let nameElement = await newPage.$(
        `div.name-suggestion.mt-1 > div > div:nth-child(${itemIndex}) > div.nsg__name_meaning > a`
      );
      name = await newPage.evaluate((el) => el.textContent, nameElement);
      scrapedData.push({ name });
    }
    console.log(scrapedData);
  }
  await browserInstance.close();
})();
server.get("/", (request, response) => {
  response.status(200).json(scrapedData);
});
server.listen(3000, () => {
  console.log("Server is live...");
});

I remove those specific elements because they contain ads. However, the final output is an empty array. Could anyone offer guidance on what might be causing this?

You are most likely dealing with either an incorrect selector or a page that has not finished loading when you query it. Keep in mind that page.click() and page.$eval() reject when their selector matches nothing, and an unhandled rejection inside your async IIFE aborts the entire loop; that alone can leave scrapedData empty. First, verify that your selectors match the live page exactly. Then use waitForNavigation or waitForSelector before interacting with elements so the page is fully loaded before you scrape. Finally, more precise logging, such as checking whether nameElement is null before extracting its text, will pinpoint where the process fails.
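To make that concrete, here is a minimal sketch of the guard pattern. The helper names `clickIfPresent` and `textOrFallback` are my own, not part of the Puppeteer API; they assume `page` is a Puppeteer Page and that a missing element should be skipped rather than crash the loop:

```javascript
// Guard optional interactions (like dismissing a popup) so a missing
// element does not reject the whole async IIFE.
async function clickIfPresent(page, selector) {
  const el = await page.$(selector); // resolves to null when no match
  if (el) {
    await el.click();
    return true;
  }
  return false; // element absent on this page; carry on
}

// Read an element's text, falling back instead of calling
// evaluate() on a null handle.
async function textOrFallback(page, selector, fallback = "Not Found") {
  const el = await page.$(selector);
  if (!el) return fallback;
  return page.evaluate((node) => node.textContent, el);
}
```

In your loop you would then call `clickIfPresent(newPage, "#promotionalPopup > div > div > div > button > span")` instead of `newPage.click(...)`, so pages without the popup no longer throw.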

You could also be dealing with dynamic content that loads after your script queries the DOM. Prefer page.waitForSelector() on an element that signals the content has fully loaded, rather than relying on arbitrary timeouts. Also double-check your selectors against the current site structure, since a markup change would invalidate them. Test locally first.
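A sketch of that approach, reusing the selectors from the question; `namesOnPage` is a hypothetical helper, and the exact selector you wait on may need adjusting to the site's current markup:

```javascript
// Wait for the content itself instead of a fixed 3-second pause, then
// collect every name in one evaluate rather than ~50 DOM round trips.
async function namesOnPage(page, url) {
  await page.goto(url, { waitUntil: "domcontentloaded" });
  // Resolves as soon as at least one name link exists; rejects after 10 s.
  await page.waitForSelector("div.name-suggestion.mt-1 a", { timeout: 10000 });
  return page.$$eval("div.name-suggestion.mt-1 a", (links) =>
    links.map((a) => a.textContent.trim())
  );
}
```

Calling this once per page index and concatenating the results replaces both the waitForTimeout call and the inner nth-child loop, and it returns everything it found even if the page has fewer items than expected.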