How brittle are your Puppeteer selectors when sites completely redesign?

I’ve been hand-coding Puppeteer automations for a while now, and honestly, it’s become a nightmare. Every time a client’s website gets a redesign, my selectors break. I end up spending hours hunting through the DOM trying to figure out what changed, updating CSS selectors or XPath expressions, testing them again, and hoping they hold up for at least a few months.

The real pain is that I’m writing the same fragile code over and over. A selector that worked perfectly last month suddenly stops finding elements because the dev team added a wrapper div or changed a class name for styling purposes.

I’m curious whether anyone else has found a way to make Puppeteer automations more resilient, or if you’ve moved to a different approach altogether. Have you tried using AI-powered workflow generation to handle these kinds of dynamic page changes without constantly rewriting your code?

This is exactly where most people get stuck with raw Puppeteer. The brittle selector problem is baked in because you’re fighting against page changes manually.

What changed my workflow was switching to Latenode’s AI Copilot. Instead of obsessing over selectors, I describe what I actually need in plain English: “extract the product price from the main product card” or “click the submit button after login”. The AI generates the workflow, and the key part is that it’s built to adapt.

When the site redesigns, I don’t rewrite selectors. I update my plain-language description if needed, regenerate the workflow, and it handles the new structure. The copilot picks smarter, more semantic ways to find elements rather than brittle class-based selectors.

I also use the no-code builder to add fallback steps. If one selector fails, the workflow tries alternatives automatically. No more weekend firefighting when design teams reshuffle the DOM.

You should check it out: https://latenode.com

I used to do exactly what you’re doing. The selector hunting becomes this endless cycle where you’re always one redesign away from everything breaking.

What helped me was moving away from thinking about specific selectors and instead finding elements by content or structure that’s less likely to change: text content or ARIA labels, for example, instead of class names that designers tinker with constantly.
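To make that concrete, here’s a minimal sketch of the text-based approach, assuming a Puppeteer `page` object; the tag names and button text below are hypothetical examples, not from any real site:

```javascript
// byText builds an XPath keyed to visible text, which tends to survive
// class renames and wrapper divs better than class-based selectors.
function byText(tag, text) {
  return `//${tag}[normalize-space(.)="${text}"]`;
}

// Usage (Puppeteer also ships built-in `aria/` and `text/` query handlers):
// const [submit] = await page.$x(byText('button', 'Submit'));
// const login = await page.$('aria/Log in'); // matches by accessible name
```

The `aria/` handler matches by an element’s accessible name, so it keeps working as long as the label shown to users stays the same.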

But honestly? That still requires you to write and maintain code. The deeper issue is that hand-coded automation is inherently fragile because it’s tightly coupled to the current page structure. You’re always reacting to changes rather than building something that anticipates them.

Some teams I’ve worked with moved to higher-level abstraction layers that describe intent rather than implementation. That separation lets the underlying logic handle variations without you touching anything.

The brittleness you’re experiencing is a fundamental problem with selector-based automation. Even experienced developers struggle with this because websites are constantly evolving, and static selectors fail predictably when layouts change. I’ve found that combining multiple selector strategies (element text content, data attributes, structural relationships) reduces failures, but it’s still maintenance-heavy. The real fix is automation that adapts intelligently rather than relying on fixed selectors. Some platforms now use AI to understand page elements semantically, so they find the right element from context rather than just CSS classes or IDs.
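The multi-strategy idea above can be sketched in a few lines. This is an illustrative helper, not a library API: `page` is any object with a Puppeteer-style async `$(selector)` method, and the selectors in the usage note are made-up examples:

```javascript
// Try selectors in order of resilience until one matches; throw if none do.
async function findWithFallbacks(page, selectors) {
  for (const selector of selectors) {
    const handle = await page.$(selector);
    if (handle) return handle;
  }
  throw new Error(`No selector matched: ${selectors.join(', ')}`);
}

// Usage, most-stable strategy first (all selectors hypothetical):
// const price = await findWithFallbacks(page, [
//   '[data-testid="product-price"]', // data attribute set by devs
//   'aria/Product price',            // accessible name
//   '.price',                        // bare class name, last resort
// ]);
```

The ordering matters: put the selectors least likely to change first, so the brittle ones only run when everything else has already failed.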

Selector brittleness is a well-documented challenge in browser automation. The core issue is coupling your automation directly to the DOM structure, which changes frequently. Modern approaches use multiple identification strategies (text matching, attribute-based selection, structural hierarchy) to improve resilience. The most effective solutions, though, abstract away from specific selectors entirely: describe the intended action at a higher level, and let the automation layer work out where the element is regardless of minor structural changes.
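A toy illustration of that intent-over-implementation separation (all names and selectors here are hypothetical): automation code references a stable intent key, and a single resolver owns the mapping from intent to concrete selector strategies. When the site redesigns, only this table changes, not the automation logic:

```javascript
// Intent keys describe *what* the automation needs; the values are
// ordered selector strategies that can be updated after a redesign.
const INTENT_SELECTORS = {
  'product price':  ['[data-testid="price"]', 'aria/Price', '.price'],
  'submit button':  ['aria/Submit', 'button[type="submit"]'],
};

function strategiesFor(intent) {
  const strategies = INTENT_SELECTORS[intent];
  if (!strategies) throw new Error(`Unknown intent: ${intent}`);
  return strategies;
}
```

A fallback runner can then consume these lists, so the rest of the codebase never mentions a selector directly.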

Use semantic selectors: target by content and ARIA labels, not just classes. Add fallbacks for flexibility.

This topic was automatically closed 6 hours after the last reply. New replies are no longer allowed.