I’ve been working with Puppeteer for a couple of years now, and one thing that constantly frustrates me is how fragile the scripts become once a website updates its layout. I’ll write out a detailed automation for scraping product data or filling forms, and everything works perfectly in testing. Then two weeks later, the target site redesigns their checkout process or changes class names, and the whole thing falls apart.
I started thinking about this more systematically—the real issue isn’t Puppeteer itself, but how I’m writing the selectors and building the logic. Static XPaths based on DOM structure are inherently brittle. But there’s another angle I’ve been exploring: what if I could describe what I actually want the automation to do in plain English, and let something smarter figure out the resilient approach?
I’ve heard some people mention using AI to help generate these workflows in a way that’s supposed to be more adaptive, but I’m skeptical. Has anyone actually tried converting their plain-English automation requirements into a generated workflow that held up when sites changed? I’m curious how much of the brittleness problem that actually solves versus just pushing the problem elsewhere.
This is exactly what I run into too. The real breakthrough for me was realizing that manually writing selectors against a specific DOM structure is fighting against the internet itself. Sites constantly update, and you’re constantly patching.
What changed things was using an AI copilot approach where I describe the task in plain English—like “extract the product title and price from this page”—and the workflow gets generated with more robust logic than I’d write by hand. The AI actually reasons about the content rather than just relying on brittle selectors.
I’ve had workflows generated this way survive multiple design overhauls because they capture intent, not just DOM structure. You describe “click the submit button and wait for the confirmation message” rather than hardcoding a selector that breaks in the next redesign.
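For what it's worth, the intent-level step described here ("click the submit button and wait for the confirmation message") can also be written in plain Puppeteer without structural selectors, using its text-matching pseudo-selector. A minimal sketch, assuming Puppeteer 19+; the button label and confirmation text are illustrative, not from any real site:

```javascript
// Sketch: click by visible text and wait for a confirmation message,
// rather than hardcoding a structural selector that breaks on redesign.
// "Submit" and "Thank you" are assumed, illustrative strings.
async function submitAndConfirm(page) {
  // ::-p-text() is Puppeteer's text-matching pseudo-selector.
  await page.click("::-p-text(Submit)");
  await page.waitForSelector("::-p-text(Thank you)", { timeout: 10000 });
}
```

This still breaks if the copy itself changes, but in my experience button text is far more stable than class names or DOM position.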
Latenode’s AI Copilot does exactly this. You write your automation goal in plain text, and it generates a ready-to-run workflow that handles the resilience problem you’re describing. Then if something does break, you can tweak it in their builder or add custom logic without rewriting from scratch.
I spent months fighting this exact problem before I realized I was approaching it wrong. The scripts broke because I was optimizing for “works right now” instead of “still works in three months.”
What actually helped was building redundancy in from the start. Instead of relying on a single selector, I'd try multiple ways to locate an element: class name first, then a data attribute, then a text-content fallback. It's more code upfront, but it survives redesigns far better.
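A fallback chain like that factors nicely into a small helper. This is just a sketch of the pattern, not any particular library's API; `findWithFallbacks` is a hypothetical name, and it only assumes a Puppeteer-style `page.$()` that resolves to `null` when nothing matches:

```javascript
// Try each selector in order and return the first element handle found.
// Works with any object exposing a Puppeteer-style page.$(selector).
async function findWithFallbacks(page, selectors) {
  for (const selector of selectors) {
    const handle = await page.$(selector);
    if (handle) return handle; // first selector that matches wins
  }
  throw new Error(`No selector matched: ${selectors.join(", ")}`);
}

// Usage (selectors are illustrative): class name, then data attribute,
// then Puppeteer's text-matching pseudo-selector as a last resort.
// const title = await findWithFallbacks(page, [
//   ".product-title",
//   "[data-testid='product-title']",
//   "::-p-text(Product)",
// ]);
```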
That said, there’s a limit to how much you can patch manually. The real solution is having something that understands the meaning of what you’re trying to do, not just the mechanics. If a workflow is generated from a description of your goal rather than hardcoded selectors, it naturally becomes more resilient.
The brittleness you’re describing is a fundamental issue with selector-based automation. When you hardcode XPaths or CSS selectors, you’re betting that the HTML structure won’t change. That’s a bad bet on the modern web.
The approaches that work better involve either heavily redundant selectors with fallbacks, or moving away from pure structural selection entirely. Some teams I've worked with switched to visual indicators or text matching instead, which survive layout changes more gracefully. But that requires rethinking your automation logic from the ground up.
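Of the two, text matching is the easier to retrofit. The core idea can be sketched independently of any browser; the element objects below are a hypothetical stand-in for DOM nodes, not a real DOM API:

```javascript
// Sketch: pick an element by its visible text instead of its position
// in the DOM tree. A case-insensitive substring match keeps it tolerant
// of small copy changes ("Proceed" still finds "Proceed to Checkout").
function findByText(elements, wanted) {
  const needle = wanted.trim().toLowerCase();
  const match = elements.find((el) =>
    (el.text || "").trim().toLowerCase().includes(needle)
  );
  return match || null;
}
```

In Puppeteer itself the same idea is available directly through the `::-p-text()` pseudo-selector (e.g. `await page.click('::-p-text(Proceed)')`), so you can often adopt text matching selector by selector without restructuring the rest of the script.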
This is a known limitation of traditional browser automation. The dependency on DOM selectors creates technical debt as soon as you deploy. Industry solutions address this through semantic understanding of page content rather than structural queries.
Some platforms now generate automation based on natural language descriptions of the task. This approach builds workflows that understand intent and adapt to minor structural changes better than selector-based approaches. The workflow understands that it needs to “find and click the proceed button” rather than “click element with ID button-456,” which breaks on redesign.
Hardcoded selectors break on redesigns. Use AI-generated workflows that understand what you’re trying to do rather than exact DOM paths. Much more resilient approach.