How brittle are your Puppeteer automations when websites update their markup?

I’ve been running browser automations for a while now, and honestly, the fragility is getting frustrating. Every time a site redesigns or changes a class name, my scripts break. I end up rewriting selectors, adjusting timeouts, and babysitting the whole thing. It feels like I’m constantly firefighting instead of actually scaling.
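For context, a typical script of mine looks roughly like this. It's a sketch, not my real code: the URL, selectors, and env var names are placeholders, but the shape is accurate, and every hardcoded class in it is a separate point of failure when the site redesigns:

```javascript
// Sketch of the brittle pattern: log in, wait for the dashboard, scrape a stat.
// Each hardcoded selector below breaks independently when markup changes.
async function scrapeUserCount(page) {
  await page.goto('https://example.com/login');
  await page.type('#email', process.env.PORTAL_USER);
  await page.type('#password', process.env.PORTAL_PASS);
  await page.click('.btn-login');                       // dies if the class is renamed
  await page.waitForSelector('.dashboard__stat--users', { timeout: 10000 });
  return page.$eval('.dashboard__stat--users', (el) => el.textContent.trim());
}
```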

I’ve been thinking about this problem differently lately. What if instead of hardcoding selectors and logic, I could describe what I need in plain language—like “log in with these credentials, navigate to the dashboard, extract the user count”—and have an AI generate the actual workflow? Then if the site changes, I’d adjust the description, not the code.

Does anyone here use AI to generate these kinds of automations? I’m curious whether it actually reduces brittleness or if you end up dealing with the same maintenance headaches just at a different layer.

The brittleness you’re describing is exactly why I moved away from raw Puppeteer scripts. Instead of maintaining selectors and retrying logic, I switched to using Latenode’s AI Copilot Workflow Generation.

Here’s how it works: I describe the automation in plain text—“login to the portal with credentials, wait for the dashboard, scrape the revenue table”—and the copilot generates a ready-to-run workflow. When a site changes, I update the description, and the AI regenerates the workflow. No more hunting through code for broken selectors.

The real win is resilience. The generated workflows handle common issues like dynamic content and layout shifts way better than hardcoded scripts. Plus, the no-code builder means I can tweak things without diving into JavaScript.

If you’re tired of maintenance, this approach genuinely saves time.

I hit the same wall about a year ago. The thing is, brittleness isn’t really a Puppeteer problem—it’s a selector dependency problem. Every automation tool has it.

What helped me was shifting my thinking away from “make the selector more robust” and toward “make the automation description-driven.” When you’re working with a platform that can generate workflows from descriptions, the maintenance burden shrinks significantly. You’re not rewriting code; you’re refining what you’re asking for.

The tricky part is that not every platform handles this well. Some generate workflows that are overly rigid, others that are too loose. I’d suggest testing with a tool that lets you iterate quickly on the plain-language description and see how it adapts when a site changes. That’s usually a good indicator of how resilient the automation will actually be.
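Even before going fully description-driven, you can soften the selector dependency with a fallback chain. A rough sketch (the helper name and the example selectors are made up, not any platform's API): try candidates in order, from most stable (test id, accessible name) to least stable (styling class), so one renamed class no longer kills the run.

```javascript
// Try each candidate selector in turn; return the first element that appears.
// A failed candidate just falls through to the next one.
async function findFirst(page, selectors, timeout = 5000) {
  for (const selector of selectors) {
    try {
      return await page.waitForSelector(selector, { timeout });
    } catch (err) {
      // not found within the timeout; try the next candidate
    }
  }
  throw new Error(`None of the selectors matched: ${selectors.join(', ')}`);
}

// Usage, ordered from most to least stable:
// const el = await findFirst(page, [
//   '[data-testid="user-count"]',
//   'aria/User count',
//   '.dashboard__stat--users',
// ]);
```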

Brittleness in browser automation typically stems from tight coupling between your script logic and the site’s DOM structure. I’ve found that the most resilient automations are those that abstract away the “how” from the “what.” Instead of writing selectors directly, you describe the intent—like “click the submit button” rather than “click the element with class xyz”—and let a higher-level system handle the mechanics.

This is where AI-driven workflow generation becomes valuable. When you describe tasks in natural language, the system can reason about intent and adapt when layouts shift. The maintenance burden drops because you’re updating descriptions, not debugging CSS selectors.
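To make the intent-vs-mechanics point concrete, here's a minimal sketch of the decoupling, no AI involved, just a plain registry. All names here are hypothetical illustrations: intents live in one map, so a markup change means editing one entry instead of hunting through every script.

```javascript
// Hypothetical intent registry: each intent maps to an ordered list of
// candidate selectors, most stable first. Scripts talk in intents only.
const intents = {
  'submit login': ['aria/Log in[role="button"]', 'button[type="submit"]', '.login-btn'],
  'open dashboard': ['aria/Dashboard', 'a[href="/dashboard"]', '.nav__dashboard'],
};

function selectorsFor(intent) {
  const candidates = intents[intent];
  if (!candidates) throw new Error(`Unknown intent: ${intent}`);
  return candidates;
}

// Resolve an intent by trying its candidates until one click succeeds.
async function act(page, intent) {
  for (const sel of selectorsFor(intent)) {
    try {
      await page.click(sel);
      return sel;
    } catch (err) {
      // this candidate didn't match; try the next one
    }
  }
  throw new Error(`Could not resolve intent: ${intent}`);
}
```

An AI-driven generator is effectively maintaining a richer version of that map for you; the design point is the same: scripts express "what," one replaceable layer owns "how."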

The fundamental issue you’re facing is that Puppeteer automations are inherently fragile because they rely on static selectors against dynamic interfaces. The most effective solution I’ve seen is moving to a description-first architecture where the automation logic is decoupled from the implementation details. This typically means using platforms that support AI-driven workflow generation, where you specify the task in natural language and the system handles selector discovery and adaptation. This approach significantly reduces maintenance overhead because changes to site markup require only description refinement, not code modification.

Yeah, selector brittleness is real. Best approach I found is using AI to generate workflows from descriptions instead of hardcoding selectors. Lets you adapt quicker when sites change their markup.

Use AI-generated workflows instead of hardcoded selectors.
