Why do my Puppeteer scripts keep breaking when sites redesign their DOM?

I’ve been working on web automation for a few years now, and the biggest pain point I keep hitting is brittleness. I’ll write a Puppeteer script that works perfectly for a few months, then the site redesigns its layout and everything falls apart. I’m selecting elements by class names or IDs, so when a site updates its CSS or restructures its HTML, the selectors simply stop matching.

I know some people use XPath or more complex selector strategies, but that feels like just treating the symptom. The real issue is that my scripts are too tightly coupled to the specific DOM structure of a site at a particular moment in time.

Has anyone found a good approach to make Puppeteer automations more resilient to these kinds of changes? I’m curious if there’s a way to build scripts that gracefully handle minor UI changes without needing constant manual rewrites.
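For concreteness, the most robust pattern I’ve tried so far is a fallback chain: attempt several selector strategies in order, from most semantic to most brittle, instead of betting everything on one. A rough sketch (the helper name and the selectors are just illustrative, not from a real site):

```javascript
// Try each selector strategy in order; return the first element found.
// `page` is anything with a Puppeteer-style async $() method, which is
// also what makes this easy to unit-test with a mock object.
async function findWithFallbacks(page, selectors) {
  for (const selector of selectors) {
    const handle = await page.$(selector);
    if (handle) return { handle, selector };
  }
  throw new Error(`No selector matched: ${selectors.join(', ')}`);
}

// Usage with real Puppeteer (the 'aria/' prefix and '::-p-text()'
// pseudo-selector exist in recent Puppeteer versions):
//
// const { handle } = await findWithFallbacks(page, [
//   'aria/Log in',            // accessibility role + name, most stable
//   '::-p-text(Log in)',      // visible text
//   'button[type="submit"]',  // structural fallback
//   '.btn-login-v2',          // last resort: today's class name
// ]);
```

The ordering matters: accessible names and visible text tend to survive redesigns, while class names rarely do, so the brittle selectors only run when the stable ones fail.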

This is exactly the kind of problem that AI-powered workflow generation is meant to solve. Instead of hand-coding brittle selectors, you describe the goal in plain English, like “click the login button and extract user data,” and the AI generates a workflow that targets the intent rather than specific CSS class names.

The benefit shows up when the site redesigns. Because the workflow is built on the page’s semantic structure (headings, buttons, form fields) rather than specific selectors, it tolerates layout changes far better. You’re not fighting DOM brittleness; you’re automating by intent.
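To make “automating by intent” concrete without any AI service, you can approximate the idea by scoring candidate elements against an intent (a role plus a visible label) and picking the best match. A rough sketch; the scoring weights are arbitrary, and in real Puppeteer code the candidates would come from something like `page.$$eval('button, a, input', …)` rather than plain objects:

```javascript
// Score a candidate element against an intent { role, text }.
// Role match and exact label match dominate; class names never appear.
function scoreCandidate(intent, el) {
  let score = 0;
  if (el.role === intent.role) score += 2;      // same semantic role
  const text = (el.text || '').toLowerCase();
  const want = intent.text.toLowerCase();
  if (text === want) score += 3;                // exact label match
  else if (text.includes(want)) score += 1;     // partial label match
  return score;
}

// Return the highest-scoring candidate, or null if nothing matched.
function findByIntent(intent, candidates) {
  let best = null;
  let bestScore = 0;
  for (const el of candidates) {
    const s = scoreCandidate(intent, el);
    if (s > bestScore) { best = el; bestScore = s; }
  }
  return best;
}
```

A markup change that renames every class is invisible to this matcher; only removing the button’s role or relabeling it entirely would break the lookup.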

I switched to this approach last year and haven’t rewritten a single script for selector changes since. The AI figures out new ways to interact with the page even when the markup changes.

Check it out: https://latenode.com
