How do you actually keep puppeteer scripts from breaking when a website completely redesigns their layout?

I’ve been dealing with this for a while now. We built some Puppeteer scripts that worked great for a few months, then a client’s website got a complete redesign and everything fell apart. The selectors we were targeting didn’t exist anymore, the DOM structure changed, and we basically had to rewrite half the script from scratch.

I started thinking about this differently after reading about how AI can actually adapt workflows. The thing is, brittle automation is basically inevitable if you’re just hardcoding selectors and waiting for things to break. We kept patching the same scripts over and over.

Then I tried a different approach. Instead of building static Puppeteer workflows, I started using plain language to describe what I actually wanted the script to do—like “log in and extract the user data from the profile section” instead of “click selector .login-btn and scrape div.user-data”. The AI can then adapt to small layout changes because it understands the intent, not just the specific DOM structure.

Has anyone else had success with this approach? I’m curious if others are solving this problem the same way or if there’s something better out there.

This is exactly what the AI Copilot in Latenode is built for. You describe what you want in plain text, and it generates a Puppeteer workflow that adapts to layout changes because it understands the intent behind your automation.

The key difference is that AI-generated workflows are more resilient than hardcoded selectors. When a site tweaks its layout slightly, the workflow can adjust because it isn't looking only for specific classes or IDs; it understands the purpose of each step.

I’ve seen teams go from rewriting scripts every month to having them run stably for quarters because they’re not fighting against brittle selector chains anymore.

The selector issue is real, and it’s one of those problems that gets worse as you scale. I’ve handled this by moving away from DOM-based selectors and using more semantic approaches (ARIA roles, visible text) when possible. But honestly, sometimes you just can’t avoid it if the site doesn’t have stable class names or IDs.

What I found helps is building an abstraction layer. Instead of scattering selectors throughout your code, maintain them in one place and make the logic flexible enough to try multiple fallback selectors. It’s more upfront work, but it saves you from rewriting everything when a redesign lands.
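To make that concrete, here’s a rough sketch of what such a layer can look like. The `findFirst` helper, the `SELECTORS` registry, and the selector strings themselves are just illustrative names I’m making up, not any library’s API:

```javascript
// Centralized selector registry: one place to update after a redesign.
// Each logical element maps to an ordered list of fallback selectors,
// from most specific to most generic.
const SELECTORS = {
  loginButton: ['.login-btn', 'button[data-test="login"]', 'form button[type="submit"]'],
  userData: ['div.user-data', '[data-section="profile"]', 'main .profile'],
};

// Try each fallback in order and return the first element handle that matches.
// `page` is anything with a Puppeteer-style `$(selector)` method.
async function findFirst(page, name) {
  for (const sel of SELECTORS[name]) {
    const handle = await page.$(sel);
    if (handle) return handle;
  }
  throw new Error(`No fallback selector matched for "${name}"`);
}
```

Then your script calls `await findFirst(page, 'loginButton')` instead of hardcoding a selector at every call site, so a redesign means editing one list rather than hunting through the whole script.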

That said, if you’re dealing with sites that redesign frequently, you might want to explore visual recognition or AI-based element detection. It’s a different approach entirely but worth considering if manual updates are killing your productivity.

I ran into this exact issue on a project involving financial dashboards that got updated constantly. The problem is that even small CSS changes break everything when you’re relying on specific selectors. What worked for us was using Puppeteer’s page.waitForNavigation and building in retry logic with exponential backoff. We also started using data attributes specifically for automation, which gave us control over the structure independently of visual changes.

The real game changer, though, was waiting for specific content to appear rather than waiting for element selectors. If you’re looking for the “price” on a page, wait for a string that contains “price” rather than waiting for #price-value. It’s more flexible and survives small layout shifts.
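Here’s roughly what the retry-plus-content-wait combination looks like. The `withRetry` and `waitForText` names are mine, as are the timing defaults; `page.waitForFunction` is the actual Puppeteer API underneath:

```javascript
// Retry an async step with exponential backoff: 500ms, 1s, 2s, ...
// Useful around navigations or content waits that fail transiently.
async function withRetry(fn, { retries = 3, baseMs = 500 } = {}) {
  let lastError;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < retries) {
        await new Promise(resolve => setTimeout(resolve, baseMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}

// Content-based wait: look for visible text on the page instead of a
// specific selector, so small DOM restructurings don't break the script.
function waitForText(page, needle, timeout = 10000) {
  return page.waitForFunction(
    text => document.body.innerText.includes(text),
    { timeout },
    needle
  );
}
```

Usage is something like `await withRetry(() => waitForText(page, 'Price'))`, which keeps working even if #price-value gets renamed in the next redesign.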

The fundamental issue is that CSS-based selectors are inherently fragile because they’re tightly coupled to the presentation layer. A more resilient approach involves semantic element detection. Use Puppeteer’s evaluate method to traverse the DOM more intelligently, looking for context rather than specific selectors. Many teams have had success implementing a wrapper function that attempts multiple selector strategies in sequence, falling back gracefully. Additionally, consider using page content validation rather than element existence checks. This approach separates your automation logic from the UI implementation details, making your scripts significantly more resilient to design changes without constant maintenance overhead.
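For example, a semantic lookup via page.evaluate might look roughly like this. The helper name and the exact matching rules are illustrative, a sketch rather than a hardened implementation:

```javascript
// Semantic lookup: click an element by its visible text instead of by
// class or id. The matching function runs inside the browser context
// via page.evaluate, so it can inspect the live DOM.
async function clickByText(page, text) {
  const clicked = await page.evaluate(label => {
    const candidates = document.querySelectorAll('a, button, [role="button"]');
    for (const el of candidates) {
      if (el.innerText && el.innerText.trim().toLowerCase() === label.toLowerCase()) {
        el.click();
        return true;
      }
    }
    return false;
  }, text);
  if (!clicked) throw new Error(`No clickable element with text "${text}"`);
  return clicked;
}
```

So `await clickByText(page, 'Log in')` survives the button changing from `.login-btn` to `.auth-submit`, because the text users see tends to outlive the class names behind it.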

Selectors break fast when sites redesign. Use content-based waits instead of element selectors. Wait for text appearance rather than DOM structure. Also consider data attributes that you control separately from design changes.
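A tiny sketch of the data-attribute idea. The `data-qa` naming is a convention you add to your own markup, not a Puppeteer feature, and the `qa` helper is just an illustrative name:

```javascript
// Build selectors from automation-only data attributes. Because these
// attributes exist purely for testing, they survive visual redesigns
// that rename classes and restructure the layout.
const qa = name => `[data-qa="${name}"]`;

// Usage with a real Puppeteer page (sketch):
//   await page.waitForSelector(qa('price'));
//   const text = await page.$eval(qa('price'), el => el.innerText);
```

This only works if you control the site’s markup or can get the owners to add the attributes, which is usually the catch.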

Use AI-generated workflows that understand intent, not just selectors. They adapt better to layout changes.