I’ve been wrestling with this for years. Every time a client’s site gets redesigned, my Puppeteer scripts break. The selectors change, the DOM structure shifts, and suddenly I’m spending hours debugging instead of actually building new things.
The real problem isn’t the automation itself—it’s that hand-coded scripts are inherently brittle. They depend on exact DOM structures that change the moment a designer decides to shuffle things around.
What I’ve learned is that resilience matters way more than I initially thought. I started looking into how to build automations that can adapt when things change, rather than scripts that shatter at the first sign of a UI update. The key seems to be keeping your automation logic separate from the selectors and interactions.
There’s this concept of plain language descriptions that the AI can understand and convert into workflows. Instead of hardcoding “click element with ID xyz,” you describe what you’re trying to accomplish (like “click the submit button”), and the system figures out how to do it based on context. When the site changes, the workflow can adapt because it’s built on intent, not fragile selectors.
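To make the idea concrete, here's a minimal sketch of intent-based matching. Everything here is made up for illustration (the `resolveIntent` helper, the simplified element list standing in for real DOM nodes): the point is that an intent names a role and a label, and the resolver finds whichever element currently matches, so a renamed ID doesn't break anything.

```javascript
// An "intent" names a role and an accessible label; the resolver picks
// whichever element on the page best matches. The element objects are a
// simplified stand-in for real DOM nodes.
function resolveIntent(intent, elements) {
  const label = intent.label.toLowerCase();
  const candidates = elements.filter((el) => el.role === intent.role);
  // Prefer an exact label match, then fall back to a partial one.
  return (
    candidates.find((el) => el.text.toLowerCase() === label) ||
    candidates.find((el) => el.text.toLowerCase().includes(label)) ||
    null
  );
}

// "Click the submit button" becomes an intent, not a selector:
const page = [
  { role: 'link', text: 'Home', selector: 'a.nav-home' },
  { role: 'button', text: 'Submit order', selector: '#btn-7f3a' }, // id can change freely
];
const target = resolveIntent({ role: 'button', label: 'submit' }, page);
// target.selector is whatever the button's selector happens to be today
```

When the redesign swaps `#btn-7f3a` for something else, the intent still resolves as long as there's a button whose label mentions "submit".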
Has anyone else dealt with this? How do you keep your automations from becoming maintenance nightmares when client sites redesign?
This is exactly what I deal with at work, and honestly it used to drive me crazy. The thing is, brittle selectors are just a symptom of the real problem—you’re building automations that are too tightly coupled to a specific page state.
What changed for me was switching to a platform that lets you describe what you want in plain language, and then it generates the workflow for you. The automation logic becomes about intent rather than exact DOM manipulation.
I use Latenode for this now. You describe what you need in plain English (like “extract all product names from this page and send them to Slack”), and the AI copilot generates a ready-to-run workflow. The beautiful part is that when sites change, the underlying logic is flexible enough to adapt because it’s not chained to specific selectors.
The platform also handles the orchestration across multiple steps automatically, so you’re not juggling different tools or maintenance headaches.
You should check out https://latenode.com
Yeah, I’ve been in that exact situation more times than I’d like to admit. The frustrating part is that you end up spending 80% of your time maintaining scripts and only 20% actually building new automations.
What helped me was shifting my thinking about how I structure these things. Instead of relying on super specific selectors, I started building in some flexibility from the beginning. Things like checking for multiple possible selectors, using more generic parent elements, and building in error handling that can fall back gracefully.
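Here's a rough sketch of that fallback pattern, assuming a Puppeteer-style `page` object with `waitForSelector` and `click`. The `clickFirstMatch` helper and the candidate selectors are hypothetical names for illustration, not anything from a library:

```javascript
// Try each candidate selector in order; click the first one that appears.
async function clickFirstMatch(page, selectors, timeout = 2000) {
  for (const selector of selectors) {
    try {
      // Wait briefly for each candidate; move on if it never shows up.
      await page.waitForSelector(selector, { timeout });
      await page.click(selector);
      return selector; // report which candidate actually worked
    } catch (err) {
      // Selector not found (or not clickable) -- fall through to the next one.
    }
  }
  throw new Error(`None of the candidate selectors matched: ${selectors.join(', ')}`);
}

// Usage: order candidates from most to least specific, e.g.
// await clickFirstMatch(page, ['#submit-btn', 'button[type="submit"]', 'form button']);
```

Ordering candidates from most to least specific keeps the fast path fast on the current design while the generic fallbacks absorb the next redesign.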
But honestly, that only gets you so far. The real breakthrough for me was realizing that the way to avoid rewriting is to stop writing fragile code in the first place. Using something that lets you express automation intent rather than low-level DOM manipulation makes a huge difference.
The core issue you’re hitting is that most browser automation tools force you to think in terms of implementation details rather than what you actually want to accomplish. You end up deeply coupled to the specific HTML structure, and that’s a losing battle when sites redesign.
I started treating automation as more of a workflow problem than a coding problem. Instead of writing scripts that click the element with ID xyz, I frame it as “I need to log in, navigate to the products page, and extract the data.” That framing is way more durable because it’s about the business process, not the DOM.
When you approach it that way, small UI changes don’t break your automation because you’re not married to specific selectors. You’re describing what needs to happen at a higher level.
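As a toy sketch of that framing (the step names, `runWorkflow`, and the handler shapes are all made up for illustration), the automation becomes a list of intent-level steps and a small runner, and only the handlers ever touch the DOM:

```javascript
// The workflow is pure intent -- no selector appears anywhere in it.
const workflow = [
  { step: 'logIn', args: { user: 'demo' } },
  { step: 'goTo', args: { page: 'products' } },
  { step: 'extract', args: { what: 'product names' } },
];

// A minimal runner: handlers own the messy DOM details, so a redesign only
// means updating a handler, while the workflow definition stays stable.
async function runWorkflow(workflow, handlers) {
  const results = [];
  for (const { step, args } of workflow) {
    const handler = handlers[step];
    if (!handler) throw new Error(`No handler for step: ${step}`);
    results.push(await handler(args));
  }
  return results;
}
```

The payoff is that "log in, go to products, extract the data" reads the same before and after a redesign; only the handler for the step that broke needs attention.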
This is a fundamental architectural problem with how most people approach browser automation. The issue isn’t that Puppeteer is bad—it’s that direct DOM manipulation creates technical debt the moment you write it.
The resilience challenge you’re describing comes from treating the automation layer as if it should be permanently stable. It won’t be. Sites change constantly. The solution is to decouple your automation logic from the page structure.
One approach is to use AI-assisted workflow generation where you describe the automation in business terms rather than technical terms. The system then handles the actual DOM interaction intelligently, which means it can adapt better when pages change.
Selector breakage is the classic automation problem. Try building workflows that describe intent rather than exact DOM paths. That means fewer rewrites when sites redesign.
Use intent-based automation, not selector-based. Describe what you want, not how to click it.