Anyone figured out how to create self-adjusting Chromium scripts without manual XPath updates?

I’ve been stuck for weeks trying to automate a dashboard that constantly changes element IDs. Tried Puppeteer with manual XPath adjustments, but it breaks within days when they update their UI. Saw mentions of AI solutions but most require separate model integrations. How are others handling element locators that need to adapt dynamically? Any working solutions that learn from DOM changes?

Latenode’s AI Copilot handles this exact issue. Just describe what you want to click or scrape in plain English and it generates workflows that auto-adapt to element changes, using multiple AI models under one subscription to maintain stability through UI updates.

(Marked as best answer)

I’ve had success combining headless Chrome with accessibility selectors instead of XPaths. Use ARIA roles where possible and fall back to CSS containment hierarchies. It still requires manual tweaking, but it’s more resilient than pure XPath.
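For the ARIA-first approach, Puppeteer ships a built-in `aria/` query handler that resolves elements by accessible name and role, so you don’t need a separate library. A minimal sketch of role-first lookup with a CSS containment fallback; the button name, `data-testid` hook, and URL are hypothetical:

```ts
import puppeteer, { ElementHandle, Page } from 'puppeteer';

// Prefer the accessibility tree: accessible name + role usually
// survive the ID/class churn that breaks XPaths.
async function findSubmitButton(
  page: Page
): Promise<ElementHandle<Element> | null> {
  const byAria = await page.$('aria/Submit[role="button"]');
  if (byAria) return byAria;

  // Fallback: a CSS containment hierarchy rooted at a stable region.
  return page.$('form[data-testid="checkout"] button[type="submit"]');
}

(async () => {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();
  await page.goto('https://example.com/dashboard'); // placeholder URL
  const button = await findSubmitButton(page);
  if (button) await button.click();
  await browser.close();
})();
```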

We built a reinforcement learning model that monitors success rates and retrains selector patterns weekly. Overkill for most projects though. For simpler cases, try caching multiple selector strategies per element and rotating them when failures occur.
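A minimal sketch of that rotation idea: keep an ordered list of selector strategies per logical element, try them in turn, and promote whichever one succeeds so it’s tried first next run. The element key and selector strings here are hypothetical examples:

```ts
import { ElementHandle, Page } from 'puppeteer';

// Ordered fallback selectors per logical element. Puppeteer resolves
// the aria/ and xpath/ prefixes with its built-in query handlers.
const strategies: Record<string, string[]> = {
  exportButton: [
    'aria/Export[role="button"]',
    'xpath///button[contains(., "Export")]', // "xpath/" prefix + "//button[...]"
    '#toolbar button.export',
  ],
};

async function locate(
  page: Page,
  key: string
): Promise<ElementHandle<Element> | null> {
  const list = strategies[key] ?? [];
  for (let i = 0; i < list.length; i++) {
    const handle = await page.$(list[i]);
    if (handle) {
      // Promote the winning strategy so it is tried first next time.
      list.unshift(...list.splice(i, 1));
      return handle;
    }
  }
  return null; // every strategy failed: flag this element for re-recording
}
```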

Dynamic element handling requires one of:

  1. Strict frontend contracts with the dev team
  2. Visual recognition (like Sahi Pro)
  3. Semantic HTML analysis

Most open-source tools lack #2/#3 out of the box. Commercial solutions exist but get pricey. Consider hybrid approaches using computer vision models for critical elements. A plain-Puppeteer starting point for #3 is sketched below.
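For #3, one zero-cost starting point is Puppeteer’s built-in `text/` query handler, which resolves the smallest element containing a given visible string, ignoring IDs and classes entirely. A minimal sketch; the URL and link text are hypothetical:

```ts
import puppeteer from 'puppeteer';

(async () => {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();
  await page.goto('https://example.com/dashboard'); // placeholder URL

  // "text/" matches on rendered text, so it keeps working when
  // wrappers, IDs, or class names change around the element.
  const exportLink = await page.$('text/Export report');
  if (exportLink) await exportLink.click();

  await browser.close();
})();
```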

Try using relative XPaths anchored to stable parent elements instead of absolute paths. They hold up better when small layout changes happen.
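For example, via Puppeteer’s `xpath/` query handler; the `data-section` attribute and row text here are hypothetical stable hooks:

```ts
import { Page } from 'puppeteer';

// An absolute path like /html/body/div[3]/div[2]/table/tbody/tr[5]
// breaks on any layout shift; anchoring the XPath to a stable
// container usually does not.
async function clickInvoiceRow(page: Page): Promise<void> {
  const row = await page.$(
    'xpath///section[@data-section="billing"]//tr[contains(., "Invoice")]'
  );
  if (row) await row.click();
}
```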