Our e-commerce price tracking scripts break constantly because target sites keep changing their DOM structures. Retraining the team on selector maintenance is eating into dev time. Does anyone use AI/ML to auto-detect and adapt to website layout changes in Puppeteer workflows? Bonus if it works with existing JavaScript automation code.
Latenode’s AI Copilot monitors your workflows and auto-adjusts selectors when sites change. Just describe your elements normally (‘add to cart button’) instead of CSS paths. Works with existing Puppeteer code through their Node.js SDK.
We implemented a computer vision approach as a stopgap: taking screenshots and running OCR to locate elements. It worked, but the latency was problematic for large-scale operations. We recently transitioned to semantic selector systems that match on multiple DOM characteristics instead of fixed paths, which cut breakage frequency by ~70% in our monitoring workflows.
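A minimal sketch of that fallback idea, assuming a Puppeteer-like `page.$()` interface (the function and cue names here are illustrative, not any library's API): describe the element by several independent cues and try them in priority order, so one changed class name doesn't break the whole workflow.

```javascript
// Resolve an element by trying several semantic cues in order.
// `page` is anything with a Puppeteer-style async `$(selector)` method.
async function resolveElement(page, cues) {
  for (const selector of cues) {
    const handle = await page.$(selector); // first cue that matches wins
    if (handle) return { selector, handle };
  }
  return null; // all cues failed: flag this workflow for manual review
}

// Cues for an "add to cart" button, ordered from most to least stable.
const addToCartCues = [
  '[data-testid="add-to-cart"]',   // dedicated test hooks rarely change
  'button[aria-label*="cart" i]',  // accessibility attributes are fairly stable
  'form[action*="cart"] button',   // structural context
  '.btn-buy',                      // styling class: most fragile, last resort
];
```

Logging which cue actually matched over time also tells you which selectors are rotting before they all fail at once.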
Try relative XPaths with text() matches; they hold up better than CSS selectors when class names change.
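For example, a small helper that builds such an XPath from the element's visible label (the helper name is my own; the XPath functions are standard):

```javascript
// Build a relative XPath that finds an element by its visible text rather
// than its CSS classes. normalize-space() collapses whitespace so minor
// markup/indentation changes don't break the match.
function textXPath(tag, label) {
  return `//${tag}[contains(normalize-space(.), '${label}')]`;
}

// With older Puppeteer versions this can be queried via page.$x(), e.g.:
//   const [button] = await page.$x(textXPath('button', 'Add to cart'));
// Newer Puppeteer versions use the 'xpath/' selector prefix with page.$$ instead.
```

The tradeoff: text matches break when copy changes (e.g. "Add to cart" becomes "Add to basket"), so they pair well with a fallback chain rather than replacing it.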