I’ve been dealing with this headache for months now. Every time a site I’m scraping makes even a small layout change, my selectors break and I’m back debugging. It’s gotten to the point where I’m spending more time maintaining my scripts than building new ones.
I’ve heard that AI can help generate more resilient workflows that don’t rely on brittle selectors, but I’m skeptical. Has anyone actually gotten this to work in practice? I’ve tried a few AI code generators and they just spit out the same fragile selector-based stuff I was doing manually.
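To be concrete about what I mean by “fragile selector-based stuff” versus something more resilient, here’s a minimal sketch using only the standard library (the markup, class names, and function names are all made up for illustration). The brittle version pins itself to one exact class name; the resilient version matches on the shape of the content instead:

```python
import re
import xml.etree.ElementTree as ET

# Hypothetical markup for the same product tile, before and after a redesign.
OLD_HTML = """
<div>
  <div class="product-card">
    <span class="price-v2">$19.99</span>
  </div>
</div>
"""
NEW_HTML = """
<div>
  <section class="tile">
    <p class="amount">$19.99</p>
  </section>
</div>
"""

def brittle_price(html):
    # Pinned to one exact tag and class name -- breaks on any redesign.
    root = ET.fromstring(html)
    node = root.find(".//span[@class='price-v2']")
    return node.text if node is not None else None

def resilient_price(html):
    # Match by content shape (a currency pattern), not by markup details.
    root = ET.fromstring(html)
    for node in root.iter():
        if node.text and re.fullmatch(r"\$\d+\.\d{2}", node.text.strip()):
            return node.text.strip()
    return None

print(brittle_price(OLD_HTML))    # $19.99
print(brittle_price(NEW_HTML))    # None -- the selector broke
print(resilient_price(NEW_HTML))  # $19.99 -- survives the redesign
```

The content-based version is obviously slower and can misfire on pages with multiple prices, so it’s a trade-off, not a free win. What I keep hearing is that AI tooling can generate (and regenerate) this kind of intent-based extraction automatically, which is exactly the claim I’m trying to verify.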
The material I’ve found talks about AI-powered code writing and debugging, plus headless-browser features for things like form completion and web scraping. But what’s the actual experience: do these AI-generated workflows hold up when a site redesigns its login form or restructures its product listing pages?
I’m not looking for a magic bullet, just wondering if there’s a real workflow out there that can adapt when things change instead of breaking immediately. What’s your experience been?