I’ve been stuck for months trying to figure out how to automate some repetitive scraping work. Every time the website changed its layout even slightly, my scripts would break and I’d have to rewrite everything from scratch. It was honestly draining.
Last week I tried something different. Instead of writing out selenium scripts line by line, I just described what I wanted to happen in plain English. Basically told the system: “log into this site, grab the product prices, extract the titles, and dump it into a spreadsheet.” Nothing fancy, just what I actually needed.
It turned out the AI could translate that into a working workflow that handled layout changes without me touching the code again. When the site changed some CSS classes, the workflow adapted instead of just dying.
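For anyone curious what "adapting" might look like under the hood, here's a minimal sketch of one plausible mechanism: instead of hard-coding a single CSS class, the generated workflow keeps an ordered list of candidate class names and falls back until one matches. This is my guess at the technique, not a description of any specific tool; the class names and function names below are made up, and it uses only the stdlib HTML parser to keep the example self-contained.

```python
# Sketch of a "resilient selector": try an ordered fallback list of
# candidate class names instead of one hard-coded CSS class.
# All class names here are hypothetical examples.
from html.parser import HTMLParser

class PriceExtractor(HTMLParser):
    """Collects text from tags whose class matches any candidate."""
    def __init__(self, candidates):
        super().__init__()
        self.candidates = candidates  # ordered fallback list
        self.capturing = False
        self.results = []

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if any(c in classes for c in self.candidates):
            self.capturing = True

    def handle_data(self, data):
        if self.capturing:
            self.results.append(data.strip())
            self.capturing = False

def extract_prices(html, candidates=("price", "product-price", "amount")):
    parser = PriceExtractor(candidates)
    parser.feed(html)
    return parser.results

# The site renames "price" to "product-price"; the fallback list still matches.
old_html = '<div class="price">$10</div>'
new_html = '<div class="product-price">$10</div>'
print(extract_prices(old_html), extract_prices(new_html))
```

Obviously a real system would do something smarter (regenerating selectors on failure, matching on surrounding text, etc.), but even this dumb fallback list would have saved me a few rewrites.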
I’m curious though—has anyone else tried this approach? How reliable has it been for you when sites actually went through real UI overhauls, not just minor tweaks?