How to extract website keyword ranking data from Google and import into spreadsheets

I’m trying to figure out how to get keyword ranking information for websites from Google search results and put that data into a spreadsheet automatically.

Basically I want to track how many keywords a specific website ranks for in the top 20 search results for different countries. The end goal is to have this information organized in a Google Sheet with columns showing the website, country, and keyword count.

I’m pretty new to this kind of data extraction and not sure where to begin. Would this be something I could do with Python scripting? Or maybe JavaScript would work better? Are there other tools or programming languages that might be easier for this type of project?

Any advice on the best approach or what technologies to use would be really helpful. I’m open to learning whatever programming language makes the most sense for pulling this search ranking data.

Try Google Search Console API instead of scraping. I’ve worked with ranking data for two years - scraping Google gets blocked fast, even with delays and proxies. Search Console API shows actual ranking positions for keywords you already rank for. Way more reliable than fake searches. You can hook it up to Google Sheets with Apps Script pretty easily. Catch is you need the website verified in Search Console first, so it only works for sites you own or control. For competitor stuff you’ll still need scraping or paid tools. Python with Google’s API client library handles this well.
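To make the Search Console route concrete, here's a minimal sketch assuming `google-api-python-client` and `google-auth` with a service-account key. The `searchanalytics.query` call and its `dimensions`/`rowLimit` body fields are from the real API; the counting helper, the `credentials.json` path, and the site URL are my own placeholders, not anything official:

```python
# Sketch: query the Search Console API for (query, country) rows and reduce
# them to a per-country count of keywords with an average position <= 20.
# Assumes google-api-python-client + google-auth and a verified site;
# "credentials.json" and the site URL are placeholders.
from collections import defaultdict


def count_top20_keywords(rows):
    """rows: the `rows` list from a searchanalytics.query response made with
    dimensions ["query", "country"]; returns {country: keyword_count}."""
    counts = defaultdict(int)
    for row in rows:
        _query, country = row["keys"]
        if row["position"] <= 20:
            counts[country] += 1
    return dict(counts)


def fetch_rows(service, site_url, start_date, end_date):
    """One searchanalytics.query call; page with startRow if you have >25k rows."""
    body = {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["query", "country"],
        "rowLimit": 25000,
    }
    resp = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    return resp.get("rows", [])


def main():
    # Imports deferred so the pure helpers above work without the libraries.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        "credentials.json",  # placeholder path to your service-account key
        scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
    )
    service = build("searchconsole", "v1", credentials=creds)
    rows = fetch_rows(service, "https://example.com/", "2024-01-01", "2024-01-31")
    print(count_top20_keywords(rows))
```

From there the `{country: count}` dict maps straight onto the website/country/keyword-count columns you described, whether you push it via the Sheets API or paste it from CSV.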

I’d go with Python and Selenium WebDriver for this. Built similar tracking systems before and it handles Google’s dynamic content really well. You can automate searches, parse results, and export straight to CSV files that sync with Google Sheets through their API. Main issue you’ll hit is Google’s rate limiting and blocking, so you need proper delays and rotating user agents. Start with a small keyword set to test your approach first. Don’t bother with BeautifulSoup and requests - won’t work reliably since Google uses tons of JavaScript. That’s why you need Selenium even though it’s slower.
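A rough sketch of that pipeline, split so the counting logic is plain Python you can test without a browser. The search-URL parameters (`num`, `gl`) and the `div#search a` selector are assumptions that will break whenever Google changes its markup, and the driver itself is constructed by the caller:

```python
# Sketch of the Selenium approach: one Google search per keyword, collect
# result URLs, then count how many keywords put a target domain in the
# top 20. Selector and URL format are guesses; expect to update them often.
import csv
import random
import time
from urllib.parse import urlparse


def domain_rank(result_urls, target_domain, top_n=20):
    """Return the 1-based rank of target_domain in result_urls, or None."""
    for rank, url in enumerate(result_urls[:top_n], start=1):
        host = urlparse(url).netloc.lower()
        if host == target_domain or host.endswith("." + target_domain):
            return rank
    return None


def count_ranking_keywords(results_by_keyword, target_domain, top_n=20):
    """Count keywords whose results include target_domain in the top_n."""
    return sum(
        1
        for urls in results_by_keyword.values()
        if domain_rank(urls, target_domain, top_n) is not None
    )


def scrape_serp(driver, keyword, country="us"):
    """Fetch one results page with a Selenium driver and return its URLs.

    gl= biases results toward a country; "div#search a" is a placeholder
    selector -- inspect the live page before relying on it.
    """
    driver.get(f"https://www.google.com/search?q={keyword}&num=20&gl={country}")
    time.sleep(random.uniform(3, 8))  # randomized delay to slow the block rate
    links = driver.find_elements("css selector", "div#search a")
    return [a.get_attribute("href") for a in links if a.get_attribute("href")]


def append_row(path, site, country, keyword_count):
    """Append one website/country/count row to the CSV you sync to Sheets."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([site, country, keyword_count])
```

You'd pass in something like `webdriver.Chrome()` from selenium, with a rotated user agent set via `ChromeOptions`, and loop `scrape_serp` over your keyword list per country before feeding the results to `count_ranking_keywords`.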

Agreed - paid tools like SEMrush or Ahrefs make this way easier, and Google actively fights scrapers these days. But if you want to code it yourself, Puppeteer with Node.js is a solid choice for this task.