Track your website's Google Search rankings with extended position tracking (up to position 100).
| File | Description |
|---|---|
| fetcher.py | Main script |
| credentials.txt | Your API credentials (fill this!) |
| keywords.xlsx | Your keyword list (you create this) |
| serp_progress/ | Auto-created: stores progress |
| serp_config.json | Auto-created: your configuration |
| serp_scraper.log | Auto-created: log file |
Note: `serp_progress/` keeps your progress so you can resume the next day when your API quota refreshes.
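The resume behavior could be sketched like this (a minimal illustration, not the actual implementation; the file name `progress.json` and the JSON-list format are assumptions):

```python
import json
from pathlib import Path

PROGRESS_DIR = Path("serp_progress")          # assumed: matches the folder above
PROGRESS_FILE = PROGRESS_DIR / "progress.json"  # hypothetical file name

def load_done_keywords() -> set:
    """Return the set of keywords already fetched, if a progress file exists."""
    if PROGRESS_FILE.exists():
        return set(json.loads(PROGRESS_FILE.read_text()))
    return set()

def mark_done(keyword: str) -> None:
    """Record a keyword as fetched so a later run can skip it."""
    PROGRESS_DIR.mkdir(exist_ok=True)
    done = load_done_keywords()
    done.add(keyword)
    PROGRESS_FILE.write_text(json.dumps(sorted(done)))
```

On each run, the script would skip any keyword already in `load_done_keywords()` and call `mark_done()` after each successful fetch.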
Required: `pip install pandas requests openpyxl`

Optional (for stealth scraping): `pip install beautifulsoup4 fake-useragent cloudscraper`
You need a Google API Key and Search Engine ID (cx).
- Go to Google Cloud Console
- Create a new project (or select existing)
- Enable Custom Search API
- Go to Credentials → Create Credentials → API Key
- Copy your API key
- Go to Programmable Search Engine
- Click Add to create new search engine
- Under "Sites to search", enter `*` (searches the entire web)
- Name it (e.g., "SERP Tracker")
- Click Create
- Copy the Search Engine ID (cx)
- In your search engine settings
- Scroll to Search features → Region
- Select United Kingdom (or your target country)
- Toggle Region-restricted results to ON
```
api_key=YOUR_GOOGLE_API_KEY_HERE
cx=YOUR_SEARCH_ENGINE_ID_HERE
```
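A single query to the Custom Search JSON API looks roughly like this (a sketch, not the script's actual code; the endpoint and the `key`, `cx`, `q`, `start`, and `num` parameters are Google's documented ones, the function itself is illustrative):

```python
import requests

def fetch_page(api_key: str, cx: str, query: str, start: int = 1) -> list:
    """Fetch one page (up to 10 results) from the Custom Search JSON API.

    `start` is 1-based: start=1 covers positions 1-10, start=11 covers
    11-20, and so on up to start=91 for positions 91-100.
    """
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": api_key, "cx": cx, "q": query, "start": start, "num": 10},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("items", [])

# Covering positions 1-100 therefore costs 10 queries per keyword:
#   for start in range(1, 100, 10):
#       results = fetch_page(key, cx, "industrial hinges", start)
```

This is why extended tracking is quota-hungry: each keyword consumes 10 API queries to reach position 100.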
Create an Excel file (.xlsx) or CSV:
| Keyword |
|---|
| industrial hinges |
| cabinet locks |
| panel fasteners |
```
cd path/to/folder
python fetcher.py
```
The script generates an Excel report with:
- Summary - Overview stats
- Top 10 Competitors - All top 10 results per keyword
- Your Site Positions - Where your site ranks
- Competitor Analysis - Domain frequency & avg position
- Not Ranking - Keywords where you don't appear
- Free: 100 queries/day
- Paid: $5 per 1,000 queries
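To plan your runs, the quota arithmetic works out like this (illustrative numbers; the 10-queries-per-keyword figure assumes fetching all 100 positions in pages of 10):

```python
QUERIES_PER_KEYWORD = 10   # positions 1-100, fetched 10 results at a time
FREE_DAILY_QUOTA = 100     # free tier: 100 queries/day
COST_PER_1000 = 5.0        # paid tier: $5 per 1,000 queries

keywords = 45  # example keyword count
queries_needed = keywords * QUERIES_PER_KEYWORD              # 450 queries
days_on_free_tier = -(-queries_needed // FREE_DAILY_QUOTA)   # ceil division -> 5 days
paid_cost = queries_needed / 1000 * COST_PER_1000            # $2.25 if paid instead
```

So the free tier covers 10 keywords per day end to end, which is exactly why the progress file matters.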
The script saves progress automatically. Run it again tomorrow to continue!