Building API-Powered SEO Tools: Real Data, Automated Reports
The tools we've built so far work with files you provide manually. This tutorial goes further: connecting directly to external APIs so your tools can pull live data automatically. You'll build two real tools — a PageSpeed Insights batch checker and a Google Search Console data puller — and learn the universal technique for working with any API using Claude Code. By the end, you will:
- Understand what APIs are and how to authenticate with them
- Learn how to give Claude Code API documentation to work with
- Store API keys securely using environment variables
- Build a PageSpeed Insights batch checker with CSV output
- Build a Search Console data puller that surfaces declining queries
- Know how to adapt this pattern to any API you encounter
1. Understanding APIs — The Short Version
New to APIs? An API (Application Programming Interface) is a way for your script to talk to another service over the internet. Instead of you opening a browser and clicking around in PageSpeed Insights manually, your script sends a request directly to Google's servers and gets the data back — instantly, for hundreds of URLs at once.
- API key: A secret password that proves to the API who you are. Never share it or commit it to code. We store it in a separate config file.
- Request and response: Your script asks the API a question — e.g. "give me the PageSpeed score for this URL" — and the API sends data back as JSON.
- JSON: The format most APIs return data in. It's structured text that Python can easily read. Claude Code handles all the parsing for you.
- Rate limits: Most APIs restrict how many requests you can make per second or per day. Our scripts include delays to stay within limits.
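To make "data back as JSON" concrete, here's what reading a JSON response looks like in Python. The response body below is made up for illustration — real PageSpeed responses are far larger:

```python
# A hypothetical JSON response and how Python reads it. The fields here
# are invented for illustration, not real PageSpeed output.
import json

raw = '{"url": "https://example.com/", "score": 91, "passed": true}'
data = json.loads(raw)   # JSON text becomes a Python dict
print(data["score"])     # prints 91
print(data["passed"])    # JSON true becomes Python True
```

This is the parsing work Claude Code writes for you — you rarely touch it directly.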
2. Storing API Keys Safely
The golden rule: never put an API key directly in your code. If you share the script or accidentally push it to a shared drive, your key is exposed. Instead, we use a .env file — a simple text file that stores secrets separately from your code.
Setting up your .env file
How to tell Claude Code about your keys: You never paste your actual API keys into the Claude Code conversation. Instead, tell Claude Code: "API keys are stored in a .env file. Use the python-dotenv library to load them — never hardcode them." Claude Code will write the correct loading code automatically.
3. The Universal Technique — Giving Claude API Documentation
Claude Code was trained on a huge amount of code and API documentation, so it already knows many popular APIs. But for any API, the best approach is to paste the relevant section of the official documentation directly into your opening prompt. This removes any ambiguity and ensures Claude uses the exact endpoints, parameters, and response formats that the API currently supports.
How to find and use API documentation
Find the official docs
Search "[API name] documentation" — e.g. "PageSpeed Insights API documentation". Always use the official Google/provider source, not third-party tutorials.
Find the relevant endpoint
Look for a "Reference" or "API Reference" section. Find the endpoint that does what you need — e.g. "run pagespeed" or "query search analytics".
Copy the key sections
Copy the endpoint URL, required parameters, authentication method, and an example response JSON. You don't need the full docs page — just the relevant sections.
Paste into your Claude Code prompt
Include it in a clearly labelled section of your opening prompt: "Here is the relevant API documentation: [paste]". Claude Code will write code that matches it exactly.
4. Tool 1: PageSpeed Insights Batch Checker
What it does
Takes a CSV of URLs, calls the PageSpeed Insights API for each one (for both mobile and desktop), and outputs a report showing Core Web Vitals scores, performance category, and the top 3 opportunities for each URL — the kind of report that previously meant opening each URL manually in PageSpeed and copying numbers into a spreadsheet.
Step 1 — Get your Google API key
Go to the Google Cloud Console
Visit console.cloud.google.com and sign in with your Google account.
Create or select a project
Create a new project (e.g. "SEO Tools") or use an existing one.
Enable the PageSpeed Insights API
Go to APIs & Services → Library → search "PageSpeed Insights API" → Enable.
Create an API key
Go to APIs & Services → Credentials → Create Credentials → API key. Copy it and paste it into your .env file as GOOGLE_API_KEY.
Free tier: The PageSpeed Insights API is free up to 25,000 requests per day — more than enough for agency use. No billing setup required for this volume.
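With the key in place, the heart of the script Claude Code writes is a single GET request per URL to the v5 `runPagespeed` endpoint. A minimal sketch (assumes the `requests` library; the helper names `psi_score` and `extract_score` are ours, not part of any library):

```python
# Minimal sketch of one PageSpeed Insights v5 call (pip install requests).
# Helper names are illustrative, not from any library.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def extract_score(data):
    # The API reports the score as 0.0-1.0; multiply for the familiar 0-100
    return round(data["lighthouseResult"]["categories"]["performance"]["score"] * 100)

def psi_score(url, strategy, api_key):
    """Return the 0-100 performance score for one URL ('mobile' or 'desktop')."""
    resp = requests.get(
        PSI_ENDPOINT,
        params={"url": url, "strategy": strategy, "key": api_key},
        timeout=60,
    )
    resp.raise_for_status()
    return extract_score(resp.json())

# Usage (needs a real key and network access):
# print(psi_score("https://example.com/", "mobile", api_key))
```

The batch version simply loops over the CSV of URLs, calling this twice per URL (mobile and desktop) with a short delay between requests.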
Step 2 — The Claude Code prompt
Expected output sample
```
Checking https://example.com/ (mobile)... score: 91
Checking https://example.com/ (desktop)... score: 97
Checking https://example.com/services (mobile)... score: 44
Checking https://example.com/services (desktop)... score: 78
Checking https://example.com/blog/seo-guide (mobile)... score: 61
✓ Saved raw data to output/pagespeed_report.csv
✓ Saved HTML report to output/pagespeed_summary.html
Summary: avg mobile 65 | avg desktop 84 | 1 URL failing CWV thresholds
```
Useful follow-up improvements
| Improvement | Follow-up prompt |
|---|---|
| Filter to failing URLs only | "Add a --failing-only flag that outputs only URLs with mobile score below 50." |
| Add score trends | "Modify the script to load a previous report CSV and add a trend column showing score change since last run." |
| Slack notification | "After the run, send a Slack webhook message summarising the results. The webhook URL should come from .env as SLACK_WEBHOOK_URL." |
5. Tool 2: Google Search Console Data Puller
What it does
Connects to the Google Search Console API, pulls query performance data for any property, and surfaces the top declining queries week-over-week — without you having to log into GSC, export CSVs, and compare them manually each week.
Authentication is more complex here. The Search Console API uses OAuth2 (not a simple API key) because it accesses private data. This requires a one-time setup involving a credentials JSON file. The steps below walk through it — it looks intimidating but Claude Code handles all the hard parts once you have the credentials file.
Step 1 — Set up OAuth2 credentials
Enable the Search Console API
In Google Cloud Console, go to APIs & Services → Library → search "Google Search Console API" → Enable.
Create OAuth2 credentials
Go to APIs & Services → Credentials → Create Credentials → OAuth client ID. Choose "Desktop app". Name it "SEO Tools".
Download the credentials file
Click the download icon next to your new OAuth client. Save the downloaded JSON file to seo-tools/config/gsc_credentials.json.
First run — authorise access
The first time you run the script, a browser window will open asking you to grant access to your Google account's Search Console data. After approving, a token is saved locally so you won't need to do this again.
Which account to use? Authorise with the Google account that has access to the GSC properties you want to query. For agency work, you'll typically have a service account or verified access to client properties. Claude Code can be asked to support multiple properties if needed.
Step 2 — The Claude Code prompt
Sample summary output
```
GSC Declining Queries Report
Generated: 2026-02-19 | Site: https://example.com/
Period: Last 28 days vs previous 28 days
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Total queries analysed: 187
Declining queries: 23 (total click loss: -412)
Dropped queries: 4

TOP 10 DECLINING QUERIES
━━━━━━━━━━━━━━━━━━━━━━━━
technical seo audit     clicks: 84→41 (-51%)   pos: 3.2→5.8
core web vitals guide   clicks: 67→29 (-57%)   pos: 4.1→7.3
hreflang checker        clicks: 55→38 (-31%)   pos: 6.2→6.8
...
```
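The query behind a report like this boils down to one Search Analytics call per date window. A sketch (assumes an authorised `service` object from the OAuth setup; `query_clicks` is our illustrative helper name):

```python
# Sketch of the searchanalytics.query call behind a declining-queries
# report. Assumes an authorised `service`; `query_clicks` is illustrative.
from datetime import date, timedelta

def query_clicks(service, site_url, start, end):
    """Return {query: clicks} for one date window."""
    body = {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["query"],
        "rowLimit": 1000,
    }
    resp = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    return {row["keys"][0]: row["clicks"] for row in resp.get("rows", [])}

# Pull two adjacent 28-day windows, then keep queries that lost clicks:
# today = date.today()
# current = query_clicks(service, site, today - timedelta(days=28), today)
# previous = query_clicks(service, site,
#                         today - timedelta(days=56), today - timedelta(days=28))
# declining = {q: (previous[q], c) for q, c in current.items()
#              if q in previous and c < previous[q]}
```

The rest of the tool is formatting: sorting the declines, computing percentages, and printing the summary.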
Automating weekly runs
Once the script is working, ask Claude Code to help you schedule it to run automatically every Monday morning:
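On Mac or Linux the usual answer is a crontab entry. A hypothetical example — the paths and script name are placeholders for whatever Claude Code names your script:

```shell
# Hypothetical crontab entry (edit with `crontab -e`): run the GSC report
# at 07:00 every Monday. Adjust paths to your own seo-tools folder.
0 7 * * 1 cd /path/to/seo-tools && /usr/bin/python3 scripts/gsc_declining_queries.py >> output/gsc_cron.log 2>&1
```

On Windows, ask Claude Code for the equivalent Task Scheduler setup instead.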
6. The Universal Pattern — Applying This to Any API
Every API integration you'll ever build follows the same pattern. Once you've built these two tools, you can apply the same approach to any data source:
| Step | What to do | Claude Code's role |
|---|---|---|
| 1. Get credentials | Sign up, create API key or OAuth app in the provider's console | You do this manually — credentials can't be automated |
| 2. Store secrets | Add keys to .env file | Claude Code writes the loading code using python-dotenv |
| 3. Find the endpoint | Read the API docs and find the relevant endpoint + parameters | You paste the docs; Claude Code interprets them |
| 4. Write the integration | Describe what you want to pull and in what format | Claude Code writes all the request/response handling |
| 5. Handle errors | Tell Claude Code what edge cases to handle | Claude Code adds try/catch, retries, rate limit delays |
| 6. Format output | Specify CSV, HTML, JSON, or plain text output | Claude Code writes all the formatting and file writing |
Other APIs worth integrating using this same pattern:
- Ahrefs / Semrush API — pull backlink data or keyword rankings for automated monthly reports
- Google Analytics 4 (Data API) — pull traffic by landing page to correlate with SEO changes
- Screaming Frog — trigger crawls and export results programmatically via its headless command-line mode
- ValueSERP / SerpAPI — pull SERP features data for tracking rich result ownership
7. Practice Exercises
Build Tool 1 using the prompt in Section 4:
- Get your Google API key from Google Cloud Console and add it to .env
- Create a urls.csv with 5–10 URLs from one of your client sites
- Open Claude Code in your seo-tools folder and use the prompt from Section 4
- Let Claude Code build and run the script — review the HTML report in your browser
- Follow up: "Sort the HTML report by mobile score ascending so worst pages appear first"
Build Tool 2 for one of your GSC properties:
- Set up OAuth2 credentials in Google Cloud Console (follow Step 1 in Section 5)
- Save the credentials JSON to config/gsc_credentials.json
- Use the Claude Code prompt from Section 5
- On first run, complete the browser authorisation flow
- Review the declining queries output — do the flagged queries match your expectations from manually reviewing GSC?
Choose one additional API and build a simple integration:
- Pick an API your team uses: SerpAPI, ValueSERP, Ahrefs, or GA4
- Find the relevant API documentation endpoint for the data you want
- Copy the key sections (endpoint URL, parameters, example response)
- Write a Claude Code prompt using the universal pattern from Section 6
- Build, test, and save the script to your scripts/ folder
8. Summary
Key takeaway: API integrations unlock the biggest time savings in SEO tool building — they replace manual data collection entirely. Once the PageSpeed checker and GSC puller are running, tasks that previously took 30–60 minutes of manual work happen in seconds, on a schedule, without anyone touching them.