# seo-google

> Google SEO APIs providing Search Console analytics, URL inspection, PageSpeed Insights, CrUX field data with 25-week history, Indexing API submissions, GA4 organic traffic reporting, and YouTube search and NLP entity analysis.

**Use case**: Pull real Google field data without leaving Claude Code

**Canonical URL**: https://agentcookbooks.com/skills/seo-google/

**Topics**: claude-code, skills, marketing, seo

**Trigger phrases**: "search console", "GSC", "PageSpeed", "CrUX", "impressions"

**Source**: [AgriciDaniel](https://github.com/AgriciDaniel/claude-seo/tree/main/skills/seo-google)

**License**: MIT

---

## What it does

`seo-google` is a Claude Code skill from AgriciDaniel's [claude-seo repo](https://github.com/AgriciDaniel/claude-seo). It bridges the gap between static analysis and Google's real-time field data: actual Chrome user Core Web Vitals, real indexation status per URL, search performance (clicks, impressions, CTR, position), and organic traffic trends. All APIs are free — they require a Google Cloud project with an API key and/or service account, but cost nothing per call.

The skill is organized into four credential tiers:

- **Tier 0** (API key only): PageSpeed Insights, CrUX, CrUX 25-week history, YouTube search, and NLP entity analysis
- **Tier 1** (service account): adds Search Console, URL Inspection, sitemap status, and the Indexing API
- **Tier 2**: adds GA4 organic traffic
- **Tier 3** (Google Ads account): adds Keyword Planner volume

The skill detects your credential tier at startup and communicates it before running any command. It also offers PDF/HTML report generation via `/seo google report` after any analysis.
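
The tier gating can be sketched as a simple mapping from available credentials to the highest unlocked tier. This is a hypothetical illustration, not the skill's actual contract: the real check lives in `scripts/google_auth.py`, and the environment-variable names below are assumptions.

```python
# Illustrative tier detection; variable names are assumptions, the real
# logic is in scripts/google_auth.py of the claude-seo repo.
import os

def detect_tier(env=os.environ):
    """Return the highest credential tier the environment unlocks (-1 = none)."""
    tier = -1
    if env.get("GOOGLE_API_KEY"):                   # Tier 0: PSI, CrUX, YouTube, NLP
        tier = 0
    if env.get("GOOGLE_APPLICATION_CREDENTIALS"):   # Tier 1: GSC, Inspection, Indexing
        tier = max(tier, 1)
        if env.get("GA4_PROPERTY_ID"):              # Tier 2: GA4 organic traffic
            tier = max(tier, 2)
    if env.get("GOOGLE_ADS_CUSTOMER_ID"):           # Tier 3: Keyword Planner volume
        tier = max(tier, 3)
    return tier
```

Passing an explicit dict instead of reading `os.environ` keeps the function easy to test and mirrors how the skill reports "what to configure to unlock the next tier".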

## When to use it

Reach for it when:

- You need real CrUX field data rather than Lighthouse lab scores — field data reflects actual users' experience
- You want to verify whether a URL is actually indexed by Google (URL Inspection returns the real verdict, not an inferred one)
- You need Search Console click/impression data to find quick-win keywords at positions 4–10
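
The quick-win pass in the last bullet is just a filter over Search Console rows. A minimal sketch, assuming the query-level JSON shape the Search Analytics API returns (`keys`, `clicks`, `impressions`, `ctr`, `position`); the thresholds are arbitrary, not the skill's defaults:

```python
# Illustrative quick-win filter over Search Console query rows.
# Thresholds are arbitrary examples, not the skill's actual defaults.
def quick_wins(rows, min_impressions=500, pos_lo=4.0, pos_hi=10.0):
    """Queries ranking just off page one with real search demand."""
    hits = [r for r in rows
            if pos_lo <= r["position"] <= pos_hi
            and r["impressions"] >= min_impressions]
    # Highest-demand opportunities first.
    return sorted(hits, key=lambda r: r["impressions"], reverse=True)
```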

When *not* to reach for it:

- Google API credentials are not configured yet — run `/seo google setup` first and the skill walks you through configuration step by step
- You need CrUX data for a low-traffic site; CrUX requires sufficient Chrome user volume and returns a 404-equivalent when data is insufficient (not an auth error — the skill distinguishes these)
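
The 404-vs-auth distinction in the last bullet can be sketched against the real CrUX `queryRecord` endpoint. The `classify_crux_status` helper and its labels are our own framing of the behavior described above, not the skill's code:

```python
# Sketch of distinguishing "no field data" from an auth failure when
# querying CrUX. Endpoint is the real CrUX API; the classifier is ours.
import json
import urllib.error
import urllib.request

CRUX = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def classify_crux_status(status):
    if status == 404:
        return "no-field-data"   # site below CrUX's Chrome-traffic threshold
    if status in (401, 403):
        return "auth-error"      # bad or missing API key
    return "ok" if status == 200 else "other"

def query_crux(api_key, url):
    body = json.dumps({"url": url}).encode()
    req = urllib.request.Request(f"{CRUX}?key={api_key}", data=body,
                                 headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(req) as resp:
            return classify_crux_status(resp.status), json.load(resp)
    except urllib.error.HTTPError as e:
        return classify_crux_status(e.code), None
```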

## Install

Copy the [`seo-google` SKILL.md](https://github.com/AgriciDaniel/claude-seo/tree/main/skills/seo-google) into `.claude/skills/seo-google/` along with the `scripts/` directory from the repo root.

Trigger phrases: "search console", "GSC", "PageSpeed", "CrUX", "field data", "indexing API", "GA4 organic", "URL inspection", "real CWV data", "impressions", "clicks", "CTR", "LCP", "INP", "CLS".

Run `/seo google setup` to configure credentials. Commands are tier-gated: the skill tells you which tier is active before every command.

## What a session looks like

A typical session has three phases:

1. **Credential detection.** The skill runs `python scripts/google_auth.py --check --json` and reports which tier is active, which commands are available, and what to configure to unlock the next tier. This takes a few seconds and runs before any other command.
2. **Data retrieval.** Commands follow the pattern of fetching one data type at a time: `pagespeed <url>` for lab + field CWV, `gsc <property>` for search performance, `inspect <url>` for indexation status. Each uses a dedicated script with JSON output.
3. **Report generation.** After collecting data, `/seo google report <type>` generates a professional PDF with charts: CWV gauges and 25-week trend lines for `cwv-audit`, query tables with quick-win highlights for `gsc-performance`, and coverage donut charts for `indexation` reports.
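
Step 2 for `pagespeed <url>` can be sketched against the public PageSpeed Insights v5 endpoint, which returns both lab data (`lighthouseResult`) and CrUX field data (`loadingExperience`) in one response. The key paths below follow the public API; error handling and the skill's own script structure are elided:

```python
# Sketch of fetching lab + field CWV in one PSI v5 call; summarize()
# pulls the headline numbers, error handling omitted for brevity.
import json
import urllib.parse
import urllib.request

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def summarize(psi_response):
    lab = psi_response["lighthouseResult"]["categories"]["performance"]["score"]
    field = psi_response.get("loadingExperience", {}).get("metrics", {})
    lcp = field.get("LARGEST_CONTENTFUL_PAINT_MS", {})
    return {
        "lab_performance": round(lab * 100),       # Lighthouse 0-100 score
        "field_lcp_p75_ms": lcp.get("percentile"), # real-user 75th percentile
        "field_lcp_rating": lcp.get("category"),   # FAST / AVERAGE / SLOW
    }

def fetch(url, api_key):
    qs = f"?url={urllib.parse.quote(url, safe='')}&key={api_key}"
    with urllib.request.urlopen(PSI + qs) as resp:
        return summarize(json.load(resp))
```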

## Receipts

**Works well:** The `crux-history <url>` command showing 25 weeks of Core Web Vitals field data is genuinely difficult to replicate without this setup — it shows whether your CWV improvements are actually landing for real users or just in lab conditions. The quick-win detection in GSC (queries at positions 4–10 with high impressions) is a reliable first pass for finding content to optimize.
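
The "is the fix actually landing" question reduces to comparing early and recent p75 values in the CrUX History response. A rough sketch, assuming the History API's `percentilesTimeseries.p75s` shape (one entry per weekly collection period); the four-week comparison window is arbitrary:

```python
# Rough sketch of judging an LCP trend from a CrUX History record.
# Assumes metrics -> percentilesTimeseries -> p75s; window is arbitrary.
def lcp_trend(history_record, window=4):
    series = (history_record["metrics"]["largest_contentful_paint"]
              ["percentilesTimeseries"]["p75s"])
    p75s = [float(v) for v in series if v is not None]  # skip gap weeks
    n = min(window, len(p75s))
    earlier = sum(p75s[:n]) / n
    recent = sum(p75s[-n:]) / n
    return {"earlier_p75": earlier, "recent_p75": recent,
            "improved": recent < earlier}
```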

**Backfires:** The Indexing API is officially supported only for JobPosting and BroadcastEvent/VideoObject pages. The skill documents this restriction clearly and informs the user — but some people expect it to force-index any URL, which it does not reliably do for general content.

**Pattern that works:** Use `inspect-batch` after any migration that touches URL structure or canonicals. Batch inspection of your top 50 pages takes a few minutes and tells you the real indexation verdict for each — it catches canonicalization mistakes that won't show up in GA4 for two to three weeks.
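
The canonicalization check in that pattern boils down to comparing, per URL, the canonical Google chose against the one the page declares. A sketch assuming the URL Inspection API's `indexStatusResult` field names (`verdict`, `googleCanonical`, `userCanonical`); auth (a service-account bearer token) and the HTTP calls are elided:

```python
# Sketch of flagging canonical mismatches from batched URL Inspection
# results; assumes the indexStatusResult field names, auth elided.
def canonical_mismatches(results):
    """results: {url: inspectionResult dict from index:inspect}."""
    bad = []
    for url, r in results.items():
        idx = r.get("indexStatusResult", {})
        google = idx.get("googleCanonical")
        declared = idx.get("userCanonical")
        if google and declared and google != declared:
            bad.append({"url": url, "declared": declared,
                        "google": google, "verdict": idx.get("verdict")})
    return bad
```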

## Source and attribution

Originally written by [AgriciDaniel](https://github.com/AgriciDaniel). The canonical SKILL.md and supporting scripts live in the [`seo-google` folder](https://github.com/AgriciDaniel/claude-seo/tree/main/skills/seo-google) of the [claude-seo repository](https://github.com/AgriciDaniel/claude-seo).

License: MIT. Install, adapt, and redistribute with attribution preserved.

This page documents the skill from a practitioner's perspective. For the formal spec and updates, defer to the source repo.