
How I Automated Blog SEO Optimization with Claude Code

A case study on using Claude Code agents to audit and optimize an entire blog. Covers title fixes, internal linking, and measurable before/after results.

Richard Joseph Porter
9 min read
claude-code · seo · ai-development · blog-optimization · case-study

What if you could audit and optimize every blog post on your site in under an hour? That's exactly what I did using Claude Code on richardporter.dev. In this case study, I'll walk through how I set up a specialized agent, ran a full audit, and fixed the issues it found.

The Problem

As a developer, I know SEO matters, but manually optimizing blog content is tedious and hard to do consistently. Each post needs:

  • Title length verification (50-60 characters for SERP display)
  • Meta description optimization (150-160 characters)
  • Internal linking between related posts
  • Consistent tag taxonomy
  • Proper content structure

Across a growing blog, that's hours of repetitive work. I wanted to automate it.
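To make those checks concrete, here's a minimal sketch of the length checks in plain Node.js. This is my illustration, not the agent's actual code, and the title/excerpt field names are assumptions based on typical blog frontmatter:

```javascript
// Sketch: check a post's frontmatter fields against SERP display limits.
// The field names (title, excerpt) are assumptions, not the agent's code.
function checkLengths({ title = "", excerpt = "" }) {
  const issues = [];
  if (title.length < 50 || title.length > 60) {
    issues.push(`title is ${title.length} chars (want 50-60)`);
  }
  if (excerpt.length < 150 || excerpt.length > 160) {
    issues.push(`excerpt is ${excerpt.length} chars (want 150-160)`);
  }
  return issues;
}

// Example: an over-long title fails, like most of mine did.
console.log(checkLengths({
  title: "Claude Sonnet 4.5's Unexpected Impact on Claude Code Users in the Lowest Paid Subscription Tier",
  excerpt: "x".repeat(155),
})); // one title issue, no excerpt issue
```

The agent does the same kind of bookkeeping, just driven by its system prompt rather than hand-written code.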

Building the Agent

I created a specialized agent using Claude Code's plugin structure. Agent files use Markdown with YAML frontmatter:

---
name: blog-post-writer
description: Use this agent for SEO optimization and technical writing tasks
model: inherit
---

## Role
SEO/SEM Specialist & Technical Writer

## Expertise
- On-page SEO (titles, meta descriptions, headers)
- Internal linking strategy
- Content structure optimization
- Technical SEO best practices

## Available Tools
The agent uses WebSearch for trends, Read for content analysis,
Glob/Grep for linking opportunities, and Write for optimizations.

The agent's system prompt encodes specific SEO rules:

const seoRules = {
  onPageSEO: {
    titleOptimization: "50-60 characters with keyword near beginning",
    metaDescription: "150-160 characters with keyword and CTA",
    headerHierarchy: "H2 → H3 → H4 with natural keyword placement",
    keywordDensity: "1-2% natural placement throughout content"
  },
  internalLinking: {
    frequency: "2-4 related posts per article",
    anchorText: "Descriptive, keyword-rich (not 'click here')",
    strategy: "Build topic clusters through strategic linking"
  },
  contentQuality: {
    length: "1500-2500 words for technical topics",
    readability: "Flesch Reading Ease score 60+",
    structure: "Headers every 300-400 words"
  }
};
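One of these rules, keyword density, is easy to check mechanically. Here's a rough sketch (mine, not the agent's internal logic) with deliberately naive whitespace tokenization:

```javascript
// Sketch: measure keyword density against the 1-2% target above.
// Tokenization is naive (whitespace split, prefix match to tolerate
// trailing punctuation); the agent's analysis is prompt-driven, not this code.
function keywordDensity(text, keyword) {
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  const kw = keyword.toLowerCase().split(/\s+/);
  let hits = 0;
  for (let i = 0; i + kw.length <= words.length; i++) {
    if (kw.every((w, j) => words[i + j].startsWith(w))) hits++;
  }
  return words.length ? (hits * kw.length) / words.length : 0;
}

const sample = "Claude Code automates SEO audits. Claude Code reads every post.";
console.log((keywordDensity(sample, "Claude Code") * 100).toFixed(1) + "%");
// → 40.0% (tiny sample; real posts should land near 1-2%)
```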

Here's the overall workflow:

┌─────────────┐    ┌──────────────┐    ┌─────────────┐    ┌──────────────┐
│   Define    │───▶│    Audit     │───▶│  Identify   │───▶│  Implement   │
│   Agent     │    │  Blog Posts  │    │   Issues    │    │    Fixes     │
└─────────────┘    └──────────────┘    └─────────────┘    └──────────────┘
       │                  │                  │                   │
       ▼                  ▼                  ▼                   ▼
  Agent config      Read content       Title length         Auto-update
  with SEO rules    Check metadata     Meta desc length     frontmatter
  Tool access       Find links         Missing links        Add links

What the Audit Found

With the agent configured, I ran a full audit across all blog posts. The results were worse than I expected.

Titles Were Too Long

Over 70% of posts exceeded the 60-character SERP display limit. Search engines were truncating titles, which hurts click-through rates.

Before optimization:

  • "Claude Sonnet 4.5's Unexpected Impact on Claude Code Users in the Lowest Paid Subscription Tier" — 92 characters (53% over limit)
  • "Building a Blog Feature for Your Next.js Portfolio: A Complete Implementation Guide" — 83 characters (38% over)
  • "Kimi K2: The Game-Changing Alternative to Claude Code That Might Surprise You" — 83 characters (38% over)

Average title length: 72 characters (20% over optimal)

After optimization:

  • "Claude Sonnet 4.5: What Pro Plan Users Need to Know" — 58 characters
  • "Next.js Blog Tutorial: Building a Markdown Blog with SSG" — 57 characters
  • "Kimi K2: A Powerful Claude Code Alternative for Developers" — 59 characters

Every Meta Description Was Truncated

Every single post exceeded the 160-character meta description limit. Google was cutting them off mid-sentence in search results.

Before optimization:

  • Claude Sonnet 4.5 post: 234 characters (46% over limit)
  • Contact Security post: 208 characters (30% over)
  • Blog Feature post: 182 characters (14% over)

Average excerpt length: 186 characters (16% over optimal)

After optimization: All excerpts trimmed to 149-159 characters, displaying fully in SERPs.
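The agent rewrote excerpts for meaning rather than blindly cutting them, but the mechanical fallback for the length constraint looks roughly like this (my sketch; trimExcerpt is a hypothetical helper, not part of the actual workflow):

```javascript
// Sketch: trim an excerpt to the last full word under the limit.
// The agent actually rewrote excerpts; this only enforces length.
function trimExcerpt(text, max = 160) {
  if (text.length <= max) return text;
  const cut = text.slice(0, max - 2); // leave room for the ellipsis
  const lastSpace = cut.lastIndexOf(" ");
  const trimmed = (lastSpace > 0 ? cut.slice(0, lastSpace) : cut)
    .replace(/[,;:.\s]+$/, ""); // don't end mid-punctuation
  return trimmed + "...";
}
```

A rewrite that ends on a complete thought will always beat a trim like this, which is why the before/after excerpts above were rewritten by hand (well, by agent).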

Almost No Internal Links

Most posts had zero internal links. Content existed in isolation instead of forming connected topic clusters.

Before: Only 14% of posts had any internal links. After: Every post has 2-4 internal links to related content.

Under-Used Featured Posts

Only one post was marked as featured, despite several comprehensive cornerstone articles that qualified.

Inconsistent Tags

Tags mixed capitalization styles and lacked any convention:

  • "AI Development" vs "ai-development"
  • "Claude Code" vs "claude-code"
  • Missing relevant tags like "rate-limiting", "api-security"

All tags were standardized to lowercase hyphenated format.
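The normalization itself is trivial to express; a sketch of the convention (not the agent's code):

```javascript
// Sketch: normalize a tag to lowercase-hyphenated form,
// e.g. "AI Development" -> "ai-development".
function normalizeTag(tag) {
  return tag
    .trim()
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // collapse spaces/punctuation to hyphens
    .replace(/^-+|-+$/g, "");    // strip leading/trailing hyphens
}

console.log(normalizeTag("AI Development")); // → "ai-development"
console.log(normalizeTag("Claude Code"));    // → "claude-code"
```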

The Optimization Process

Titles and Meta Descriptions

Titles were rewritten to fit within 50-60 characters with the primary keyword near the beginning:

  • Before: "Claude Sonnet 4.5's Unexpected Impact on Claude Code Users in the Lowest Paid Subscription Tier" (92 chars)
  • After: "Claude Sonnet 4.5: What Pro Plan Users Need to Know" (58 chars)

Meta descriptions were condensed to 150-160 characters with a clear value proposition.

Internal Linking

The agent identified and built topic clusters by cross-linking related posts:

  • AI Tools cluster: Claude Sonnet 4.5 ↔ Kimi K2 ↔ Qwen3-Coder
  • Next.js Development cluster: Portfolio ↔ Blog Feature ↔ Dark Mode
  • Security cluster: Contact Form Security ↔ Portfolio

This keeps visitors on-site longer and helps search engines understand content relationships.
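A simple way to surface link candidates is to rank other posts by shared tags, which is roughly what the agent did by grepping the content directory. This is my sketch, and the { slug, tags } post shape is an assumption for illustration:

```javascript
// Sketch: rank other posts by shared tags to suggest internal links.
// The real agent worked from file contents via Glob/Grep; the
// { slug, tags } shape here is an assumption for illustration.
function relatedPosts(post, allPosts, max = 4) {
  return allPosts
    .filter((p) => p.slug !== post.slug)
    .map((p) => ({
      slug: p.slug,
      shared: p.tags.filter((t) => post.tags.includes(t)).length,
    }))
    .filter((p) => p.shared > 0)
    .sort((a, b) => b.shared - a.shared)
    .slice(0, max);
}

const posts = [
  { slug: "claude-sonnet-4-5", tags: ["claude-code", "ai-development"] },
  { slug: "kimi-k2", tags: ["claude-code", "ai-development"] },
  { slug: "nextjs-blog", tags: ["nextjs", "blog-optimization"] },
];
console.log(relatedPosts(posts[0], posts).map((p) => p.slug)); // → [ 'kimi-k2' ]
```

Consistent tags (see the tag cleanup above) are what make this kind of clustering possible at all.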

Featured Posts and Tags

Comprehensive posts (2000+ words covering a topic end-to-end) were promoted to featured status. All tags were converted to a consistent lowercase, hyphenated format.

The entire process — reading, analyzing, and rewriting all posts — took about 45 minutes. Doing this manually would have taken 10-15 hours.

Results

Before vs After

| Metric | Before | After | Change |
|---|---|---|---|
| Titles within 50-60 chars | 29% | 100% | +71 pts |
| Excerpts within 150-160 chars | 0% | 100% | +100 pts |
| Posts with internal links | 14% | 100% | +86 pts |
| Internal link density | minimal | 3+ per post | significant |
| Featured posts | 14% | 71% | +57 pts |
| Tag consistency | 60% | 100% | +40 pts |

What to Expect from These Changes

Based on industry benchmarks, here's what optimizations like these typically produce:

  • Title optimization tends to improve CTR by 20-30% since full titles display in SERPs instead of being truncated
  • Meta description fixes typically improve CTR by 15-25% for the same reason
  • Internal linking improves page authority distribution and can increase pages per session by 25-35%

These aren't guaranteed outcomes — every site is different. But the technical improvements (no more truncation, better content discovery) are real and measurable immediately.

How to Do This Yourself

Step 1: Set Up Your Agent

Create an agent definition file with SEO rules (title lengths, meta description limits, internal linking targets, tag conventions) and tool access (WebSearch, Read, Glob/Grep, Write). Use the Markdown with YAML frontmatter format shown above. For more on agent configuration, see my legacy codebase practitioner's guide.

Step 2: Run the Audit

cd your-project
claude

# Then prompt:
# "Audit all blog posts for SEO issues — check title length, meta descriptions,
# internal links, and tags. Then optimize each post and add cross-links
# between related content."

Step 3: Review and Deploy

git diff content/blog/   # Review every change
npm run build            # Make sure nothing is broken
git add content/blog/ && git commit -m "SEO optimization" && git push

Extending with MCP and Skills

Two approaches can take this further:

MCP with Puppeteer lets you run browser-based audits that see your site the way search engines do — including JavaScript-rendered meta tags, Core Web Vitals, and structured data. Add it to your .mcp.json:

{
  "mcpServers": {
    "puppeteer": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-puppeteer"]
    }
  }
}

Claude Skills turn your SEO workflow into a reusable command. Define a skill with your standards (title/excerpt lengths, internal linking rules, tag conventions) and invoke it with a slash command inside Claude Code. This makes optimization repeatable without rewriting prompts.
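A skill is just a Markdown file with frontmatter. Something like the sketch below; the name, fields, and wording are my illustration, so check the current Skills documentation for the exact schema:

```markdown
---
name: seo-audit
description: Audit blog posts against my SEO standards
---

When invoked, check every post under content/blog/ for:
- Titles of 50-60 characters with the keyword near the front
- Excerpts of 150-160 characters
- 2-4 internal links to related posts
- Lowercase, hyphenated tags

Report violations first, then fix them on request.
```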

Lessons Learned

Automation Pays Off Quickly

The agent approach took about 2.75 hours (including setup), compared to an estimated 11+ hours manually. That's roughly a 75% time savings, and the agent is reusable for every future post. For strategies on managing token costs during this kind of work, see my token management guide.

Consistency Matters More Than Perfection

Manual optimization leads to inconsistent results — some posts get attention, others don't. The agent applies the same standards everywhere, every time.

Data Reveals Blind Spots

I thought my posts were reasonably well-optimized. The audit showed that most titles were too long, every meta description was truncated, and internal linking was nearly non-existent. Measuring against specific criteria removes guesswork.

Don't Over-Optimize

This is worth calling out explicitly: it's easy to go too far. Stuffing keywords into every heading, manufacturing terms to rank for, and adding FAQ sections purely for search engines will hurt you. Write for people first. Optimize the technical details (character counts, linking, structure) and leave the content natural.

Frequently Asked Questions

Can Claude Code actually handle SEO tasks?

Yes, but it's best at systematic, measurable tasks — checking character counts, finding linking opportunities, standardizing tags, and rewriting titles within constraints. It's less effective at subjective calls like content quality or keyword strategy. Use it as a power tool, not a replacement for thinking.

How long before optimizations show up in rankings?

Typically 3-6 months for significant ranking changes. Google needs to re-crawl your content and observe improved engagement signals. Title and meta description changes can affect CTR within 2-4 weeks. Internal linking improvements take longer, usually 2-3 months.

What's the single most impactful optimization?

Internal linking. It's fast to implement, immediately improves user experience, and distributes page authority across your site. If you only do one thing, do this.

How do I track whether it's working?

Use Google Search Console to monitor impressions, CTR, and average position, and your analytics tool for engagement metrics like pages per session. Set a baseline before you optimize, then compare at 30, 60, and 90 days.

Wrapping Up

Using Claude Code to automate SEO optimization turned a multi-day manual process into a 45-minute automated one. The technical improvements are real: every title fits in SERPs, every meta description displays fully, and every post links to related content.

The bigger lesson is that AI agents are excellent at applying consistent rules across large amounts of content. The rules themselves still need to come from you — and you need to know when to stop optimizing.

If you're interested in other technical implementations on this site, check out how I built the blog feature with Next.js, implemented secure contact forms, and created a modern portfolio with Next.js 15. For more Claude Code productivity strategies, see my token management guide and legacy codebase practitioner's guide.


Want to try this yourself? Set up a Claude Code agent with your SEO rules and run an audit. Start with internal linking — it's the quickest win.


Richard Joseph Porter

Senior Laravel Developer with 14+ years of experience building scalable web applications. Specializing in PHP, Laravel, Vue.js, and AWS cloud infrastructure. Based in Cebu, Philippines, I help businesses modernize legacy systems and build high-performance APIs.
