
Senior Technical Product Manager - API · SE Ranking

Jan 2026 · 4 min read

MCP Driving 30% of Signups

First to ship MCP. Iterated from local to remote. Now 30% of signups

Tags: api · ai-ml · 0-to-1 · gtm · pmf

TL;DR

First to ship Model Context Protocol (MCP) in the SEO/search space. Iterated from local/subscription (poor retention) to remote/pay-as-you-go (signups skyrocketed). Now accounts for 30% of company signups—SEO professionals querying data via ChatGPT, Claude, and Gemini.

Context

AI agents needed easy ways to access SEO data. Model Context Protocol (MCP) was emerging as the standard for connecting AI assistants to external data sources. Being first to market could capture a new segment.

Problem

  • AI agents couldn't easily access SEO data
  • Technical barrier was high for non-developers
  • Traditional API required coding knowledge
  • SEO professionals wanted natural language access to data

What I Did

V1: Local Setup + Subscription (Failed)

  • Shipped MCP integration running locally
  • Required technical setup (Node.js, configuration files)
  • Subscription-based pricing model
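To illustrate the friction, a local MCP integration of this era typically meant hand-editing a client config file and having Node.js installed first. The config below is a generic sketch of that pattern; the package name and env var are hypothetical, not the actual SE Ranking integration:

```json
{
  "mcpServers": {
    "seranking": {
      "command": "npx",
      "args": ["-y", "seranking-mcp-server"],
      "env": { "API_KEY": "<your-api-key>" }
    }
  }
}
```

Every field here is a place for a non-developer to get stuck: finding the config file, installing Node.js so `npx` works, and pasting an API key correctly.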

What went wrong:

  • Setup abandonment: 60%+ of users dropped off during the Node.js installation step. We were asking SEO professionals to become developers.
  • Wrong pricing model: Monthly subscription made sense for heavy users, but most users wanted occasional queries. The commitment felt too high for "let me just try this."
  • Support burden: The few users who completed setup generated constant support tickets about configuration issues, updates breaking, and localhost problems.
  • Retention cliff: Week 1 retention was under 15%. Users tried it once, hit friction, and never came back.

The signal we almost missed: Users who successfully set it up loved it. The product worked—the delivery mechanism didn't. This insight drove V2.

V2: Remote + Pay-as-You-Go

  • Pivoted to remote-hosted MCP server
  • Pay-as-you-go pricing (no subscription required)
  • One-click setup in Claude, ChatGPT, and Gemini
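By contrast, a remote-hosted MCP server reduces setup to pointing the client at a URL (or clicking "add connector" in clients that expose a UI). A sketch of the config-file form, with a hypothetical URL and field names that vary by client:

```json
{
  "mcpServers": {
    "seranking": {
      "url": "https://mcp.seranking.example/v1"
    }
  }
}
```

No local runtime, no files to keep updated, and authentication is handled server-side.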

Result: Signups skyrocketed; this channel now drives 30% of company signups.

Key Decisions

Bet Early on Emerging Protocol

Committed to MCP before it was widely adopted. Being first established us as the go-to SEO data source for AI assistants.

Remove Friction at Every Step

V1 failed because of friction. V2 succeeded by eliminating setup complexity and subscription commitment.

Target Non-Developers

Realized the market wasn't AI developers (who use the API directly) but SEO professionals who wanted natural language access.

Technical Details

  • MCP Server: Remote-hosted server handling authentication and rate limiting
  • Pay-as-You-Go Metering: Usage-based billing integrated with existing payment infrastructure
  • Multi-Platform Support: Compatible with Claude, ChatGPT, and Gemini
  • Natural Language Interface: Optimized prompts and responses for conversational queries
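The metering and rate-limiting pieces above can be sketched in a few lines. This is a minimal illustration of usage-based billing with a sliding-window rate limit; the price, limits, and class names are assumptions for the example, not SE Ranking's actual billing logic:

```python
import time
from collections import defaultdict

# Hypothetical parameters for illustration only.
PRICE_PER_QUERY = 0.01   # unit price, USD
RATE_LIMIT = 60          # max queries per key per window
WINDOW_SECONDS = 60.0

class UsageMeter:
    def __init__(self):
        self.usage = defaultdict(int)    # api_key -> billable query count
        self.window = defaultdict(list)  # api_key -> recent request timestamps

    def allow(self, api_key, now=None):
        """Sliding-window rate limit: True if the request may proceed."""
        now = time.monotonic() if now is None else now
        recent = [t for t in self.window[api_key] if now - t < WINDOW_SECONDS]
        if len(recent) >= RATE_LIMIT:
            self.window[api_key] = recent
            return False
        recent.append(now)
        self.window[api_key] = recent
        return True

    def record(self, api_key, queries=1):
        """Meter a completed query for pay-as-you-go billing."""
        self.usage[api_key] += queries

    def amount_due(self, api_key):
        return self.usage[api_key] * PRICE_PER_QUERY

meter = UsageMeter()
if meter.allow("demo-key"):
    meter.record("demo-key")
print(round(meter.amount_due("demo-key"), 2))  # 0.01
```

The point of the sketch: with pay-as-you-go, the billing unit is the query itself, so "let me just try this" costs cents instead of a monthly commitment.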

Results

  • 30% of company signups
  • First to ship MCP in the space
  • 3 AI platforms supported
  • SEO professionals querying data via natural language on ChatGPT, Claude, and Gemini
  • Opened AI agent and no-code markets (n8n, Make.com)
  • Created new acquisition channel independent of traditional marketing

Lessons Learned

  1. Fail fast, iterate faster: V1 was wrong, but we learned in 3 weeks what might have taken 6 months of planning to hypothesize. The cost of V1 was low; the learnings were invaluable.
  2. Friction kills products: Same core technology, completely different outcomes. V1→V2 was a packaging change, not a technical rebuild. Sometimes the product isn't the problem—the delivery is.
  3. Listen to the users who succeed: The signal was in our happiest users, not our churned ones. Those who completed V1 setup loved the product. That told us what to protect in V2.
  4. First-mover advantage is real: Being first to ship MCP in our space established us as the default choice—but only because V2 made it easy to adopt.
  5. Find the real user: We thought we were building for developers but found product-market fit with SEO professionals. The best insight came from watching who actually used it, not who we designed it for.