Ekamoira Google Search Console MCP - Query Search Console in Claude & ChatGPT
You know the drill: Export CSV from Search Console. Upload to ChatGPT. Watch it hallucinate your data. Cry. Repeat.
We fixed that. We built this for SEOs who just want to ask questions.
Connect your Google account and it's done. Works with Claude, ChatGPT and Cursor instantly.
Just ask:
- "Why did my traffic drop last Tuesday?"
- "Which pages have high impressions but low CTR?"
- "Is Google even indexing my new pages?"
30-day free trial. No terminal commands. Zero credit cards harmed.

Replies
Ekamoira GSC MCP
@christian_gaugeler Congrats on the launch. This sounds cool! Possible to use it with Claude as well?
Ekamoira GSC MCP
@austin_heaton - Thanks! Yes, works great with Claude.
Setup options:
- Claude Desktop / Claude Code: Add to your MCP config (instructions after signup)
- claude.ai: Sign up at ekamoira.com/tools/gsc, connect your Search Console, and you'll get a connector URL to add in Claude settings
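For Claude Desktop, the MCP config entry usually looks something like the sketch below. The server name and URL here are illustrative placeholders, not the real values; the actual connector URL comes from the signup instructions.

```json
{
  "mcpServers": {
    "gsc-mcp": {
      "url": "https://<your-connector-url-from-signup>"
    }
  }
}
```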
Example prompts once connected:
"Show me queries where I'm ranking 4-10 with high impressions - these are my quick wins"
"Compare my search performance this month vs last month. What dropped?"
"Check indexing status for my top 10 pages"
"Which pages have high impressions but CTR below 2%? Suggest title improvements"
Claude can also visualize the data - just ask it to chart trends or compare periods.
@christian_gaugeler Removing setup friction is huge—most SEO pain comes from not being able to ask the right question fast enough.
We’ve run into the same “data is there but hard to use” problem while building infra like GTWY.
Ekamoira GSC MCP
@human_gtwy - Exactly - the data's always been there, it's just buried under 10 clicks and 3 CSV exports. What's GTWY? Curious what you're building.
@christian_gaugeler GTWY is an AI agent layer that sits on top of tools like GSC, analytics, docs, and databases, helping teams query existing data without digging through dashboards, filters, or CSV exports.
The problem we’re focused on is exactly what you mentioned: the data already exists, but getting to the right insight quickly is painful. We’re trying to reduce those “10 clicks + 3 exports” into a single question → actionable answer flow.
Still early, but building with infra-heavy workflows in mind - SEO, product, and ops teams.
Visla
@christian_gaugeler Congrats on the launch! Wish you all the best with it.
Ekamoira GSC MCP
@mogabr thank you! appreciate it
@christian_gaugeler Hi, does the lifetime plan include access to the GSC MCP? Also, how are multiple GSC accounts handled? I have several. Interested, thanks!
Spur.fit
Congratulations to the whole team at Ekamoira. You are solving a great problem!
Ekamoira GSC MCP
@rahul_aluri Thanks, Rahul! Appreciate it a lot!
Ekamoira GSC MCP
@rahul_aluri thank you very much 🙏🏻😊
Spur.fit
Looks amazing! Good luck folks 🚀🚀
Ekamoira GSC MCP
@nikhil_moorjani thank you so much 🙏🏻 Really appreciate your support!
Ekamoira GSC MCP
@zahran_dabbagh - thanks man! let us know your feedback once you have had a chance to use it!
Congratulations on the launch 🎉
Ekamoira GSC MCP
@shubham_pratap Thanks a lot! appreciate the support
Love this! Immediate answers from Search Console data without CSV chaos, exactly what busy SEOs need.
Ekamoira GSC MCP
@leotrim_lota Appreciate it - exactly what we built it for.
Congrats on launch — love how this makes GSC insights actually fast and usable.
Ekamoira GSC MCP
@zeiki_yu thanks! appreciate it. Looking forward to feedback once you have had a chance to use it!
Ekamoira GSC MCP
@curiouskitty
Insights from clients' use cases:
First questions every morning:
- "Which pages are position 1-10 but getting 0 clicks?" (we found 28 of these last week)
- "Are there any www vs non-www duplicates splitting my impressions?"
- "Did the meta title change from Tuesday actually improve CTR?"
What actually gets copied into workflow:
The output goes straight into a progress doc. Example: we had a page with 1,879 impressions at 0.05% CTR. Asked Claude to rewrite the title, tracked it, checked back 4 days later. Position held, CTR moved.
How it becomes daily vs one-off:
It's the tracking. We run the same queries weekly: "compare this week vs last week for these 5 pages." Claude remembers what you changed and when, so you're not starting fresh.
The stickiness comes from follow-up, not the initial report. "That page I fixed Tuesday - what happened?" is more useful than "give me all my data."
I run an agency. Can I manage multiple properties at once? We manage like 50 sites and GSC is painful.
Ekamoira GSC MCP
@nandini_choudhury1 - Yes! You can select which property to use and switch between them easily from the settings. Each time you connect, you pick the property you want to work with.
This is a brilliant use of MCP! The Model Context Protocol is transforming how we connect LLMs to real data sources, and GSC data is notoriously painful to work with.
@christian_gaugeler Your example prompts are spot-on—"show me quick wins where I'm ranking 4-10" is exactly the kind of insight that gets buried in CSVs. Being able to ask Claude to chart trends and compare periods in natural language is a huge productivity unlock.
Question: How do you handle the Google Search Console API rate limits? For larger sites with lots of pages/queries, does the MCP implement any caching or batching strategies to avoid hitting quotas during longer analysis sessions?
Ekamoira GSC MCP
@new_user___01920256530b9f092c9e057:
Great question! Here's how the GSC MCP handles API quotas:
Built-in Safeguards:
1. Row limits - Analytics queries default to 100-1000 rows with a max of 25,000 per request (the GSC API limit). This keeps individual requests fast and within quotas.
2. Batch inspection limits - URL inspection is capped at 10 URLs per batch to avoid hitting the URL Inspection API's daily quota (~2,000 requests/day/property).
3. Graceful error handling - Returns helpful messages for HTTP 429 (rate limit) and 503 (service unavailable) responses.
GSC API Quotas (good news):
- Search Analytics API has generous daily quotas (not strict per-second rate limits)
- Most users won't hit limits during normal analysis sessions
- The main constraint is URL Inspection (~2000/day/property)
What's NOT implemented:
- No client-side caching (queries always return fresh data via dataState: 'all')
- No automatic request batching beyond URL inspection
For larger sites, we recommend:
- Use the days parameter instead of querying daily breakdowns
- Leverage filters to narrow result sets
- Spread URL inspection over multiple sessions if auditing 1000s of URLs
Happy to discuss adding caching if there's demand!
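To illustrate the safeguards above, here's a minimal sketch of a defensive query wrapper: it caps the row limit at the API maximum and retries 429/503 responses with exponential backoff. The `fetch` callable and `GSCError` class are hypothetical stand-ins for the real API client, not the actual MCP implementation.

```python
import time

MAX_ROWS = 25_000          # GSC Search Analytics hard cap per request
RETRYABLE = {429, 503}     # rate limit / service unavailable

class GSCError(Exception):
    """Hypothetical API error carrying an HTTP status code."""
    def __init__(self, status):
        super().__init__(f"HTTP {status}")
        self.status = status

def query_with_backoff(fetch, params, retries=3, sleep=time.sleep):
    """Cap row_limit at the API maximum, then call `fetch` (any callable
    that performs the actual request), retrying transient errors with
    exponential backoff: 1s, 2s, 4s between attempts."""
    params = {**params, "row_limit": min(params.get("row_limit", 1000), MAX_ROWS)}
    for attempt in range(retries + 1):
        try:
            return fetch(params)
        except GSCError as e:
            if e.status not in RETRYABLE or attempt == retries:
                raise  # non-retryable error, or out of attempts
            sleep(2 ** attempt)
```

The `sleep` parameter is injected so the backoff can be tested without real delays; in production you'd leave the default.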
@soumyadeep_ux This is exactly the kind of detailed implementation info I was looking for! Really appreciate the thorough breakdown.
The built-in safeguards are well thought out - capping URL inspection at 10 per batch to stay well under the ~2000/day quota is smart defensive programming. And graceful 429/503 handling is critical for production use.
The "no client-side caching" decision makes sense for freshness, but for enterprise users analyzing the same properties daily, an optional cache layer with TTL could be valuable. Even a simple 1-hour cache would dramatically reduce API calls for repetitive queries.
Love the recommendations for larger sites - spreading URL inspection across sessions is practical advice. This is a really solid MCP implementation!
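The optional TTL cache suggested above could be as simple as this sketch. It is not part of the actual MCP, just an illustration of the idea: serve repeated queries from memory while they are fresher than the TTL, and only hit the API again after expiry.

```python
import time

class TTLCache:
    """Simple per-key cache with a time-to-live. The `clock` parameter
    is injectable so expiry can be tested without waiting."""
    def __init__(self, ttl_seconds=3600, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock
        self._store = {}   # key -> (timestamp, value)

    def get_or_fetch(self, key, fetch):
        now = self.clock()
        hit = self._store.get(key)
        if hit and now - hit[0] < self.ttl:
            return hit[1]              # fresh enough: skip the API call
        value = fetch()                # expired or missing: refetch
        self._store[key] = (now, value)
        return value
```

Freshness-sensitive queries (e.g. "did today's change land?") could simply bypass the cache, keeping the always-fresh default for everyone else.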