docs: add MCP lifecycle pattern and fix GSC/identity docs #173

Merged
marcusquinn merged 2 commits into main from chore/architecture-mcp-patterns on Jan 24, 2026

Conversation

@marcusquinn
Owner

@marcusquinn marcusquinn commented Jan 24, 2026

Summary

  • Document the MCP lifecycle decision framework established during the MCP audit session
  • Fix GSC subagent to reflect its actual usage (MCP primary, curl fallback)
  • Clarify the OpenCode→Claude Code identity rewrite is intentional design

Changes

File Change
.agent/aidevops/architecture.md Add MCP lifecycle pattern section with decision table and three-tier strategy
.agent/seo/google-search-console.md Add curl OAuth2 fallback, update description to MCP-primary
.agent/prompts/build.txt Expand identity rewrite comment to prevent future "fixes"

MCP Lifecycle Pattern (new)

Three-tier strategy:

  1. Globally enabled (always loaded): osgrep, augment-context-engine, claude-code-mcp
  2. Enabled, tools disabled (zero context until agent invokes): gsc, outscraper, etc.
  3. Replaced by curl subagent (removed entirely): hetzner, serper, dataforseo, etc.

Decision factors: tool count, auth complexity, session frequency, context cost, statefulness.
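One way to picture how these factors interact is a small scoring sketch. The thresholds below are invented for illustration and are not taken from the audit:

```python
# Illustrative only: combining the decision factors named above
# (tool count, auth complexity, session frequency, statefulness).
def choose_tier(tool_count: int, auth_complex: bool,
                sessions_per_week: int, stateful: bool) -> int:
    # Tier 1: stateful servers, or ones used in nearly every session,
    # justify paying their context cost globally.
    if stateful or sessions_per_week >= 20:
        return 1
    # Tier 2: many tools or awkward auth -> keep the server configured,
    # but gate its tools off until an agent opts in.
    if tool_count > 3 or auth_complex:
        return 2
    # Tier 3: small, stateless, simple-auth APIs are cheaper as curl subagents.
    return 3
```

In practice the table in architecture.md is the source of truth; this sketch only shows why a heavy, frequently used server lands in tier 1 while a two-endpoint REST API lands in tier 3.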

Summary by CodeRabbit

  • Documentation

    • Expanded Google Search Console docs with MCP-first workflow, detailed direct API (curl) fallback, OAuth2 guidance, Playwright automation, troubleshooting, and richer SEO analytics examples.
    • Added architecture guidance describing tiered MCP lifecycle, tool gating patterns, and example configuration patterns.
    • Clarified agent identity messaging to support cross-platform prompt usage.
  • Chores

    • Setup flow now includes a cleanup step for deprecated MCP entries and auto-detection to enable SEO/Analytics integrations when credentials are present.


…ewrite

- Add MCP lifecycle decision framework to architecture.md (when MCP vs curl)
- Document three-tier MCP strategy (global, per-agent, replaced)
- Fix stale SEO agent example in architecture.md
- Add curl OAuth2 fallback to GSC subagent (JWT token exchange pattern)
- Update GSC description to reflect MCP-primary with curl fallback
- Clarify OpenCode→Claude Code identity rewrite is intentional design
@gemini-code-assist

Summary of Changes

Hello @marcusquinn, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly enhances the project's documentation and clarifies core architectural decisions. It introduces a comprehensive lifecycle pattern for managing Model Context Protocol (MCP) tools, providing a clear framework for their usage and migration. Additionally, it refines the documentation for the Google Search Console (GSC) subagent, detailing its primary MCP integration and offering robust curl-based fallback instructions. Finally, it addresses potential confusion by explicitly documenting an intentional identity rewrite within prompts, ensuring consistency and preventing unintended modifications.

Highlights

  • MCP Lifecycle Pattern Documentation: Introduced a new section in the architecture documentation outlining a decision framework and a three-tier strategy for managing MCP (Model Context Protocol) tools, including when to use MCP vs. curl subagents.
  • Google Search Console (GSC) Subagent Update: Clarified that GSC primarily uses MCP tools with a curl-based OAuth2 fallback, updating its documentation to reflect this and providing detailed curl examples for direct API access.
  • Identity Rewrite Clarification: Added an explicit comment in the build.txt prompt to emphasize that the "OpenCode" to "Claude Code" identity rewrite is an intentional design feature, preventing future attempts to "fix" it.

@coderabbitai

coderabbitai bot commented Jan 24, 2026

Caution

Review failed

The pull request is closed.

Note

Other AI code review bot(s) detected

CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

Walkthrough

Documentation and scripts add a formal MCP lifecycle pattern and tool-gating model, update agent tool enablement in opencode.json guidance, make Google Search Console MCP-primary with a curl fallback and Playwright onboarding, add a cleanup routine and auto-enable logic in setup.sh, and clarify prompt identity rewriting for runtime auth.

Changes

Cohort / File(s) Summary
Architecture & Tool Gating
.agent/aidevops/architecture.md
Repointed Multi-Layer Action Space to opencode.json as authoritative; added GLOBAL_TOOLS mapping; revised AGENT_TOOLS entries (Build+, SEO); added MCP Lifecycle Pattern with decision framework, three-tier deployment strategy, migration criteria, and opencode.json examples.
Agent Prompt Identity
.agent/prompts/build.txt
Added an INTENTIONAL DESIGN note stating the OpenCode identity is rewritten to Claude Code at runtime by the opencode-anthropic-auth plugin; prompt text preserved.
SEO Docs & GSC Flow
.agent/seo/google-search-console.md
Converted GSC guidance to MCP-primary (gsc_* tools) with detailed curl/OAuth2 Direct API fallback; added token-exchange/curl examples, Playwright automated bulk setup script, expanded use-case examples, troubleshooting and MCP config notes.
Setup Script & MCP Cleanup
setup.sh
Added cleanup_deprecated_mcps() to remove deprecated MCP entries and tool refs (uses jq, makes backups); integrated cleanup into main flow; migrated SEO setup to curl-based subagents by default and added auto-detection/auto-enable logic for Google Analytics MCP when GSC creds are present; updated prompts/messages.
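The cleanup step can be sketched roughly as follows. setup.sh itself uses jq and bash; the server names mirror the tier-3 list, but the function shape and key layout are assumptions for illustration:

```python
# Sketch of cleanup_deprecated_mcps: back up opencode.json, then drop
# deprecated MCP server entries and their tool-gating references.
# The config structure shown here is an assumption, not the exact schema.
import json
import shutil
from pathlib import Path

DEPRECATED = {"hetzner", "serper", "dataforseo", "ahrefs", "hostinger"}

def cleanup_deprecated_mcps(config_path: Path) -> None:
    # Back up before mutating, as the script does.
    shutil.copy(config_path, config_path.with_name(config_path.name + ".bak"))
    config = json.loads(config_path.read_text())
    # Remove the deprecated server entries entirely (tier 3).
    for name in DEPRECATED:
        config.get("mcp", {}).pop(name, None)
    # Drop stale tool refs like "serper_*" from the tools map.
    config["tools"] = {k: v for k, v in config.get("tools", {}).items()
                      if k.split("_")[0] not in DEPRECATED}
    config_path.write_text(json.dumps(config, indent=2))
```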

Sequence Diagram(s)

sequenceDiagram
  participant Agent as User Agent
  participant Config as opencode.json
  participant Decision as MCP Lifecycle Decision
  participant MCP as MCP Server (gsc_*)
  participant Curl as Curl Subagent
  participant API as External API (Google Search Console)

  Agent->>Config: read tool/tier configuration
  Agent->>Decision: evaluate MCP vs curl (tool count, auth, sessions, state)
  Decision-->>MCP: select MCP path (tier 1/2)
  MCP->>API: proxy requests (gsc_* tools)
  Decision-->>Curl: select curl subagent (tier 2/3)
  Curl->>API: direct OAuth2 / curl requests
  API-->>Agent: return data

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Poem

🧭 JSON gates and MCP tiers align,

curl waits patient, saving state and time,
OpenCode becomes Claude at runtime's call,
cleanup clears the old, enabling all,
a tidy Rails of tools — automation sublime. ✨

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
Check name Status Explanation
Description Check ✅ Passed Check skipped - CodeRabbit’s high-level summary is enabled.
Title check ✅ Passed The title accurately captures the primary changes: documenting an MCP lifecycle pattern and updating GSC/identity documentation.
Docstring Coverage ✅ Passed No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.


@github-actions

🔍 Code Quality Report

[MONITOR] Code Review Monitoring Report

[INFO] Latest Quality Status:
SonarCloud: 0 bugs, 0 vulnerabilities, 406 code smells

[INFO] Recent monitoring activity:
Sat Jan 24 15:28:02 UTC 2026: Code review monitoring started
Sat Jan 24 15:28:02 UTC 2026: SonarCloud - Bugs: 0, Vulnerabilities: 0, Code Smells: 406
Sat Jan 24 15:28:02 UTC 2026: Qlty - 0 issues found, auto-formatting applied
Sat Jan 24 15:28:04 UTC 2026: Codacy analysis completed with auto-fixes

📈 Current Quality Metrics

  • BUGS: 0
  • CODE SMELLS: 406
  • VULNERABILITIES: 0

Generated on: Sat Jan 24 15:29:22 UTC 2026


Generated by AI DevOps Framework Code Review Monitoring


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces valuable documentation for the MCP lifecycle pattern and correctly updates the Google Search Console documentation to reflect its new MCP-primary, curl-fallback architecture. The clarification regarding the OpenCode vs Claude Code identity is also a welcome improvement to prevent future confusion. The overall changes are good, but I've identified a couple of critical issues in the new curl fallback examples for GSC. The provided shell commands will fail as written due to incorrect handling of home directory paths. I've included specific suggestions to fix these bugs.

# Test GSC MCP connection
opencode mcp list | grep -i search
# Verify credentials file
python3 -c "import json; d=json.load(open('$HOME/.config/aidevops/gsc-credentials.json')); print(f'Service account: {d[\"client_email\"]}')"


high

This command will fail because the shell does not expand the $HOME variable inside single quotes. Python will then attempt to open a file at the literal path $HOME/..., which doesn't exist. Using os.path.expanduser('~/...') is a more robust way to resolve the home directory path.

Suggested change
- python3 -c "import json; d=json.load(open('$HOME/.config/aidevops/gsc-credentials.json')); print(f'Service account: {d[\"client_email\"]}')"
+ python3 -c "import json, os; d=json.load(open(os.path.expanduser('~/.config/aidevops/gsc-credentials.json'))); print(f'Service account: {d[\"client_email\"]}')"

Comment on lines +62 to +69
import json, time, jwt, requests
creds = json.load(open('$HOME/.config/aidevops/gsc-credentials.json'))
now = int(time.time())
payload = {'iss': creds['client_email'], 'scope': 'https://www.googleapis.com/auth/webmasters.readonly',
'aud': creds['token_uri'], 'iat': now, 'exp': now + 3600}
signed = jwt.encode(payload, creds['private_key'], algorithm='RS256')
r = requests.post(creds['token_uri'], data={'grant_type': 'urn:ietf:params:oauth:grant-type:jwt-bearer', 'assertion': signed})
print(r.json()['access_token'])


high

This Python script has a few issues that will prevent it from running correctly:

  1. Path Expansion: The shell won't expand $HOME inside the single-quoted path string on line 63, causing the script to fail when trying to open the credentials file. This can be fixed by using os.path.expanduser('~/.config/...').
  2. File Handling: The file is opened without a with statement, meaning it might not be closed properly if an error occurs. It's best practice to use with open(...).
  3. Error Handling: The requests.post call on line 68 does not check for HTTP errors. If the request fails, the script will raise an unhandled exception when r.json() is called. It's better to add r.raise_for_status() to handle potential network or authentication issues gracefully.
Suggested change
- import json, time, jwt, requests
- creds = json.load(open('$HOME/.config/aidevops/gsc-credentials.json'))
- now = int(time.time())
- payload = {'iss': creds['client_email'], 'scope': 'https://www.googleapis.com/auth/webmasters.readonly',
-            'aud': creds['token_uri'], 'iat': now, 'exp': now + 3600}
- signed = jwt.encode(payload, creds['private_key'], algorithm='RS256')
- r = requests.post(creds['token_uri'], data={'grant_type': 'urn:ietf:params:oauth:grant-type:jwt-bearer', 'assertion': signed})
- print(r.json()['access_token'])
+ import json, time, jwt, requests, os
+ with open(os.path.expanduser('~/.config/aidevops/gsc-credentials.json')) as f:
+     creds = json.load(f)
+ now = int(time.time())
+ payload = {'iss': creds['client_email'], 'scope': 'https://www.googleapis.com/auth/webmasters.readonly',
+            'aud': creds['token_uri'], 'iat': now, 'exp': now + 3600}
+ signed = jwt.encode(payload, creds['private_key'], algorithm='RS256')
+ r = requests.post(creds['token_uri'], data={'grant_type': 'urn:ietf:params:oauth:grant-type:jwt-bearer', 'assertion': signed})
+ r.raise_for_status()
+ print(r.json()['access_token'])
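For context, once the fallback script prints a token, a Search Analytics request can be assembled against the API. The token, property URL, and dates below are placeholders, not values from the PR:

```python
# Hypothetical follow-on to the token-exchange script above.
import json
from urllib.parse import quote

access_token = "ya29.example-token"  # placeholder for the printed token
# URL-prefix properties are identified including the trailing slash.
site = quote("https://example.com/", safe="")
url = f"https://searchconsole.googleapis.com/v1/sites/{site}/searchAnalytics/query"
headers = {"Authorization": f"Bearer {access_token}",
           "Content-Type": "application/json"}
body = json.dumps({"startDate": "2026-01-01", "endDate": "2026-01-23",
                   "dimensions": ["query"], "rowLimit": 25})
# requests.post(url, headers=headers, data=body) would return the report
```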


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🤖 Fix all issues with AI agents
In @.agent/aidevops/architecture.md:
- Around line 116-118: Fix the typo "playwriter" to "playwright" in the
architecture documentation: update the list entry currently reading "playwriter"
to "playwright" (and search for any other occurrences of the string "playwriter"
in the repo to replace them) so tool naming and discovery match the actual
Playwright subagent.

In @.agent/seo/google-search-console.md:
- Around line 19-24: The "Capabilities" entry claiming "indexing requests" is
misleading because the example labeled "Submit URL for indexing" uses the URL
Inspection API (`urlInspection/index:inspect`) which only inspects status and
the doc currently references the `webmasters.readonly` scope; either remove
"indexing requests" from the Capabilities list and rename the example to
"Inspect URL index status" (update the example title and text to reflect
`urlInspection/index:inspect`), or add a new section describing the actual
Indexing API (`indexing.googleapis.com/v3/urlNotifications:publish`) including
required `auth/indexing` scope and the content restrictions
(JobPosting/BroadcastEvent) so readers know submission requires a separate API
and scope.

Comment on lines +116 to +118
1. **Globally enabled** (always loaded, ~2K tokens each): osgrep, augment-context-engine, claude-code-mcp
2. **Enabled, tools disabled** (zero context until agent invokes): gsc, outscraper, google-analytics-mcp, quickfile, amazon-order-history, context7, repomix, playwriter, chrome-devtools, etc.
3. **Replaced by curl subagent** (removed entirely): hetzner, serper, dataforseo, ahrefs, hostinger


⚠️ Potential issue | 🟡 Minor

Typo: “playwriter” → “playwright”.
This will confuse tool naming and discovery.

🔧 Suggested fix
-2. **Enabled, tools disabled** (zero context until agent invokes): gsc, outscraper, google-analytics-mcp, quickfile, amazon-order-history, context7, repomix, playwriter, chrome-devtools, etc.
+2. **Enabled, tools disabled** (zero context until agent invokes): gsc, outscraper, google-analytics-mcp, quickfile, amazon-order-history, context7, repomix, playwright, chrome-devtools, etc.

Comment on lines +19 to 24
- **Primary access**: MCP tools (`gsc_*`) - enabled for SEO agent
- **Fallback**: curl with OAuth2 token from service account
- **API**: REST at `https://searchconsole.googleapis.com/v1/`
- **Auth**: Service account JSON at `~/.config/aidevops/gsc-credentials.json`
- **Capabilities**: Search analytics, URL inspection, indexing requests, sitemap management
- **Metrics**: clicks, impressions, ctr, position


⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

# First, let's view the file to understand the full context
cat -n .agent/seo/google-search-console.md | head -100

Repository: marcusquinn/aidevops

Length of output: 4240


🌐 Web query:

Google Search Console API URL Inspection endpoint scope requirements webmasters.readonly

💡 Result:

The Search Console API URL Inspection endpoint (urlInspection.index.inspect, POST https://searchconsole.googleapis.com/v1/urlInspection/index:inspect) accepts either of these OAuth scopes:

  • https://www.googleapis.com/auth/webmasters.readonly (read-only)
  • https://www.googleapis.com/auth/webmasters (read/write)

So webmasters.readonly is sufficient for calling the URL Inspection endpoint. [1]


🌐 Web query:

Google Indexing API vs Search Console API URL submission scope requirements

💡 Result:

Google Indexing API (URL submission)

  • What it can submit: Individual URL notifications (URL_UPDATED, URL_DELETED) to trigger recrawling/removal. (developers.google.com)
  • Allowed content scope (not OAuth scope): Only URLs whose pages contain structured data for JobPosting or BroadcastEvent embedded in a VideoObject. (developers.google.com)
  • OAuth scope required: https://www.googleapis.com/auth/indexing. (developers.google.com)
  • Property/permission requirement: You must verify the site in Search Console and add the service account as a delegated owner of that Search Console property. (developers.google.com)

Google Search Console API (no “request indexing” URL submission)

  • What it can do instead:
    • URL Inspection API can inspect a URL’s index status, not submit it for indexing. (developers.google.com)
    • Can manage/list sites and submit sitemaps (discovery), but this is not the same as submitting a single URL for “Request indexing”. (developers.google.com)
  • OAuth scopes used (depends on method):
  • URL/property requirement (URL Inspection): The inspected URL must be under the Search Console property provided in siteUrl. (developers.google.com)

Net: If you need programmatic per-URL submission, Google only provides that via the Indexing API, and only for JobPosting/BroadcastEvent pages, using the auth/indexing scope plus Search Console ownership/delegation. (developers.google.com)


Remove misleading "indexing requests" or clarify it requires a separate API setup.

The example at lines 87–91 is labeled "Submit URL for indexing" but uses the URL Inspection API (urlInspection/index:inspect), which only inspects index status—it does not submit URLs for indexing. Actual URL submission requires the Indexing API (indexing.googleapis.com/v3/urlNotifications:publish) with the separate auth/indexing scope and is restricted to JobPosting or BroadcastEvent content.

The doc's "Capabilities" line lists "indexing requests" and "sitemap management" as supported, but the example and scope (webmasters.readonly) don't support URL submission. Either:

  • Remove "indexing requests" from the capabilities list, or
  • Add a separate section with the correct Indexing API endpoint, scope, and content restrictions.

@augmentcode

augmentcode bot commented Jan 24, 2026

🤖 Augment PR Summary

Summary: Documents the MCP lifecycle decision framework and reconciles agent docs with current MCP usage.

Architecture: Adds a three-tier MCP strategy (always loaded vs enabled/tools-hidden vs replaced by curl) plus a decision table in .agent/aidevops/architecture.md.

SEO: Updates the Google Search Console subagent to be MCP-first (gsc_*) with a service-account OAuth2 curl fallback.

Prompts: Expands the Build prompt note explaining the intentional OpenCode→Claude Code identity rewrite performed at runtime.

Notes: Highlights minimizing context/token overhead by keeping rarely used MCP tools hidden until explicitly enabled per agent.

Impact: Documentation/prompt updates only; no runtime code changes in this PR.



@augmentcode augmentcode bot left a comment


Review completed. 5 suggestions posted.



```python
# Tools disabled globally, enabled per-agent
GLOBAL_TOOLS = {"gsc_*": False, "outscraper_*": False, "osgrep_*": True, ...}
```


The GLOBAL_TOOLS example shows osgrep_* enabled globally, but generate-opencode-agents.sh currently writes config.tools.osgrep_* = false and relies on per-agent enablement. This mismatch could cause readers to configure opencode.json in a way that doesn’t match the generator’s behavior.



- "Build+": {"write": True, "context7_*": True, "osgrep_*": True, ...},
- "SEO": {"gsc_*": True, "dataforseo_*": True, "serper_*": True, ...},
+ "Build+": {"write": True, "context7_*": True, "repomix_*": True, ...},
+ "SEO": {"gsc_*": True, "google-analytics-mcp_*": True, ...},


This snippet implies the SEO agent enables google-analytics-mcp_* directly, but services/analytics/google-analytics.md describes GA tools as being enabled via the @google-analytics subagent. If the subagent-only pattern is the intended design, the example here may be misleading.




1. **Globally enabled** (always loaded, ~2K tokens each): osgrep, augment-context-engine, claude-code-mcp
2. **Enabled, tools disabled** (zero context until agent invokes): gsc, outscraper, google-analytics-mcp, quickfile, amazon-order-history, context7, repomix, playwriter, chrome-devtools, etc.
3. **Replaced by curl subagent** (removed entirely): hetzner, serper, dataforseo, ahrefs, hostinger


Tier 3 states serper, dataforseo, ahrefs, etc are “removed entirely”, but the repo still configures these MCP servers (e.g., in generate-opencode-agents.sh) and references them in .agent/seo.md. If this table is meant as a future target, it might help to call that out so readers don’t assume the current config already matches.



-H "Authorization: Bearer $ACCESS_TOKEN"

# Search analytics query
curl -s -X POST "https://searchconsole.googleapis.com/v1/sites/https%3A%2F%2Fexample.com/searchAnalytics/query" \


The sites/{siteUrl} examples omit the trailing / that URL-prefix properties require (and thus the %2F at the end of the URL-encoded path), so copy/paste may fail against real properties. Same issue applies to the siteUrl value in the URL Inspection request body.

Other Locations
  • .agent/seo/google-search-console.md:91


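To make the trailing-slash point concrete, here is how the two encodings differ (example.com is a placeholder property):

```python
# URL-prefix properties are identified including the trailing slash,
# so the encoded siteUrl path segment must end in %2F.
from urllib.parse import quote

with_slash = quote("https://example.com/", safe="")
without_slash = quote("https://example.com", safe="")
print(with_slash)     # https%3A%2F%2Fexample.com%2F
print(without_slash)  # https%3A%2F%2Fexample.com
```

Only the first form matches a URL-prefix property; the second may return a 404 or permission error for real properties.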

"rowLimit": 25
}'

# Submit URL for indexing


The comment says “Submit URL for indexing”, but urlInspection/index:inspect only performs an inspection and does not request indexing. This could lead users to think the call triggers indexing when it won’t.



- setup_google_analytics_mcp() auto-detects shared GSC credentials
- Add cleanup_deprecated_mcps step to main() flow
- Update SEO confirm_step text (curl subagents, not MCP servers)
- Update final output to show GSC/GA MCPs and SEO curl subagents
@marcusquinn marcusquinn merged commit e54b59f into main Jan 24, 2026
5 of 8 checks passed
@sonarqubecloud

@github-actions

🔍 Code Quality Report

[MONITOR] Code Review Monitoring Report

[INFO] Latest Quality Status:
SonarCloud: 0 bugs, 0 vulnerabilities, 406 code smells

[INFO] Recent monitoring activity:
Sat Jan 24 15:39:06 UTC 2026: Code review monitoring started
Sat Jan 24 15:39:06 UTC 2026: SonarCloud - Bugs: 0, Vulnerabilities: 0, Code Smells: 406
Sat Jan 24 15:39:07 UTC 2026: Qlty - 0 issues found, auto-formatting applied
Sat Jan 24 15:39:09 UTC 2026: Codacy analysis completed with auto-fixes

📈 Current Quality Metrics

  • BUGS: 0
  • CODE SMELLS: 406
  • VULNERABILITIES: 0

Generated on: Sat Jan 24 15:40:17 UTC 2026


Generated by AI DevOps Framework Code Review Monitoring
