Original file line number Diff line number Diff line change
@@ -1,6 +1,6 @@
---
title: MCP Server
description: Connect AI assistants to Injective documentation using the Model Context Protocol
title: Documentation MCP Server
description: Connect AI assistants to Injective documentation using the Model Context Protocol (MCP)
---

The Injective documentation provides a Model Context Protocol (MCP) server that allows AI assistants like
80 changes: 80 additions & 0 deletions .gitbook/developers-ai/index.mdx
@@ -0,0 +1,80 @@
---
description: >-
Build on Injective with generative AI software engineering.
title: AI developers
---

## Why build using generative AI tools on Injective?

With generative AI coding tools, you can build applications very quickly, including on Injective.
However, building quickly in the wrong direction is not ideal.
Here you will find skills, agents, workflows, and MCP servers that will help you practise effective AI software engineering.

## What types of generative AI tools are available?

- **LLMs** -
Large language models (LLMs) are the base-layer technology powering almost all generative AI software engineering.
Almost all AI development tools are wrappers around LLMs.
Popular ones include Claude Opus (Anthropic), Gemini (Google), and Kimi (Moonshot AI).
- **LLM Providers** -
Low- and mid-tier LLMs can run on consumer hardware.
However, top-tier LLMs must be accessed remotely.
You have three main options:
- **Local providers** - For example, using LM Studio or Ollama.
- **Remote providers from model developers** - For example, accessing Claude Opus via an Anthropic subscription.
- **Remote providers from model aggregators** - For example, accessing Claude Opus, Gemini, or Kimi via an OpenRouter subscription.
- **Tools** -
These can be anything from functions, to scripts, to command-line interfaces (CLIs), packaged in a manner
that makes them understandable and callable by LLMs.
For example, if you want the LLM to access real-time information,
i.e. information that was not available when the LLM was trained,
you need to give it access to tools for web searches or other data APIs.
- **MCP** -
The Model Context Protocol (MCP) is a protocol designed for the discovery and invocation of tools by LLMs.
It standardises the way different LLMs and LLM providers invoke tools;
previously, each LLM or LLM provider had its own competing standard or protocol for doing so.
- **Skills, workflows, agents** -
These are markdown files that optionally reference supporting resources, tools, MCP servers and others.
They are designed specifically to work with AI engineering harnesses (but can be used in other contexts).
They can be recursive, for example a skill can reference other skills.
Likewise, workflows are usually a set of skills with a defined order,
and agents are sets of workflows and skills.
Note that the term "agents" is overused, with multiple definitions, so the above does not apply in other contexts.
- **AI engineering IDEs** -
These are either dedicated IDEs or plugins within IDEs that allow you to prompt LLMs,
including execution of tool calls or MCPs,
and use their output to work on the code base that is open within the IDE.
Popular ones include: Roo, Cline, and Cursor.
- **AI engineering Harnesses** -
These are command line interfaces (CLIs) or terminal user interfaces (TUIs) that are designed around
invoking LLMs for coding tasks.
They operate directly on the file system, and often come with built-in optimisations and utilities for engineering tasks.
These tend to be more powerful than working with AI engineering IDEs,
as they work best when skills, workflows, and agents are used.
Popular ones include: Claude Code (Anthropic), Codex (OpenAI), and OpenCode (unaffiliated).
- **AI engineering Orchestrators** -
These are tools that act as wrappers around harnesses.
Their main intent is to enable long-running loops or parallelisation of harness invocations,
so that LLMs can work autonomously on longer and more complex tasks
without the need for constant human supervision.
Popular ones include: Ralph, GSD.
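
As an illustration of what MCP standardises: tool discovery and invocation are carried as JSON-RPC 2.0 messages with the methods `tools/list` and `tools/call`. A minimal sketch of the two message shapes (the tool name `web_search` and its arguments are purely hypothetical, not a real server's tool):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list"
}
```

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "web_search",
    "arguments": { "query": "Injective latest block height" }
  }
}
```

The first message asks an MCP server which tools it exposes; the second invokes one of the discovered tools by name. Any MCP-compatible client and server can interoperate using these shapes, which is what removes the need for per-provider tool protocols.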

## Which types of generative AI tools will work when developing on Injective?

LLMs, LLM Providers, Tools, MCP, Skills, Workflows, Agents, AI engineering IDEs, AI engineering Harnesses, AI engineering Orchestrators.
In short, all of them!
In fact, that is the point of this **AI Developer** section!

## MCP Servers

### Injective Documentation MCP

Use the [Injective documentation MCP server](/developers-ai/documentation-mcp)
to obtain up-to-date information from this documentation site.
If you ask any LLM about Injective,
it will likely give you out-of-date information, as it does with any fast-moving technology.
This is because its knowledge comes from its training data,
which is by definition "frozen in time".
By adding this MCP to your AI development workflow,
not only will you get up-to-date information about developing on Injective,
you will also have citations for that information in the form of URLs from this documentation site.
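
As a sketch, many MCP-capable clients register remote servers through a JSON configuration along these lines; the server name and URL below are placeholders, not the real endpoint, so consult the documentation MCP page above for the actual connection details:

```json
{
  "mcpServers": {
    "injective-docs": {
      "url": "https://example.com/mcp"
    }
  }
}
```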
2 changes: 1 addition & 1 deletion .gitbook/developers/index.mdx
@@ -2,7 +2,7 @@
description: >-
The goal of this section is to help developers build their projects on
Injective
title: Overview
title: Developers
---

Injective is the only blockchain specifically designed for cross-chain trading, derivatives, DeFi, and Web3 applications. Positioned to become the premier global destination for DeFi ecosystem builders, Injective offers a multitude of advantages for developers, empowering them to build more powerful applications in less time.
21 changes: 15 additions & 6 deletions .gitbook/docs.json
@@ -107,12 +107,6 @@
"developers/index",
"developers/convert-addresses",
"developers/network-information",
{
"group": "AI development",
"pages": [
"developers/ai/mcp"
]
},
{
"group": "injectived",
"pages": [
@@ -171,6 +165,13 @@
"redirects/https_api_injective_exchange"
]
},
{
"group": "AI Developers",
"pages": [
"developers-ai/index",
"developers-ai/documentation-mcp"
]
},
{
"group": "Infrastructure",
"icon": "network-wired",
@@ -1220,6 +1221,14 @@
"eyebrows": "breadcrumbs"
},
"redirects": [
{
"source": "/developers/ai/",
"destination": "/developers-ai/"
},
{
"source": "/developers/ai/mcp",
"destination": "/developers-ai/documentation-mcp"
},
{
"source": "/defi/community-burn",
"destination": "/defi/community-buyback"