2 changes: 1 addition & 1 deletion data/onPostBuild/llmstxt.ts
@@ -13,7 +13,7 @@ const LLMS_TXT_PREAMBLE = `# Ably Documentation

- **Global Edge Network**: Ultra-low latency realtime messaging delivered through a globally distributed edge network
- **Enterprise Scale**: Built to handle millions of concurrent connections with guaranteed message delivery
- **Multiple Products**: Pub/Sub, Chat, LiveSync, LiveObjects and Spaces
- **Multiple Products**: Pub/Sub, AI Transport, Chat, LiveSync, LiveObjects and Spaces
- **Developer-Friendly SDKs**: SDKs available for JavaScript, Node.js, Java, Python, Go, Objective-C, Swift, C#, PHP, Flutter, Ruby, React, React Native, and Kotlin

`;
4 changes: 4 additions & 0 deletions src/pages/docs/channels/index.mdx
@@ -230,3 +230,7 @@ Ably does not support channel groups, a concept used by some other providers whe
* Channel namespaces already provide grouping functionality for configuration purposes.

Instead of channel groups, simply subscribe to the specific channels your client needs access to. The efficient multiplexing ensures optimal performance regardless of the number of channels.

<Aside data-type='further-reading'>
If you are building AI-powered applications that stream LLM responses to users, [AI Transport](/docs/ai-transport) provides purpose-built features for token streaming, session management, and human-in-the-loop workflows on top of Ably channels.
</Aside>
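The aside above points readers at AI Transport for token streaming over channels. Independent of AI Transport's actual API, the publisher-side pattern it describes can be sketched as follows. This is an illustrative sketch only: `publish` stands in for a channel publish call (such as ably-js `channel.publish`), and the event names and payload shape (`seq`, `token`) are assumptions for the example, not the AI Transport wire format.

```typescript
// Sketch: stream LLM tokens over a pub/sub channel, one publish per token.
// The `Publish` signature mimics a channel publish call; the message shape
// ({ seq, token }) is hypothetical, not a documented wire format.
type Publish = (name: string, data: unknown) => Promise<void>;

async function streamTokensToChannel(
  tokens: AsyncIterable<string>,
  publish: Publish,
): Promise<number> {
  let seq = 0;
  for await (const token of tokens) {
    // Tag each token with a sequence number so subscribers can detect
    // where they are in the stream and resume after a disconnection.
    await publish("token", { seq, token });
    seq++;
  }
  await publish("done", { seq }); // signal the end of the response
  return seq; // number of tokens published
}
```

A subscriber would concatenate `token` events in `seq` order and stop at `done`; AI Transport's own docs cover the production version of this flow.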
Comment on lines +234 to +236
Contributor


This does look a bit odd on this page, especially as it makes it look like it's related to channel groups when rendered. Can we drop it for now and get the rest in?

Member Author


I really think it's pretty common that users will want to stream tokens over channels, and we should capture the opportunity to direct them. Where do you think there's a better place to share this info?

16 changes: 11 additions & 5 deletions src/pages/docs/platform/ai-llms/index.mdx
@@ -6,6 +6,10 @@ meta_keywords: "Ably LLM, AI documentation, CLAUDE.md, AGENTS.md, Cursor rules,

Ably documentation is designed to be LLM-friendly, making it easy to use AI assistants like Claude, ChatGPT, or Cursor to help you build realtime applications.

<Aside data-type='note'>
If you're building AI-powered applications, see the [AI Transport](/docs/ai-transport) documentation for purpose-built infrastructure covering token streaming, session management, and human-in-the-loop workflows.
</Aside>

## Available resources

Ably provides two key resources optimized for LLM consumption:
@@ -126,11 +130,12 @@ Use **product and feature docs** for:
When a product-specific abstraction exists, **always prefer its documentation over generic Pub/Sub docs**, even if the underlying concepts overlap.

Priority order:
1. **Chat SDK docs** – for chat, rooms, messages, reactions, typing, moderation
2. **Spaces SDK docs** – for collaborative cursors, avatars, presence in UI
3. **LiveObjects docs** – for realtime shared state (maps, counters)
4. **LiveSync docs** – for database-to-client synchronization
5. **Generic Pub/Sub docs** – only when no higher-level abstraction applies
1. **AI Transport docs** – for AI/LLM token streaming, sessions, human-in-the-loop, tool calls, multi-device
2. **Chat SDK docs** – for chat, rooms, messages, reactions, typing, moderation
3. **Spaces SDK docs** – for collaborative cursors, avatars, presence in UI
4. **LiveObjects docs** – for realtime shared state (maps, counters)
5. **LiveSync docs** – for database-to-client synchronization
6. **Generic Pub/Sub docs** – only when no higher-level abstraction applies

Do not rebuild Chat, Spaces, or LiveObjects behavior directly on raw channels unless explicitly requested.

@@ -182,6 +187,7 @@ Use placeholders for secrets:

Explicitly choose the correct interface and explain why:

- **AI Transport**: realtime AI/LLM token streaming, resumable sessions, human-in-the-loop, multi-device continuity
- **Realtime SDK**: realtime pub/sub, presence, collaboration
- **REST SDK**: server-side publishing, token creation, history, stats
- **Chat SDK**: structured chat features and moderation
3 changes: 3 additions & 0 deletions src/pages/docs/platform/ai-llms/llms-txt.mdx
@@ -32,6 +32,9 @@ Core platform documentation including account management, architecture, pricing,
### Pub/Sub
Documentation for Ably's core realtime messaging capabilities: channels, messages, presence, authentication, connections, and protocols.

### AI Transport
Documentation for Ably's AI Transport product covering token streaming, sessions and identity, messaging features such as human-in-the-loop and tool calls, and getting started guides for OpenAI, Anthropic, Vercel AI SDK, and LangGraph.

### Chat
The Ably Chat product documentation covering rooms, messages, reactions, typing indicators, and moderation features.

2 changes: 2 additions & 0 deletions src/pages/docs/platform/architecture/connection-recovery.mdx
@@ -13,6 +13,8 @@ Ably minimizes the impact of these disruptions by providing an effective recover

Applications built with Ably will continue to function normally during disruptions. They will maintain their state, and all messages will be received by the client in the correct order. This is particularly important where message delivery guarantees are crucial, such as in applications where client state is hydrated and maintained incrementally by messages.

Connection recovery is especially important for AI applications, where a network interruption during token streaming can disrupt the user experience. Ably [AI Transport](/docs/ai-transport) builds on this mechanism to enable [resumable token streaming](/docs/ai-transport/token-streaming) from language models, ensuring users can reconnect mid-stream and continue from where they left off.
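The resumability described above boils down to clients tracking the last token they received and replaying anything published after it on reconnect. The sketch below illustrates that pattern in isolation; it is not AI Transport's API. `fetchSince` stands in for a history or rewind call (such as Ably channel history), and the `{ seq, token }` message shape is an assumption for the example.

```typescript
// Sketch: resume a token stream after a dropped connection by replaying
// messages published after the last sequence number we saw. The message
// shape and `fetchSince` signature are hypothetical.
type TokenMessage = { seq: number; token: string };

class ResumableStream {
  private lastSeq = -1;
  private text = "";

  receive(msg: TokenMessage): string {
    if (msg.seq <= this.lastSeq) return this.text; // drop duplicates from a replay
    this.text += msg.token;
    this.lastSeq = msg.seq;
    return this.text;
  }

  // On reconnect, replay everything published after the last token we saw,
  // then continue receiving live messages as normal.
  async resume(
    fetchSince: (seq: number) => Promise<TokenMessage[]>,
  ): Promise<string> {
    for (const msg of await fetchSince(this.lastSeq)) this.receive(msg);
    return this.text;
  }
}
```

Because Ably's connection recovery delivers the backlog in order and deduplication is keyed on the sequence number, the client ends up with the same text it would have received over an unbroken connection.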

Ably achieves a reliable connection recovery mechanism with the following:

* [Connection states](#connection-states)
8 changes: 8 additions & 0 deletions src/pages/docs/platform/index.mdx
@@ -114,3 +114,11 @@ LiveObjects is effective for use cases such as realtime voting and polling syste
Use Ably [LiveSync](/docs/livesync) to synchronize changes between your database and frontend clients. It provides support for PostgreSQL and MongoDB and uses the Ably platform to synchronize your application's data.

LiveSync automatically streams changes you make in your database to clients to keep them in sync with the source of truth in your database.

### Ably AI Transport <a id="ai-transport"/>

Use Ably [AI Transport](/docs/ai-transport) as a drop-in infrastructure layer that upgrades your AI streams into bi-directional, stateful experiences. It provides resumable token streaming, multi-device continuity, human-in-the-loop workflows, and session management that works with any AI model or framework.

AI Transport is built on Ably Pub/Sub, so it inherits the same performance guarantees and scaling characteristics as the rest of the Ably platform.

AI Transport is effective for use cases such as multi-turn conversational AI applications, AI agent coordination, live steering with human takeover, and any scenario where reliable LLM token delivery and session resumability are critical.