
[AIT-273] Message per response guide for Vercel AI SDK#3151

Merged
matt423 merged 2 commits into main from ait-273-vercel-token-per-response-guide
Feb 2, 2026

Conversation

@matt423 (Member) commented Jan 23, 2026

Description

Follow existing message per response guides, add guide for Vercel AI SDK.

Review App

Checklist

Summary by CodeRabbit

  • Documentation
    • Added a new guide for streaming Vercel AI SDK responses using the message-per-response pattern over Ably, including prerequisites, setup instructions, and example implementations for publishers and subscribers.


@matt423 matt423 self-assigned this Jan 23, 2026
@matt423 matt423 added the review-app Create a Heroku review app label Jan 23, 2026

coderabbitai bot commented Jan 23, 2026

Important

Review skipped

Auto reviews are disabled on this repository. Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

You can disable this status message by setting the reviews.review_status to false in the CodeRabbit configuration file.


Walkthrough

This PR adds a new guide document for implementing token streaming with Vercel AI SDK using a message-per-response pattern, including a corresponding navigation entry. The guide covers setup, prerequisites, and code examples demonstrating publisher and subscriber implementations for streaming responses via Ably.
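The message-per-response pattern the guide describes boils down to: on a `'text-start'` event, publish an initial empty Ably message and capture its serial; on each token event, append to that message. A minimal self-contained sketch of that control flow, using a mock channel in place of the real Ably SDK (the `append` method and event shapes here are illustrative stand-ins, not Ably's actual API; the `result.serials[0]` capture mirrors the snippet quoted in the review comment):

```typescript
type Published = { name: string; data: string; serial: string };

// Mock channel standing in for Ably: publish() returns serials,
// append() adds tokens to an already-published message.
class MockChannel {
  messages = new Map<string, Published>();
  private next = 0;

  async publish(msg: { name: string; data: string }): Promise<{ serials: string[] }> {
    const serial = `serial-${this.next++}`;
    this.messages.set(serial, { ...msg, serial });
    return { serials: [serial] };
  }

  async append(serial: string, delta: string): Promise<void> {
    const existing = this.messages.get(serial);
    if (!existing) throw new Error(`unknown serial: ${serial}`);
    existing.data += delta;
  }
}

// Relay AI-SDK-style stream events to the channel. The 'text-start' /
// 'text-delta' event names follow the guide's switch statement; the
// `delta` field name is an assumption for this sketch.
async function relay(
  events: Array<{ type: string; delta?: string }>,
  channel: MockChannel,
): Promise<string | undefined> {
  let msgSerial: string | undefined;
  for (const event of events) {
    switch (event.type) {
      case 'text-start': {
        // Publish an initial empty message and capture its serial
        const result = await channel.publish({ name: 'response', data: '' });
        msgSerial = result.serials[0];
        break;
      }
      case 'text-delta':
        // Append each token to the message created at text-start
        if (msgSerial && event.delta) await channel.append(msgSerial, event.delta);
        break;
    }
  }
  return msgSerial;
}
```

The one piece of state the publisher must carry across events is `msgSerial`; everything else is stateless per event, which is what keeps the pattern cheap for long streams.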

Changes

  • Navigation Configuration (src/data/nav/aitransport.ts): Added a navigation entry for the "Vercel AI SDK token streaming - message per response" guide under Guides → Token streaming.
  • Documentation (src/pages/docs/guides/ai-transport/vercel-message-per-response.mdx): New guide covering the message-per-response token streaming pattern, including prerequisites, setup steps, publisher/subscriber code examples, event type handling, message serial management, and performance considerations.

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~15 minutes

Possibly related PRs

Suggested reviewers

  • GregHolmes
  • mschristensen

Poem

🐰 A message flows, response by response,
Through Vercel's streams, no need to force,
With Ably's aid, we guide the way,
Token by token, sent to stay,
Documentation bright and clear! 📚✨

🚥 Pre-merge checks: 5 of 5 passed

  • Title check: ✅ Passed. The title clearly identifies the main change: adding a message-per-response guide for the Vercel AI SDK, which matches the primary objective of creating a new guide article.
  • Linked Issues check: ✅ Passed. The PR fulfills AIT-273's requirement by adding a comprehensive guide documenting token streaming with the Vercel AI SDK, including setup, implementation examples, and historical retrieval patterns.
  • Out of Scope Changes check: ✅ Passed. All changes are directly related to the requested guide: the navigation entry in aitransport.ts and the new documentation file vercel-message-per-response.mdx are both in scope for AIT-273.
  • Docstring Coverage: ✅ Passed. No functions found in the changed files; docstring coverage check skipped.
  • Description check: ✅ Passed. Check skipped because CodeRabbit's high-level summary is enabled.



@matt423 (Member, Author) commented Jan 23, 2026

@coderabbitai review

@ably-ci ably-ci temporarily deployed to ably-docs-ait-273-verce-q4lzrh January 23, 2026 15:17 Inactive

coderabbitai bot commented Jan 23, 2026

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.

@coderabbitai (bot) left a review comment

Actionable comments posted: 1

🤖 Fix all issues with AI agents
In `@src/pages/docs/guides/ai-transport/vercel-message-per-response.mdx`:
- Around line 40-46: Update the installation command and version note to
reference AI SDK 6.x: replace the install line "npm install ai@^5 ably@^2" with
"npm install ai@^6 ably@^2" (or "ai@6" per style) and change the Aside text that
currently says "This guide uses version 5.x of the AI SDK" to "This guide uses
version 6.x of the AI SDK" (or similar wording), ensuring any other mentions of
"5.x" in this file are updated unless the guide is intentionally for legacy v5
users.
🧹 Nitpick comments (1)
src/pages/docs/guides/ai-transport/vercel-message-per-response.mdx (1)

196-231: Consider adding basic error handling for production readiness.

The publish/append pattern is correct, but the example lacks error handling. While this is a guide focused on the happy path, a brief mention of error handling strategies (e.g., try-catch around the initial publish, handling append failures) would help users build more robust implementations.

💡 Optional: Add minimal error handling

```diff
     case 'text-start':
       // Publish initial empty message when response starts
-      const result = await channel.publish({
-        name: 'response',
-        data: ''
-      });
-
-      // Capture the message serial for appending tokens
-      msgSerial = result.serials[0];
+      try {
+        const result = await channel.publish({
+          name: 'response',
+          data: ''
+        });
+        // Capture the message serial for appending tokens
+        msgSerial = result.serials[0];
+      } catch (err) {
+        console.error('Failed to publish initial message:', err);
+      }
       break;
```
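On the receiving side, the guide's pattern means each response arrives as one message that grows by appends. As a rough, self-contained illustration of the reconstruction logic (the event shapes below are invented for this sketch, not the Ably SDK's actual subscription API), a subscriber accumulates deltas keyed by message serial, so late appends always find the message they belong to:

```typescript
type InboundEvent =
  | { kind: 'created'; serial: string; data: string }   // initial publish
  | { kind: 'appended'; serial: string; delta: string }; // token append

// Rebuild full response bodies from a stream of create/append events.
function accumulate(events: InboundEvent[]): Map<string, string> {
  const bodies = new Map<string, string>();
  for (const ev of events) {
    if (ev.kind === 'created') {
      bodies.set(ev.serial, ev.data);
    } else {
      const prev = bodies.get(ev.serial);
      if (prev === undefined) continue; // drop appends for unknown serials
      bodies.set(ev.serial, prev + ev.delta);
    }
  }
  return bodies;
}
```

Keying by serial rather than assuming a single in-flight response is what lets one channel carry several concurrent responses without interleaving their tokens.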

@matt423 matt423 temporarily deployed to ably-docs-ait-273-verce-q4lzrh January 23, 2026 15:53 Inactive
@matt423 matt423 marked this pull request as ready for review January 23, 2026 15:56
@matt423 matt423 requested a review from mschristensen January 23, 2026 15:59
@GregHolmes GregHolmes force-pushed the ait-273-vercel-token-per-response-guide branch from a01a124 to 916151f Compare January 27, 2026 12:53
@GregHolmes GregHolmes temporarily deployed to ably-docs-ait-273-verce-q4lzrh January 27, 2026 12:53 Inactive
@GregHolmes GregHolmes force-pushed the ait-273-vercel-token-per-response-guide branch from 916151f to 12a8018 Compare January 27, 2026 14:17
@GregHolmes GregHolmes temporarily deployed to ably-docs-ait-273-verce-q4lzrh January 27, 2026 14:17 Inactive
@GregHolmes (Contributor) left a comment:

Two comments below. We'd also need to add this guide to the AIT landing (about) page: https://ably.com/docs/ai-transport

I can't test this just yet as I'm still trying to get an account to test with.


Review comment on:

```shell
mkdir ably-vercel-message-per-response && cd ably-vercel-message-per-response
```

Suggested change:

```shell
mkdir ably-vercel-example && cd ably-vercel-example
```

This is very minor, but to keep consistency with the other guides.

@matt423 (Member, Author) replied:

Hmm, I think the OpenAI and Anthropic guides use `-example`, but if a user follows multiple guides for a specific provider they would then have to delete the existing folder(s), which is why I switched to this format.

@GregHolmes (Contributor) commented:

Just got an API key and tested: no issues from me with the code. So just the two comments above.

@matt423 matt423 force-pushed the ait-273-vercel-token-per-response-guide branch from 12a8018 to 2b0edbf Compare January 28, 2026 17:47
@matt423 matt423 temporarily deployed to ably-docs-ait-273-verce-q4lzrh January 28, 2026 17:48 Inactive
@matt423 matt423 requested a review from GregHolmes January 28, 2026 17:50
@matt423 matt423 force-pushed the ait-273-vercel-token-per-response-guide branch from 2b0edbf to 2d4d50a Compare January 30, 2026 10:03
@matt423 matt423 temporarily deployed to ably-docs-ait-273-verce-q4lzrh January 30, 2026 10:03 Inactive
@GregHolmes GregHolmes force-pushed the ait-273-vercel-token-per-response-guide branch from 2d4d50a to 3782036 Compare February 2, 2026 11:01
@GregHolmes
Copy link
Contributor

All good! I've just rebased to resolve the conflict and bring up to date with main.

@GregHolmes GregHolmes temporarily deployed to ably-docs-ait-273-verce-q4lzrh February 2, 2026 11:01 Inactive
v6 is the latest stable release (since December), so use it; no changes needed to our code samples.
Also updated the event streaming Vercel doc links to a more specific page.
@GregHolmes GregHolmes force-pushed the ait-273-vercel-token-per-response-guide branch from 3782036 to a4e1b59 Compare February 2, 2026 11:10
@GregHolmes GregHolmes temporarily deployed to ably-docs-ait-273-verce-q4lzrh February 2, 2026 11:10 Inactive
@matt423 matt423 merged commit 3bff138 into main Feb 2, 2026
7 checks passed
@matt423 matt423 deleted the ait-273-vercel-token-per-response-guide branch February 2, 2026 11:22