
Conversation

@m-hulbert
Contributor

Description

This PR adds an overview page for AI Transport.

Checklist


coderabbitai bot commented Jan 12, 2026

Important

Review skipped

Auto reviews are disabled on this repository.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

You can disable this status message by setting the reviews.review_status to false in the CodeRabbit configuration file.




AI Transport runs on Ably's fault-tolerant and highly-available platform. The platform enables data to be streamed between all internet-connected devices at low latencies across the globe. Its elastic global infrastructure delivers enterprise-scale messaging that effortlessly scales to meet demand.

Drop AI Transport into your applications to transform unreliable HTTP connections into reliable, stateful AI experiences that keep users engaged.
Member


I personally don't think we should be making our argument based on HTTP streams being unreliable. Generally they're not. The key benefits of AIT vs streamed HTTP are:

  • they're stateful;
  • they're bidirectional.

We should of course additionally talk about resumability etc, but if HTTP reliability was the only concern, there wouldn't be that much incentive to use AIT.
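To make the stateful and bidirectional point concrete, here is a minimal sketch using Ably's standard Pub/Sub channel API rather than any AI Transport-specific API; the channel name, message names and API key placeholder are illustrative assumptions:

```typescript
import * as Ably from 'ably';

async function main() {
  // One stateful connection carries traffic in both directions.
  const realtime = new Ably.Realtime({ key: 'YOUR_ABLY_API_KEY', clientId: 'user-123' });

  // Illustrative channel name; not an AI Transport convention.
  const channel = realtime.channels.get('ai:session-123');

  // Downstream: receive response tokens as the backend publishes them.
  await channel.subscribe('token', (message) => {
    process.stdout.write(message.data);
  });

  // Upstream: send a prompt over the same connection (bidirectional).
  await channel.publish('prompt', { text: 'Summarise this document' });
}

main();
```

Because both directions share one connection, the backend can also push control or status events to the client without the client polling.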


Token streaming is at the core of how LLMs deliver their responses. Tokens are streamed progressively from your LLM so that users don't need to wait for the complete response before seeing any output.

If you stream tokens over brittle HTTP, then any interruption to the connection means that all tokens transmitted during the interruption are lost. That might happen when a user switches tabs, temporarily loses network connectivity, or their browser crashes - all common scenarios that should be handled gracefully for an industry-leading user experience.
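As a sketch of how those missed tokens can be recovered, the example below leans on the Ably client's automatic connection resume plus the `rewind` channel parameter to backfill recent messages on attach; the channel name and rewind window are assumptions, not AI Transport specifics:

```typescript
import * as Ably from 'ably';

async function resumeTokenStream() {
  const realtime = new Ably.Realtime({ key: 'YOUR_ABLY_API_KEY' });

  // `rewind` backfills recent history when the channel (re)attaches, so tokens
  // published during a dropped connection or a backgrounded tab are replayed.
  const channel = realtime.channels.get('ai:session-123', {
    params: { rewind: '2m' }, // replay up to the last two minutes of tokens
  });

  await channel.subscribe('token', (message) => {
    process.stdout.write(message.data);
  });
}

resumeTokenStream();
```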
Member


See comment above


### Enterprise controls <a id="enterprise"/>

Ably's platform provides [integrations](/docs/platform/integrations) and functionality to ensure that your applications always exceed the requirements of enterprise environments. Whether that's [message auditing](/docs/platform/integrations/streaming), [client identification](/docs/auth/identified-clients) or [RBAC](/docs/auth/capabilities).
Member


Suggested change
- Ably's platform provides [integrations](/docs/platform/integrations) and functionality to ensure that your applications always exceed the requirements of enterprise environments. Whether that's [message auditing](/docs/platform/integrations/streaming), [client identification](/docs/auth/identified-clients) or [RBAC](/docs/auth/capabilities).
+ Ably's platform provides [integrations](/docs/platform/integrations) and functionality to ensure that your applications always exceed the requirements of enterprise environments. Whether that's [message auditing](/docs/platform/integrations/streaming), [client identification](/docs/auth/identified-clients) or [fine-grained authorization](/docs/auth/capabilities).

I'm not that comfortable describing it as RBAC, because it's not role-based.
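To illustrate why capability-based fits better than role-based here, a minimal server-side sketch that mints an Ably token scoped to one session channel for an identified client; the channel naming scheme and clientId are illustrative assumptions:

```typescript
import * as Ably from 'ably';

// Server-side: issue a token request whose capabilities grant an identified
// client rights on a single, named resource rather than via a role.
async function createSessionTokenRequest(userId: string, sessionId: string) {
  const rest = new Ably.Rest({ key: process.env.ABLY_API_KEY! });

  return rest.auth.createTokenRequest({
    clientId: userId, // identified client: messages are attributable to this user
    capability: JSON.stringify({
      [`ai:${sessionId}`]: ['subscribe', 'publish'], // per-channel, per-operation grants
    }),
  });
}
```

The browser client would then authenticate with this token request (for example via the `authCallback` client option), so it can only ever perform the operations spelled out above.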


Labels: review-app (Create a Heroku review app)
