diff --git a/src/pages/[platform]/ai/conversation/connect-your-frontend/index.mdx b/src/pages/[platform]/ai/conversation/connect-your-frontend/index.mdx
index b3915b464bb..047294733b2 100644
--- a/src/pages/[platform]/ai/conversation/connect-your-frontend/index.mdx
+++ b/src/pages/[platform]/ai/conversation/connect-your-frontend/index.mdx
@@ -117,7 +117,7 @@ Example conversation data
 */
 ```

-You can optionally attach a `name` and `metadata` to a conversation by passing them as arguments to the `.create()` method. There are no uniqueness constraints on conversation `name` or `metadata` values.
+You can optionally attach a `name` and `metadata` to a conversation by passing them as arguments to the `.create()` method. There are no uniqueness constraints on conversation `name` or `metadata` values. You can use `metadata` to organize chats and group them by topic.

 ```ts
 const { data: chat, errors } = await client.conversations.chat.create({
diff --git a/src/pages/[platform]/ai/conversation/context/index.mdx b/src/pages/[platform]/ai/conversation/context/index.mdx
index fac4cd0ac69..b2233acec75 100644
--- a/src/pages/[platform]/ai/conversation/context/index.mdx
+++ b/src/pages/[platform]/ai/conversation/context/index.mdx
@@ -32,6 +32,8 @@ export function getStaticProps(context) {
 For LLMs to provide high-quality answers to users' questions, they need to have the right information. Sometimes this information is contextual, based on the user or the state of the application. To allow for this, you can send `aiContext` with any user message to the LLM, which can be any unstructured or structured data that might be useful.

+> Note: `aiContext` is available during the chat and is passed to the LLM; `metadata`, by contrast, is neither available to the chat nor passed to the LLM.
+
 ```ts
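
The first hunk suggests using `metadata` to group chats by topic. As a minimal sketch of what that could look like client-side (the `topic` key, the `ConversationSummary` shape, and the plain-object records here are all hypothetical stand-ins, not part of the Amplify API), conversations could be bucketed by a metadata value after listing them:

```typescript
// Hypothetical, simplified shape of a listed conversation record.
interface ConversationSummary {
  id: string;
  name?: string;
  metadata?: Record<string, string>;
}

// Group conversations by a metadata key; records missing the key
// fall into an "uncategorized" bucket.
function groupByMetadata(
  chats: ConversationSummary[],
  key: string
): Map<string, ConversationSummary[]> {
  const groups = new Map<string, ConversationSummary[]>();
  for (const chat of chats) {
    const value = chat.metadata?.[key] ?? "uncategorized";
    const bucket = groups.get(value) ?? [];
    bucket.push(chat);
    groups.set(value, bucket);
  }
  return groups;
}

// Example records (hypothetical data, not fetched from a backend).
const chats: ConversationSummary[] = [
  { id: "1", name: "Trip planning", metadata: { topic: "travel" } },
  { id: "2", name: "Packing list", metadata: { topic: "travel" } },
  { id: "3", name: "Budget review" },
];

const byTopic = groupByMetadata(chats, "topic");
console.log(byTopic.get("travel")?.length);        // 2
console.log(byTopic.get("uncategorized")?.length); // 1
```

Since `metadata` has no uniqueness constraints, any number of chats can share a topic value, which is what makes this kind of grouping possible.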