# Class: MistralAI

MistralAI LLM implementation

## Implements

## Constructors
### constructor

• new MistralAI(init?)

#### Parameters

| Name | Type |
| --- | --- |
| init? | Partial<MistralAI> |

#### Defined in

packages/core/src/llm/mistral.ts:59
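A minimal construction sketch, assuming the class is exported from the llamaindex package root; the option names mirror the Properties section below, and the environment-variable fallback for the API key is an assumption about MistralAISession rather than something this page documents.

```typescript
import { MistralAI } from "llamaindex";

// Sketch only: every option shown corresponds to a property
// documented below; the values are illustrative.
const llm = new MistralAI({
  model: "mistral-small",
  temperature: 0.1,
  topP: 1,
  maxTokens: 256,
  safeMode: false,
  // If omitted, the underlying MistralAISession may read
  // MISTRAL_API_KEY from the environment (assumption).
  apiKey: process.env.MISTRAL_API_KEY,
});
```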
## Properties

### apiKey

• Optional apiKey: string

#### Defined in

packages/core/src/llm/mistral.ts:52

### callbackManager

• Optional callbackManager: CallbackManager

#### Defined in

packages/core/src/llm/mistral.ts:53
### hasStreaming

• hasStreaming: boolean = true

#### Implementation of

#### Defined in

packages/core/src/llm/mistral.ts:45
### maxTokens

• Optional maxTokens: number

#### Defined in

packages/core/src/llm/mistral.ts:51

### model

• model: "mistral-tiny" | "mistral-small" | "mistral-medium"

#### Defined in

packages/core/src/llm/mistral.ts:48
### randomSeed

• Optional randomSeed: number

#### Defined in

packages/core/src/llm/mistral.ts:55

### safeMode

• safeMode: boolean

#### Defined in

packages/core/src/llm/mistral.ts:54

### session

• Private session: MistralAISession

#### Defined in

packages/core/src/llm/mistral.ts:57

### temperature

• temperature: number

#### Defined in

packages/core/src/llm/mistral.ts:49

### topP

• topP: number

#### Defined in

packages/core/src/llm/mistral.ts:50
## Accessors

### metadata

• get metadata(): Object

#### Returns

Object

| Name | Type |
| --- | --- |
| contextWindow | number |
| maxTokens | undefined \| number |
| model | "mistral-tiny" \| "mistral-small" \| "mistral-medium" |
| temperature | number |
| tokenizer | undefined |
| topP | number |

#### Implementation of

#### Defined in

packages/core/src/llm/mistral.ts:70
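A small read of the accessor, continuing from the constructor sketch above; the destructured names come from the Returns table.

```typescript
// Inspect the effective configuration.
const { model, contextWindow, temperature } = llm.metadata;
console.log(`${model}: ${contextWindow}-token context window, temperature ${temperature}`);
```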
## Methods

### buildParams

▸ Private buildParams(messages): any

#### Parameters

| Name | Type |
| --- | --- |
| messages | ChatMessage[] |

#### Returns

any

#### Defined in

packages/core/src/llm/mistral.ts:85
### chat

▸ chat<T, R>(messages, parentEvent?, streaming?): Promise<R>

Get a chat response from the LLM.

#### Type parameters

| Name | Type |
| --- | --- |
| T | extends undefined \| boolean = undefined |
| R | T extends true ? AsyncGenerator<string, void, unknown> : ChatResponse |

#### Parameters

| Name | Type | Description |
| --- | --- | --- |
| messages | ChatMessage[] | The chat messages to send. The return type of chat() and complete() is determined by the streaming parameter: when it is true, R is an async generator of string chunks; otherwise R is a ChatResponse. |
| parentEvent? | Event | - |
| streaming? | T | - |

#### Returns

Promise<R>

#### Implementation of

#### Defined in

packages/core/src/llm/mistral.ts:97
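A usage sketch covering both modes. The { role, content } message shape and the response.message.content access follow LlamaIndex.TS's ChatMessage and ChatResponse types; treat them as assumptions where this page leaves them unspecified.

```typescript
// Non-streaming: resolves to a ChatResponse.
const res = await llm.chat([
  { role: "user", content: "What is the capital of France?" },
]);
console.log(res.message.content);

// Streaming: with streaming === true, R is
// AsyncGenerator<string, void, unknown>, so iterate the chunks.
const stream = await llm.chat(
  [{ role: "user", content: "Write a haiku about the sea." }],
  undefined, // parentEvent
  true, // streaming
);
for await (const chunk of stream) {
  process.stdout.write(chunk);
}
```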
### complete

▸ complete<T, R>(prompt, parentEvent?, streaming?): Promise<R>

Get a prompt completion from the LLM.

#### Type parameters

| Name | Type |
| --- | --- |
| T | extends undefined \| boolean = undefined |
| R | T extends true ? AsyncGenerator<string, void, unknown> : ChatResponse |

#### Parameters

| Name | Type | Description |
| --- | --- | --- |
| prompt | string | The prompt to complete. |
| parentEvent? | Event | - |
| streaming? | T | - |

#### Returns

Promise<R>

#### Implementation of

#### Defined in

packages/core/src/llm/mistral.ts:117
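The same streaming switch applies here; a non-streaming sketch, again assuming the ChatResponse shape noted above:

```typescript
// Per the R type parameter, a non-streaming call resolves to a ChatResponse.
const completion = await llm.complete("The capital of France is");
console.log(completion.message.content);
```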
### streamChat

▸ Protected streamChat(messages, parentEvent?): AsyncGenerator<string, void, unknown>

#### Parameters

| Name | Type |
| --- | --- |
| messages | ChatMessage[] |
| parentEvent? | Event |

#### Returns

AsyncGenerator<string, void, unknown>

#### Defined in

packages/core/src/llm/mistral.ts:128
### streamComplete

▸ Protected streamComplete(query, parentEvent?): AsyncGenerator<string, void, unknown>

#### Parameters

| Name | Type |
| --- | --- |
| query | string |
| parentEvent? | Event |

#### Returns

AsyncGenerator<string, void, unknown>

#### Defined in

packages/core/src/llm/mistral.ts:172
### tokens

▸ tokens(messages): number

Calculates the number of tokens needed for the given chat messages.

#### Parameters

| Name | Type |
| --- | --- |
| messages | ChatMessage[] |

#### Returns

number
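A minimal call sketch, assuming token counting is actually implemented for the configured model (this page does not show the method body):

```typescript
// Count tokens before sending, e.g. to compare against
// llm.metadata.contextWindow.
const n = llm.tokens([{ role: "user", content: "Hello, Mistral!" }]);
console.log(`Prompt uses ${n} tokens.`);
```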