Class ChatCohere<CallOptions>

Integration with Cohere chat models.

import { ChatCohere } from "@langchain/cohere";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatCohere({
  apiKey: process.env.COHERE_API_KEY, // Default
  model: "command-r-plus", // Default
});
const response = await model.invoke([
  new HumanMessage("How tall are the largest penguins?"),
]);

Type Parameters

Hierarchy

Implements

Constructors

Properties

client: CohereClient
model: string = "command-r-plus"

The name of the model to use.

{"command"}
streamUsage: boolean = true

Whether or not to include token usage when streaming. This will include an extra chunk at the end of the stream with eventType: "stream-end" and the token usage in usage_metadata.

{true}
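A minimal sketch of reading the usage metadata from the final streamed chunk (assuming COHERE_API_KEY is set in the environment; the prompt is illustrative):

import { ChatCohere } from "@langchain/cohere";

const model = new ChatCohere({ streamUsage: true }); // the default

const stream = await model.stream("How tall are the largest penguins?");
for await (const chunk of stream) {
  // The final "stream-end" chunk carries the token counts.
  if (chunk.usage_metadata) {
    console.log(chunk.usage_metadata); // { input_tokens, output_tokens, total_tokens }
  }
}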
streaming: boolean = false

Whether or not to stream the response.

{false}
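A sketch of the callback-based pattern this flag enables, using LangChain's standard handleLLMNewToken handler (the prompt is illustrative):

import { ChatCohere } from "@langchain/cohere";

const model = new ChatCohere({
  streaming: true,
  callbacks: [
    {
      // Called with each new token as it arrives from the API.
      handleLLMNewToken(token: string) {
        process.stdout.write(token);
      },
    },
  ],
});

// invoke() still resolves to the full message once the stream ends.
await model.invoke("Write a haiku about penguins.");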
temperature: number = 0.3

What sampling temperature to use, between 0.0 and 2.0. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.

{0.3}
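For instance, a sketch contrasting the two ends of the range described above (model name omitted, so the default applies):

import { ChatCohere } from "@langchain/cohere";

// Low temperature: focused, near-deterministic answers.
const focused = new ChatCohere({ temperature: 0.2 });

// High temperature: more varied, creative answers.
const creative = new ChatCohere({ temperature: 0.8 });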

Methods