OpenRouter Chat

Provided by OpenRouter

Chat with models from OpenRouter. Access a wide variety of language models from different providers through a single interface.
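Under the hood, a node like this maps onto OpenRouter's OpenAI-compatible chat completions endpoint. The following is a minimal stdlib-only sketch of that mapping, not this node's actual implementation: the function names (`build_payload`, `chat`) and the `OPENROUTER_API_KEY` environment variable convention are illustrative assumptions, while the endpoint URL and request shape follow OpenRouter's public API.

```python
import json
import os
import urllib.request

# OpenRouter's OpenAI-compatible chat completions endpoint.
API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_payload(prompt, model="qwen/qwen-turbo", system_prompt=None, **options):
    """Assemble the request body. Extra keyword arguments (temperature,
    seed, max_tokens, ...) are passed through to the API unchanged."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": prompt})
    return {"model": model, "messages": messages, **options}

def chat(prompt, **kwargs):
    """Send the request and return the model's reply text.
    Assumes the API key is available as OPENROUTER_API_KEY."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt, **kwargs)).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

For example, `chat("Summarize this file.", system_prompt="Be concise.", temperature=0.2)` would return the text of the model's first choice.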


Inputs

Prompt
text

The message prompt to send to the model.

Model
dropdown

The OpenRouter model to use for chat completion.

optional: true
default: qwen/qwen-turbo
System Prompt
text

A system prompt that sets the behavior and context for the conversation.

optional: true
Attachment
generic
Accepts multiple

Attachments to include with the prompt (images, text, etc.).

optional: true
Last Message Only
toggle

If enabled, only the last message will be sent to the model instead of the full conversation history.
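The Last Message Only toggle amounts to trimming the conversation history before it is sent. A small sketch of that trimming step, with the assumption (not stated above) that the system prompt is still kept when the history is trimmed:

```python
def select_messages(history, last_message_only=False):
    """history: list of {"role": ..., "content": ...} dicts, oldest first.

    With last_message_only set, keep any system messages plus only the
    newest non-system message; otherwise send the full history.
    """
    if not last_message_only:
        return history
    system = [m for m in history if m["role"] == "system"]
    non_system = [m for m in history if m["role"] != "system"]
    return system + non_system[-1:]  # assumption: system prompt survives trimming
```

This trades context for cost: the model sees only the newest turn, so earlier exchanges can no longer influence the reply, but the request stays small regardless of how long the chat runs.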

optional: true
Temperature
number

Controls randomness in the output. Higher values make the output more random; lower values make it more deterministic.

optional: true
minimum: -1
maximum: 1
Seed
seed

A seed value for reproducible outputs. Use the same seed to get consistent results.

optional: true
default: 8519
Top P
number

Nucleus sampling parameter: the model samples only from the smallest set of tokens whose cumulative probability reaches this threshold, controlling output diversity.

optional: true
Top K
number

Limits sampling to the K highest-probability tokens at each step.

optional: true
Max Tokens
number

Maximum number of tokens to generate in the response.

optional: true
Stop Sequences
text
Accepts multiple

Sequences that will cause the model to stop generating further tokens.

optional: true
Presence Penalty
number

Penalizes tokens based on whether they appear in the text so far.

optional: true
minimum: -2
maximum: 2
Frequency Penalty
number

Penalizes tokens based on their frequency in the text so far.

optional: true
minimum: -2
maximum: 2
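All of the sampling inputs above are optional, so only the ones the user actually sets should reach the API; everything left unset falls back to the provider's defaults. A small sketch of that collection step (the helper name `sampling_options` is illustrative, but the field names match the OpenRouter request body):

```python
def sampling_options(temperature=None, top_p=None, top_k=None, seed=None,
                     max_tokens=None, stop=None,
                     presence_penalty=None, frequency_penalty=None):
    """Collect only the sampling parameters that were explicitly set,
    so unset inputs inherit the provider's defaults."""
    opts = {
        "temperature": temperature,
        "top_p": top_p,
        "top_k": top_k,
        "seed": seed,
        "max_tokens": max_tokens,
        "stop": stop,  # list of stop sequences
        "presence_penalty": presence_penalty,
        "frequency_penalty": frequency_penalty,
    }
    return {k: v for k, v in opts.items() if v is not None}
```

For example, `sampling_options(temperature=0.7, seed=8519, stop=["END"])` yields a three-key dict that can be merged straight into the request body, leaving Top P, Top K, and the penalties at their defaults.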

Outputs

Last Response
text

The last response from the model.