Chat styles
Written By ThinkBuddy
Last updated about 1 year ago
Chat styles are instructions that direct how the AI should respond during conversations. These styles guide the AI's tone and approach when answering your questions.
Use pre-built styles created by ThinkBuddy for various purposes, or create your own custom styles. You can save your custom styles to use them in future conversations.
Custom chat style
To customize your chat style, select Custom style from the style menu. You can then use the Sidepeek panel to customize instructions for your current chat. Advanced options give you even more control, such as adjusting the context limit and LLM parameters to tailor the conversation precisely to your preferences. You can save your custom style to use it in future conversations.
Context limit
Context limit determines how many questions and answers the AI remembers in a conversation. By default, it includes the entire chat history. Select from available options to adjust the context limit based on your needs.
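ThinkBuddy does not publish how the context limit is applied internally; as a rough illustration only, trimming a conversation to the most recent question/answer pairs could look like this (the function name and data layout are assumptions):

```python
def trim_history(messages, context_limit):
    """Keep only the most recent question/answer pairs.

    context_limit=None means keep the full history (the default).
    Each item in messages is a (question, answer) pair.
    """
    if context_limit is None:
        return messages
    return messages[-context_limit:]

history = [("Q1", "A1"), ("Q2", "A2"), ("Q3", "A3")]
trimmed = trim_history(history, 2)  # keeps only the last two pairs
```

A smaller limit makes each request cheaper and keeps the AI focused on recent turns, at the cost of it forgetting earlier parts of the conversation.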
LLM parameters
LLM parameters are settings that control how the model processes and generates text. Understanding these parameters allows you to fine-tune your conversations for specific outcomes.
Temperature
Temperature controls the randomness and creativity in an LLM's output. Lower temperatures produce more focused and predictable responses, while higher temperatures create more diverse and unpredictable outcomes.
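To see why this works, here is a minimal sketch of how temperature typically scales a model's raw scores (logits) before they become word probabilities; the exact numbers are illustrative, not ThinkBuddy's implementation:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw scores into probabilities, scaled by temperature.

    Dividing logits by a temperature below 1 sharpens the distribution
    (the top word dominates); a temperature above 1 flattens it
    (less likely words get more of a chance).
    """
    scaled = [score / temperature for score in logits]
    peak = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
low = softmax_with_temperature(logits, 0.5)   # peaked: predictable
high = softmax_with_temperature(logits, 2.0)  # flatter: more diverse
```

With the lower temperature, the most likely word gets a larger share of the probability, which is why responses become more focused and repeatable.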
Top_p
Top_p, also known as nucleus sampling, controls the diversity of the model's output by limiting the pool of candidate words for the next word to the smallest set whose combined probability reaches the top_p threshold. Higher top_p values allow for more diverse and creative responses, while lower top_p values produce more focused and coherent responses.
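A simplified sketch of that filtering step, purely for illustration (not ThinkBuddy's code), might look like this:

```python
def nucleus_filter(probs, top_p):
    """Keep the smallest set of words whose cumulative probability
    reaches top_p, then renormalize so the kept probabilities sum to 1."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cumulative = [], 0.0
    for i in order:
        kept.append(i)
        cumulative += probs[i]
        if cumulative >= top_p:
            break  # the nucleus is complete
    total = sum(probs[i] for i in kept)
    return {i: probs[i] / total for i in kept}

word_probs = [0.5, 0.3, 0.15, 0.05]
nucleus = nucleus_filter(word_probs, 0.8)  # keeps the two most likely words
```

With top_p = 0.8, only the words covering the top 80% of probability survive, so unlikely words are never sampled; raising top_p toward 1.0 lets more of the long tail back in.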
Presence Penalty
The presence penalty encourages the model to use a wider variety of words by reducing the probability of any word that has already appeared in the response, regardless of how many times it has appeared. This nudges the model toward introducing new words and topics.
Frequency Penalty
The frequency penalty discourages the model from repeating words, with the penalty growing each time a word is reused. This encourages more diverse vocabulary and reduces verbatim repetition.
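The difference between the two penalties is easiest to see side by side. This sketch follows the commonly documented OpenAI-style formulation (a flat presence deduction per seen word, plus a frequency deduction per occurrence); it is an illustration of the general technique, not ThinkBuddy's internals:

```python
from collections import Counter

def apply_penalties(logits, generated_ids, presence_penalty, frequency_penalty):
    """Lower the scores of words already used in the response.

    presence_penalty: flat deduction for any word seen at least once.
    frequency_penalty: deduction that scales with how often it was seen.
    """
    counts = Counter(generated_ids)
    adjusted = list(logits)
    for word_id, count in counts.items():
        adjusted[word_id] -= presence_penalty           # once per seen word
        adjusted[word_id] -= frequency_penalty * count  # once per occurrence
    return adjusted

# Three candidate words, all equally likely; word 0 was used twice,
# word 1 once, and word 2 not at all.
adjusted = apply_penalties([1.0, 1.0, 1.0], [0, 0, 1], 0.5, 0.2)
```

After the penalties, the unused word keeps its original score, the word used once is penalized moderately, and the word used twice is penalized most, so the model leans toward fresh vocabulary.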