Adjusting chunks retrieved:
The Max Chunks Retrievable slider controls how many pieces of information (chunks) the LLM can retrieve in one interaction. You can adjust it anywhere between 1 and 10. We recommend keeping it at around 3-4, depending on the content in your knowledge base.
When you adjust the slider, you change the maximum number of chunks the model will retrieve. Setting it to a higher number lets the model pull more information at once, which can be useful for complex queries. However, be mindful that some models have a smaller context window, which means they can't process as much information at once and may require you to set a lower value on the slider.
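For illustration only, the sketch below shows how a "max chunks" setting typically works under the hood: candidate chunks are ranked by similarity to the query, and only the top-scoring ones are passed to the model. The function and variable names here are hypothetical and not part of the platform's API.

```python
import numpy as np

def retrieve_chunks(query_embedding, chunk_embeddings, chunks, max_chunks=4):
    """Illustrative top-k retrieval: rank chunks by cosine similarity
    to the query and return the `max_chunks` best matches."""
    # Cosine similarity between the query and every chunk embedding
    norms = np.linalg.norm(chunk_embeddings, axis=1) * np.linalg.norm(query_embedding)
    scores = (chunk_embeddings @ query_embedding) / norms
    # Indices of the highest-scoring chunks, best first
    top_indices = np.argsort(scores)[::-1][:max_chunks]
    return [chunks[i] for i in top_indices]
```

Raising the slider (the max_chunks value in this sketch) gives the model more context per query, but every extra chunk consumes part of the model's context window, which is why models with smaller context windows call for a lower setting.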
Search similarity prompt:
To further enhance your agent's ability to retrieve the most relevant information from the knowledge base, use a prompt for the search similarity:

Write Your Prompt
In the input field that appears, write your custom prompt. If you’re unsure, you can use the provided example as a starting point.
Include Chat History Variable
Ensure you include the {chat_history} variable in your prompt. This is crucial for the proper functioning of your knowledge base.
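As a rough illustration (the {chat_history} variable comes from the product; the template text and function below are assumptions, not the platform's actual implementation), this is how a search similarity prompt containing the chat history variable might be filled in before the knowledge base is queried:

```python
# Hypothetical example of a search similarity prompt template that uses the
# {chat_history} variable; the platform performs this substitution for you.
SEARCH_SIMILARITY_PROMPT = (
    "Given the conversation so far:\n"
    "{chat_history}\n\n"
    "Rewrite the user's latest message as a standalone search query "
    "for the knowledge base."
)

def build_search_prompt(chat_history: str) -> str:
    # Mirrors the substitution the platform does automatically.
    return SEARCH_SIMILARITY_PROMPT.format(chat_history=chat_history)

print(build_search_prompt(
    "User: What plans do you offer?\nAgent: We have Basic and Pro."
))
```

Without the {chat_history} variable, the similarity search only sees the latest message and can miss context established earlier in the conversation, which is why including it is required.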
