# Customize the prompt

## What is a prompt?
Passing the user's question directly to the model is not optimal. You need to provide context and guide the model so that it generates accurate answers. This is done by providing a prompt to the model.
To be precise, there are three prompts you can customize:

- The system prompt, which defines the behavior of the model when answering the question.
- The user prompt, which carries the context and the question. Use `{question}` to refer to the user question, and `{context}` to refer to the context.
- The rephrase prompt, which is used to rephrase the user question in order to improve the quality of the search results.
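For illustration, the `{context}` and `{question}` placeholders behave like ordinary template variables: at answer time, the retrieved context and the user's question are substituted into the user prompt. A minimal sketch using Python's `str.format` (the example values are invented):

```python
# Sketch of how the {context} and {question} placeholders are filled in
# before the user prompt is sent to the model.
user_prompt = (
    "Given this context: {context}. "
    "Answer this {question} in a concise way using the provided context."
)

# Hypothetical retrieved context and user question, for illustration only.
filled = user_prompt.format(
    context="Hawksbill turtles nest on South Pacific beaches.",
    question="Where do hawksbill turtles nest?",
)
print(filled)
```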
## Which prompt should I use?
The most important distinction is between the rephrase prompt and the other two: the rephrase prompt is applied before the search, while the system prompt and the user prompt are applied after the results are retrieved.
Example:

- Rephrase prompt: `Focus on South Pacific marine life.`
- System prompt: `You are a submarine biology expert. Use relevant vocabulary accordingly.`
- User prompt: `Given this context: {context}. Answer this {question} in a concise way using the provided context. If the context refers to any endangered species, please mention it explicitly.`
By putting `Focus on South Pacific marine life` in the rephrase prompt, you make sure this criterion is taken into account when searching for the context. If you put it in the user prompt instead, it is too late: the search has already been done, so if the best matches all happen to be about North Atlantic marine life, you will get no answer.
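The ordering described above can be sketched as a toy pipeline. The functions `rephrase`, `search`, and `generate` below are hypothetical stand-ins, not Nuclia APIs; only the order of the steps reflects the documented behavior:

```python
# Toy sketch of where each prompt applies in a retrieval pipeline.
# rephrase(), search(), and generate() are invented stubs for illustration.

def rephrase(question, rephrase_prompt):
    # A real system would ask an LLM to rewrite the question;
    # here we simply combine the two strings.
    return f"{rephrase_prompt} {question}"

def search(query):
    # A real system would search the Knowledge Box;
    # here we return a canned snippet.
    return "South Pacific reefs host the endangered hawksbill turtle."

def generate(system_prompt, user_prompt):
    # A real system would call an LLM; here we just echo the inputs.
    return f"[system: {system_prompt}] [user: {user_prompt}]"

def answer(question, rephrase_prompt, system_prompt, user_prompt):
    query = rephrase(question, rephrase_prompt)   # applied BEFORE the search
    context = search(query)                       # retrieval step
    filled = user_prompt.format(context=context, question=question)
    return generate(system_prompt, filled)        # applied AFTER retrieval
```

The key point the sketch makes: by the time the user prompt is filled in, the context is already fixed, which is why search-steering instructions belong in the rephrase prompt.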
## How to write a good prompt?
Nuclia uses default prompts to generate answers to your questions. However, you can customize the prompt to get the best results from the model. Here are some tips to help you create an effective prompt:
- Be clear and concise: Make sure that your prompt is clear and to the point. Avoid using long sentences or complex language.
- Use proper punctuation and new lines to clearly separate the different parts of the prompt.
- Experiment with different prompts: Try different prompts to see which one works best for your use case. You can use the same prompt with different models to compare the results.
A great way to experiment with different prompts is to use the Nuclia dashboard. The Prompt lab section (under Advanced) lets you test different prompts with all the supported LLMs and compare how they behave.
## How to set the prompts?
The prompts can be set at the configuration level: in your Nuclia Dashboard, you can set the system prompt and the user prompt for your Knowledge Box. They will apply to any call to the `/ask` endpoint on this Knowledge Box.
Several examples of prompts are available in the Knowledge Box configuration page.
You can also set the prompts at the query level, by passing the `prompt` parameter in the query to the `/ask` endpoint. This overrides the default prompts set in the Knowledge Box.

The `prompt` parameter is a JSON object with the following structure:
```json
{
  "system": "Your system prompt",
  "user": "Your user prompt",
  "rephrase": "Your rephrase prompt"
}
```
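As a sketch, a query-level request body embedding the `prompt` object could be built like this. The `query` field name is an assumption made for the example; only the `prompt` object follows the structure documented above:

```python
import json

# Illustrative request body for a query-level prompt override.
# The "query" field is an assumption for this example; the "prompt"
# object follows the structure documented above.
payload = {
    "query": "Where do hawksbill turtles nest?",
    "prompt": {
        "system": "You are a submarine biology expert.",
        "user": "Given this context: {context}. Answer this {question} concisely.",
        "rephrase": "Focus on South Pacific marine life.",
    },
}

body = json.dumps(payload)
print(body)
```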
You can also set the prompts when configuring a widget in your application.