Customize the prompt

What is a prompt?

Passing the user's question directly to the model is not optimal: you need to provide context and guide the model so it generates accurate answers. This is done by providing a prompt to the model.

More precisely, there are two prompts you can provide to the model:

  • the system prompt, which defines the behavior of the model (e.g. "You are a marine biology expert."),
  • and the user prompt, which carries the context and the question. Use {question} to refer to the user's question and {context} to refer to the context, as in the example below.
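
For instance, a minimal prompt pair could look like this (the wording is purely illustrative; only the {context} and {question} placeholders have special meaning):

```
System prompt:
You are a marine biology expert.

User prompt:
Answer the question using only the context below.

Context:
{context}

Question:
{question}
```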

How to write a good prompt?

Nuclia uses default prompts to generate answers to your questions. However, you can customize the prompt to get the best results from the model. Here are some tips to help you create an effective prompt:

  • Be clear and concise: keep your prompt to the point, and avoid long sentences or complex language.
  • Use proper punctuation and newlines to clearly separate the different parts of the prompt.
  • Experiment with different prompts: try several variants to see which one works best for your use case, and run the same prompt against different models to compare the results.

A great way to experiment with different prompts is to use the Nuclia dashboard. The Prompt lab section (in Advanced) allows you to test different prompts with all the supported LLMs and see how they behave.

How to set the prompt?

The prompt can be set at the configuration level: in your Nuclia Dashboard, you can define the system prompt and the user prompt for your Knowledge Box. They will apply to every call to the /ask endpoint on this Knowledge Box. Several example prompts are available in the Knowledge Box configuration page.

You can also set the prompt at the query level, by passing the prompt parameter in your call to the /ask endpoint. This overrides the default prompts set on the Knowledge Box.
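
As a rough sketch, a query-level override might look like the following Python call. The zone, Knowledge Box id, and authentication header are placeholders, and the exact shape of the prompt field (a plain string, or an object with system and user keys) may vary by API version, so check the /ask API reference for your setup:

```python
import requests

# Hypothetical zone and Knowledge Box id; replace with your own values.
KB_URL = "https://europe-1.nuclia.cloud/api/v1/kb/<knowledge-box-id>"
API_KEY = "<service-account-token>"  # assumed service-account auth

response = requests.post(
    f"{KB_URL}/ask",
    headers={
        "X-NUCLIA-SERVICEACCOUNT": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "query": "How deep can sperm whales dive?",
        # Query-level prompts override the Knowledge Box defaults.
        "prompt": {
            "system": "You are a marine biology expert.",
            "user": (
                "Answer the question using only the context below.\n"
                "Context:\n{context}\n"
                "Question:\n{question}"
            ),
        },
    },
)
print(response.text)
```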

You can also set the prompt when configuring a widget in your application.