Create Prompt
Karini AI's prompt playground enables domain experts to become prompt engineers through a guided experience. You can develop high-quality prompts, test them, and track your prompt experiments by saving prompt runs.
There are multiple ways to create a prompt:
Create Prompt Using a Prompt Template
On the Prompt Playground, click "Add new" to start creating a new prompt. Click "Prompt templates" to access the available prompt templates. Select a template relevant to your task, then customize the prompt as described in the following section.
Create New Prompt
On the Prompt Playground, click "Add new" to start creating a new prompt.
Provide a prompt name and select an appropriate task from the available list.
Construct your prompt with appropriate instructions and variables.
Variables can be added dynamically using the "Add variable" button.
Select the "User Input" option if the variable will be supplied by the user in the copilot interface.
The constructed prompt, including the context and variables, can be viewed in the Prompt panel on the right-hand side.
Once the prompt is created, it can be tested on the Test & Compare tab.
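Conceptually, the variable mechanism works like template substitution: the prompt text declares named placeholders, and test values (or user input) fill them in. The sketch below illustrates this idea in plain Python; the template text and `render_prompt` helper are hypothetical and are not Karini AI's actual API.

```python
# Illustrative sketch of prompt-variable substitution (hypothetical helper,
# not Karini AI's API). A template declares variables in curly braces and
# test values are substituted in before the prompt is sent to the LLM.
PROMPT_TEMPLATE = (
    "You are a helpful assistant.\n"
    "Answer the question using only the context below.\n\n"
    "Context: {context}\n"
    "Question: {question}\n"
)

def render_prompt(template: str, **variables: str) -> str:
    """Substitute supplied values for the template's named variables."""
    return template.format(**variables)

prompt = render_prompt(
    PROMPT_TEMPLATE,
    context="Karini AI provides a prompt playground for prompt engineering.",
    question="What does the prompt playground do?",
)
print(prompt)
```

In the playground, variables marked as "User Input" would be filled at run time by the copilot user rather than by the prompt author.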
Special Variables
When authoring prompts, the following variables are treated as special, predefined variables.
Context: This is a predefined variable for the LLM. For prompt testing, authors can provide appropriate text as the value for this variable. However, for production use, or for use within recipes and copilots, the value is replaced with relevant context retrieved from the vector store. You can also upload a context file as input to this variable. The context file must be a .txt or .pdf file. The contents of a PDF file are automatically preprocessed with OCR before being added to the context. The maximum size of a context file is 2 MB.
Question: This is a predefined variable for the LLM. For prompt testing, authors can provide a question related to the context as the value for this variable. Enable the User input checkbox on the Edit prompt page to display the question under User Input in the Test & compare section.
Evaluation Metric Name: This is a predefined variable for the LLM used in Evaluation prompts. For prompt testing, authors can provide the evaluation metric name as the value for this variable.
Evaluation Metric Description: This is a predefined variable for the LLM used in Evaluation prompts. For prompt testing, authors can provide the evaluation metric description as the value for this variable.
Evaluation Grading Criteria: This is a predefined variable for the LLM used in Evaluation prompts. For prompt testing, authors can provide the evaluation grading criteria as the value for this variable.
Evaluation Input: This is a predefined variable for the LLM used in Evaluation prompts. For prompt testing, authors can provide the question as the value for this variable.
Evaluation Output: This is a predefined variable for the LLM used in Evaluation prompts. For prompt testing, authors can provide an answer to compare against the ground truth output for assessment.
Evaluation Ground Truth: This is a predefined variable for the LLM used in Evaluation prompts. For prompt testing, authors can provide the ground truth answer for assessment.
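To make the evaluation variables concrete, the sketch below shows how a test run might fill each predefined variable into an evaluation prompt. The template wording and all sample values are hypothetical; only the variable names mirror this documentation.

```python
# Illustrative sketch of an Evaluation prompt during testing (hypothetical
# template and values; variable names follow the documentation above).
EVAL_TEMPLATE = (
    "Evaluate the model output below.\n"
    "Metric: {evaluation_metric_name} - {evaluation_metric_description}\n"
    "Grading criteria: {evaluation_grading_criteria}\n"
    "Input: {evaluation_input}\n"
    "Output: {evaluation_output}\n"
    "Ground truth: {evaluation_ground_truth}\n"
)

# Values an author might enter for a single prompt-testing run.
test_values = {
    "evaluation_metric_name": "faithfulness",
    "evaluation_metric_description": "Whether the output is grounded in the context.",
    "evaluation_grading_criteria": "Score 1-5; 5 means fully grounded.",
    "evaluation_input": "What does the prompt playground do?",
    "evaluation_output": "It lets you author and test prompts.",
    "evaluation_ground_truth": "It enables prompt authoring, testing, and run tracking.",
}

eval_prompt = EVAL_TEMPLATE.format(**test_values)
print(eval_prompt)
```

In production, these values would come from the evaluation configuration and the run being assessed rather than from hand-entered test values.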