Karini AI Documentation
Prompt Versions



Upon creation, a prompt is initially saved as a draft version. You can continue to refine the prompt by tweaking it, testing alternative models, and fine-tuning the model configurations as outlined in the preceding sections.

A new version of a prompt can be created by clicking the Publish button located at the top right corner of the Prompt Playground.

You can create multiple versions of a prompt as it undergoes testing. The prompt versions can be viewed under the Versions tab which becomes visible after publishing the first prompt version. Selecting a specific version displays the corresponding LLM and its configuration details on the right panel of the interface.
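Conceptually, each published version is a snapshot of the prompt together with its selected LLM and configuration. A minimal sketch of such a snapshot (all field names and values here are hypothetical illustrations, not Karini AI's actual schema):

```json
{
  "promptName": "support-qna",
  "version": 2,
  "status": "published",
  "model": "gpt-4o",
  "modelConfig": {
    "temperature": 0.2,
    "maxTokens": 1024
  }
}
```

Loading a version restores a snapshot like this over the current draft configuration.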

Refer to the following video for guidance on working with the Versions tab in the UI.

You can switch to the desired version by clicking the Load Version button, which will overwrite the current prompt configurations.

Refer to the following video for a demonstration of the Load Version functionality.

Upon publication, the current version is displayed on the prompt.

The most recently published version will be displayed in the Prompt Table for reference and management.

Versioned prompts can be used in recipes. To be used in a recipe, a prompt must have a model associated with it; this is done by clicking the Select as best answer button on a model response while testing the prompt.
