Prompt Management

Overview

Prompt management is a crucial aspect of building robust and maintainable applications that utilize Large Language Models (LLMs). This document provides an overview of prompt management, its benefits, and key considerations for implementation.

What is Prompt Management?

Prompt management is a systematic approach to storing, versioning, and retrieving prompts in LLM applications. It involves:

  • Version control: Tracking changes to prompts over time

  • Decoupling prompts from code: Storing prompts separately from application code

  • Monitoring and logging: Tracking prompt performance and usage

  • Optimization: Improving prompts based on performance metrics

  • Integration: Seamless integration with the application and tool stack

Prompt Object

  • name: Unique name of the prompt within a project.

  • type: The type of the prompt content (text or chat). Default is text.

  • prompt: The text template with variables (e.g. This is a prompt with a {{variable}}). For chat prompts, this is a list of chat messages each with role and content.

  • config: Optional JSON object to store any parameters (e.g. model parameters or model tools).

  • version: Integer to indicate the version of the prompt. The version is automatically incremented when creating a new prompt version.

  • labels: Labels that can be used to fetch specific prompt versions in the SDKs.

    • When using a prompt without specifying a label, the app will serve the version with the production label.

    • latest points to the most recently created version.

    • You can create any additional labels, e.g. for different environments (staging, production) or tenants (tenant-1, tenant-2).
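The prompt object described above can be modeled as a simple data structure. The following Python sketch mirrors the listed fields; the `Prompt` class and its `compile` helper are illustrative only and not part of any SDK:

```python
import re
from dataclasses import dataclass, field

@dataclass
class Prompt:
    """Illustrative container mirroring the prompt object fields."""
    name: str                                   # unique within a project
    prompt: str                                 # template with {{variable}} placeholders
    type: str = "text"                          # "text" or "chat"
    config: dict = field(default_factory=dict)  # e.g. model parameters or tools
    version: int = 1                            # auto-incremented per new version
    labels: list = field(default_factory=list)  # e.g. ["production", "latest"]

    def compile(self, **variables) -> str:
        """Substitute {{variable}} placeholders with the provided values."""
        return re.sub(
            r"\{\{(\w+)\}\}",
            lambda m: str(variables.get(m.group(1), m.group(0))),
            self.prompt,
        )

p = Prompt(name="movie-critic", prompt="Critique the movie {{title}}.")
print(p.compile(title="Heat"))  # Critique the movie Heat.
```

Unresolved placeholders are left in place rather than raising, which is one reasonable design choice for templates compiled with partial inputs.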

How it works

  1. Create/update prompt. If a prompt with the same name already exists, the new prompt is stored as a new version.

  2. Use prompt. At runtime, fetch the version with the production label from the app.

  3. Add tracing (optional). Enable tracking of metrics by prompt name and version, such as "movie-critic", directly in the app UI. Metrics such as scores per prompt version show how changes to a prompt affect the quality of the generations.

  4. Rollbacks (optional). When a prompt has the production label, that version is served by default in the SDKs. To roll back quickly, assign the production label to a previous version in the app UI.
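The steps above can be sketched end to end with a minimal in-memory store. The `PromptStore` class below is a hypothetical illustration of the versioning and labeling rules described in this document, not a real SDK:

```python
class PromptStore:
    """Minimal in-memory sketch of create/fetch/rollback by label."""

    def __init__(self):
        self._versions = {}  # name -> list of version records
        self._labels = {}    # name -> {label: version number}

    def create(self, name, text, labels=()):
        """Step 1: reusing a name stores the prompt as a new, incremented version."""
        versions = self._versions.setdefault(name, [])
        version = len(versions) + 1
        versions.append({"version": version, "prompt": text})
        marks = self._labels.setdefault(name, {})
        marks["latest"] = version  # latest always points to the newest version
        for label in labels:
            marks[label] = version
        return version

    def get(self, name, label="production"):
        """Step 2: without an explicit label, serve the production version."""
        version = self._labels[name][label]
        return self._versions[name][version - 1]

    def set_label(self, name, label, version):
        """Step 4: rollback = point the production label at an older version."""
        self._labels[name][label] = version

store = PromptStore()
store.create("movie-critic", "Critique {{title}}.", labels=["production"])
store.create("movie-critic", "Harshly critique {{title}}.", labels=["production"])
store.set_label("movie-critic", "production", 1)  # rollback to version 1
print(store.get("movie-critic")["prompt"])        # Critique {{title}}.
```

Note how the rollback in step 4 changes nothing about the stored versions themselves; only the label assignment moves, so the next `get` call serves the older version.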
