Chapter 5

Prompts as Hyperparameters

Treat prompts as tunable hyperparameters optimized through automated search.

The Hyperparameter Framework

In traditional machine learning, hyperparameters such as the learning rate or batch size are tuned by searching for the configuration that maximizes a validation metric. In DSPy, prompts themselves become hyperparameters: the instruction text, the demonstrations, and the output format are free parameters that an optimizer can search over and score. This transforms prompt engineering from an art into a systematic optimization problem.
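To make the analogy concrete, here is a minimal sketch using DSPy's BootstrapFewShot optimizer; the model name, the exact_match metric, and the two training examples are assumptions introduced for illustration, not part of this chapter's running example.

import dspy

# Assumption: any DSPy-supported LM works here; the model name is a placeholder.
dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))

qa = dspy.Predict("question -> answer")

def exact_match(example, prediction, trace=None):
    # 1.0 when the predicted answer matches the gold answer exactly.
    return float(prediction.answer.strip() == example.answer.strip())

trainset = [
    dspy.Example(question="What is 2 + 2?", answer="4").with_inputs("question"),
    dspy.Example(question="What is the capital of France?",
                 answer="Paris").with_inputs("question"),
]

# The optimizer searches over prompt choices (here, which bootstrapped
# few-shot demonstrations to include) the way a tuner searches a grid.
optimizer = dspy.BootstrapFewShot(metric=exact_match, max_bootstrapped_demos=4)
compiled_qa = optimizer.compile(qa, trainset=trainset)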

Types of prompt hyperparameters include instruction templates, few-shot example selection strategies, output formatting patterns, task decomposition, and reasoning guidance. The dataclass below captures one concrete parameterization.

Auto-Optimization Architecture

import random

from dataclasses import dataclass
from typing import List

@dataclass
class PromptHyperparameters:
    instruction_template: str
    example_selection_strategy: str
    formatting_pattern: str
    reasoning_guidance: str
    task_decomposition: List[str]

class PromptHyperparameterOptimizer:
    def __init__(self, base_program, metric_fn, search_space):
        self.base_program = base_program  # program whose prompts are tuned
        self.metric_fn = metric_fn        # metric_fn(example, prediction) -> float
        self.search_space = search_space  # field name -> list of candidate values

    def optimize(self, trainset, valset, num_iterations=50):
        # Random search: sample a configuration, score it on the validation
        # set, and keep the best. (trainset is reserved for example-selection
        # strategies that draw demonstrations from it.)
        best_params, best_score = None, float("-inf")
        for _ in range(num_iterations):
            current_params = self._sample_hyperparameters()
            program = self._apply_hyperparameters(current_params)
            score = self._evaluate(program, valset)
            if score > best_score:
                best_score = score
                best_params = current_params
        return best_params

    def _sample_hyperparameters(self):
        # Draw one candidate value per field of PromptHyperparameters.
        return PromptHyperparameters(
            **{name: random.choice(values)
               for name, values in self.search_space.items()})

    def _apply_hyperparameters(self, params):
        # Assumes the base program exposes a with_hyperparameters() hook
        # returning a copy configured with the sampled prompt settings.
        return self.base_program.with_hyperparameters(params)

    def _evaluate(self, program, valset):
        # Mean metric score over the validation set.
        scores = [self.metric_fn(ex, program(ex)) for ex in valset]
        return sum(scores) / len(scores)
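
A usage sketch follows; EchoProgram and keyword_metric are toy stand-ins invented here to make the loop runnable end to end. Random search is only the simplest strategy, and _sample_hyperparameters is the natural hook to swap in grid or Bayesian search.

class EchoProgram:
    # Toy stand-in for a real LM-backed program: it just formats the
    # sampled instruction template with the input question.
    def __init__(self, params=None):
        self.params = params

    def with_hyperparameters(self, params):
        return EchoProgram(params)

    def __call__(self, example):
        return self.params.instruction_template.format(q=example["question"])

def keyword_metric(example, prediction):
    # Toy metric: reward prompts that ask for concise answers.
    return 1.0 if "concisely" in prediction else 0.0

search_space = {
    "instruction_template": ["Answer concisely: {q}",
                             "Think step by step, then answer: {q}"],
    "example_selection_strategy": ["random", "nearest_neighbor"],
    "formatting_pattern": ["plain", "json"],
    "reasoning_guidance": ["none", "chain_of_thought"],
    "task_decomposition": [[], ["extract_facts", "synthesize_answer"]],
}

valset = [{"question": "What is DSPy?"}]
optimizer = PromptHyperparameterOptimizer(EchoProgram(), keyword_metric, search_space)
best = optimizer.optimize(trainset=[], valset=valset, num_iterations=20)
print(best.instruction_template)  # almost surely "Answer concisely: {q}"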