## What It Is
ContextComposer is the module that builds the final prompt sent to the LLM. It's essentially an automated prompt engineer that combines multiple context sources while respecting token budgets.
## How It Maps to Concepts
| AI/ML Concept | Amprealize Implementation |
|---|---|
| Prompt Engineering | context_composer.py applies role setting, few-shot examples, format specs automatically |
| Tokenization | Token counting for budget management — ensures prompts fit context windows |
| RAG | Retrieved behaviors and knowledge packs are injected into the composed prompt |
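As a rough illustration of the token-counting row above, here is a minimal sketch of a budget check. The function names and the ~4-characters-per-token heuristic are assumptions for illustration; the actual `context_composer.py` may use a model-specific tokenizer instead.

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    # A real implementation would use the target model's tokenizer.
    return max(1, len(text) // 4)

def fits_budget(sections: list[str], context_window: int) -> bool:
    # Sum estimated tokens across all prompt sections and compare
    # against the model's context window.
    return sum(estimate_tokens(s) for s in sections) <= context_window
```

A heuristic like this is cheap enough to run on every composition pass; an exact tokenizer is only needed when the estimate lands near the budget boundary.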
## Context Assembly Order
1. System Instructions (role, rules, constraints)
2. Retrieved Behaviors (from BCI)
3. Knowledge Pack Overlays (domain-specific context)
4. Conversation History (trimmed to fit budget)
5. User Message

Each layer has a token budget. If the total exceeds the model's context window, lower-priority layers get trimmed first (conversation history → knowledge packs → behaviors).
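The layering and trim order above can be sketched as follows. The layer names, `estimate_tokens` heuristic, and `compose` signature are all hypothetical, not the actual Amprealize API; the sketch drops whole layers rather than partially trimming them, which a real composer would likely do for conversation history.

```python
def estimate_tokens(text: str) -> int:
    # Rough ~4 chars/token heuristic; a real composer would use
    # the target model's tokenizer.
    return max(1, len(text) // 4)

# Lowest-priority layers are trimmed first, per the order above.
TRIM_ORDER = ["conversation_history", "knowledge_packs", "behaviors"]

def compose(layers: dict[str, str], context_window: int) -> dict[str, str]:
    """Drop low-priority layers until the total fits the context window."""
    layers = dict(layers)  # avoid mutating the caller's dict
    total = sum(estimate_tokens(v) for v in layers.values())
    for name in TRIM_ORDER:
        if total <= context_window:
            break
        if name in layers:
            total -= estimate_tokens(layers.pop(name))
    return layers
```

Note that system instructions and the user message never appear in `TRIM_ORDER`: they are the highest-priority layers and must survive trimming for the prompt to remain coherent.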
## Key Files
- `amprealize/context_composer.py` — Orchestrates assembly
- `amprealize/context_resolver.py` — Resolves references and dependencies
- `amprealize/context.py` — Context data structures
## See Also
- BCI In Practice — How behaviors get retrieved before composition
- Agent Orchestration In Practice — What happens after the prompt is composed