The Context Guillotine

Difficulty: HARD | ID: ai-context-001

The Scenario

Your chatbot has a 4000-token context limit. Users complain that the bot "forgets" things mentioned 20 messages ago. Currently, you just truncate the oldest messages.

The Problem

Truncation deletes information; the exchange and sketch below show the failure.

  • User: "My name is Alice." (Message #1)
  • ... 50 messages later ...
  • User: "What is my name?"
  • Bot: "I don't know." (Message #1 was deleted)
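
To make the failure concrete, here is a minimal sketch of the naive strategy. It assumes messages are plain dicts with "role" and "content" keys and uses a hypothetical count_tokens helper (a real implementation would use the model's tokenizer); neither is part of the mission API.

    def count_tokens(message: dict) -> int:
        # Rough assumption: about one token per four characters of content.
        return max(1, len(message["content"]) // 4)

    def truncate_history(history: list[dict], limit: int = 4000) -> list[dict]:
        # Naive strategy: drop the oldest messages until the total fits the budget.
        trimmed = list(history)
        while trimmed and sum(count_tokens(m) for m in trimmed) > limit:
            trimmed.pop(0)  # "My name is Alice." is among the first to go.
        return trimmed

Once the budget is exceeded, Message #1 is the first thing popped, which is exactly why the bot can no longer answer "What is my name?".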

The Goal

Implement Summarization:

  1. When history exceeds the limit, don't just delete.
  2. Call mock_summarize(text) on the oldest messages.
  3. Replace them with a single "System Summary" message (see the sketch below).
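
A sketch of steps 2 and 3, reusing the message shape from the earlier sketch. The mock_summarize stub below only stands in for whatever the mission environment provides, and fold_into_summary is a name chosen here, not part of the mission API.

    def mock_summarize(text: str) -> str:
        # Stand-in for the mission-provided summarizer.
        return "Earlier conversation (condensed): " + text[:200]

    def fold_into_summary(old_messages: list[dict]) -> dict:
        # Flatten the messages being dropped into one transcript, summarize it,
        # and wrap the result as a single "System Summary" message.
        transcript = "\n".join(f"{m['role']}: {m['content']}" for m in old_messages)
        return {"role": "system", "content": "[System Summary] " + mock_summarize(transcript)}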

Requirements:

  • Target limit: 4000 tokens.
  • If history > limit:
    • Identify oldest messages to drop.
    • Summarize them.
    • Insert summary at the start of history.
  • Preserve System Prompt and Latest User Message (a fuller end-to-end sketch follows this list).
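
Tying the requirements together, here is one possible compress_history, reusing the count_tokens and fold_into_summary helpers sketched above. Treating history[0] as the System Prompt, history[-1] as the Latest User Message, and reading "start of history" as directly after the system prompt are all assumptions, not requirements stated by the mission.

    def compress_history(history: list[dict], limit: int = 4000) -> list[dict]:
        # Nothing to do if the conversation already fits the budget.
        if sum(count_tokens(m) for m in history) <= limit:
            return history

        # Assumption: history[0] is the system prompt, history[-1] is the
        # latest user message; both must survive untouched.
        system_prompt, middle, latest = history[0], list(history[1:-1]), history[-1]

        dropped = []
        # Peel off the oldest middle messages until the rest fits the budget.
        while middle and sum(
            count_tokens(m) for m in [system_prompt, *middle, latest]
        ) > limit:
            dropped.append(middle.pop(0))

        if not dropped:
            return history

        # "Start of history" is read here as: directly after the system prompt.
        # Note: the summary's own token cost is not re-checked in this sketch.
        return [system_prompt, fold_into_summary(dropped), *middle, latest]

With this in place, Alice's name would survive inside the "[System Summary]" message instead of being lost to truncation.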