Commit bc1571f

jx453331958 authored and claude committed
fix: use first entry per message ID for correct output_tokens
Claude Code JSONL writes multiple entries per message (streaming chunks). The FIRST entry contains the correct final usage from the API response; later entries are intermediate states with inflated cumulative output_tokens.

Previously we kept the entry with the highest output_tokens, causing output to be ~9x higher than actual. Now we keep the first entry, matching the ccusage CLI's behavior (which also deduplicates by messageId:requestId, keeping the first).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
1 parent 1a3a119 commit bc1571f
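The first-entry deduplication described in the commit message can be sketched in Python. This is a standalone illustration of the logic, not the actual code in agent/agent.py; the `dedupe_first_entry` helper and the entry field names are assumptions for the example:

```python
def dedupe_first_entry(entries):
    """Keep only the first entry seen for each message ID.

    Claude Code JSONL can contain several entries per message (streaming
    chunks); the first one carries the final usage, while later ones hold
    inflated cumulative output_tokens.
    """
    msg_map = {}
    for entry in entries:
        msg_id = entry.get("message_id")
        # First-wins: skip any message ID we have already recorded.
        if msg_id is not None and msg_id not in msg_map:
            msg_map[msg_id] = entry
    return list(msg_map.values())


# Streaming chunks for one message: the first has the final usage,
# later chunks show inflated cumulative counts.
entries = [
    {"message_id": "msg_1", "output_tokens": 42},
    {"message_id": "msg_1", "output_tokens": 180},
    {"message_id": "msg_1", "output_tokens": 390},
]
print(dedupe_first_entry(entries))
# [{'message_id': 'msg_1', 'output_tokens': 42}]
```

Under the old max-output_tokens rule, the same input would have yielded 390 instead of 42, which is the ~9x inflation the commit fixes.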

2 files changed: 6 additions & 5 deletions

agent/agent.js

3 additions & 3 deletions

@@ -329,9 +329,9 @@ function parseJsonlFile(filePath) {
         _msgId: msgId,
       };

-      // Keep the entry with the largest output_tokens per message ID
-      const existing = msgMap.get(msgId);
-      if (!existing || outputTokens >= existing.output_tokens) {
+      // Keep the FIRST entry per message ID (it has the correct final usage)
+      // Later entries are streaming chunks with cumulative intermediate values
+      if (!msgMap.has(msgId)) {
        msgMap.set(msgId, recordData);
      }
    } catch (err) {

agent/agent.py

3 additions & 2 deletions

@@ -261,8 +261,9 @@ def parse_jsonl_file(file_path: Path, state: State) -> List[Dict]:
            '_msg_id': msg_id,
        }

-       # Keep the entry with the largest output_tokens per message ID
-       if msg_id not in msg_map or output_tokens >= msg_map[msg_id]['output_tokens']:
+       # Keep the FIRST entry per message ID (it has the correct final usage)
+       # Later entries are streaming chunks with cumulative intermediate values
+       if msg_id not in msg_map:
            msg_map[msg_id] = record_data

    except json.JSONDecodeError:
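The commit message notes that the ccusage CLI deduplicates on a composite messageId:requestId key, also keeping the first occurrence. A minimal sketch of that variant (the helper name and field names are assumptions, not ccusage's actual source):

```python
def dedupe_by_composite_key(entries):
    """First-wins dedup on a messageId:requestId composite key (assumed shape)."""
    seen = set()
    result = []
    for entry in entries:
        key = f"{entry.get('message_id')}:{entry.get('request_id')}"
        # Only the first entry for each (message, request) pair is kept.
        if key not in seen:
            seen.add(key)
            result.append(entry)
    return result


entries = [
    {"message_id": "m1", "request_id": "r1", "output_tokens": 42},
    {"message_id": "m1", "request_id": "r1", "output_tokens": 180},
    {"message_id": "m1", "request_id": "r2", "output_tokens": 10},
]
print([e["output_tokens"] for e in dedupe_by_composite_key(entries)])
# [42, 10]
```

The composite key distinguishes retries of the same message under different request IDs, which a plain message-ID key would collapse.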
