From 64bbbcc9d17422d038aec35d55496b93a3778571 Mon Sep 17 00:00:00 2001
From: Justin Cechmanek
Date: Thu, 15 Aug 2024 15:42:56 -0700
Subject: [PATCH] adds new notebook to README

---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index 569c266..10732fd 100644
--- a/README.md
+++ b/README.md
@@ -76,6 +76,7 @@ LLMs are stateless. To maintain context within a conversation chat sessions must
 | Recipe | Description |
 | --- | --- |
 | [/session-manager/00_session_manager.ipynb](python-recipes/session-manager/00_session_manager.ipynb) | Chat session manager with semantic similarity |
+| [/session-manager/01_multiple_sessions.ipynb](python-recipes/session-manager/01_multiple_sessions.ipynb) | Handle multiple simultaneous chats with one instance |
 
 ## Semantic Cache
 An estimated 31% of LLM queries are potentially redundant ([source](https://arxiv.org/pdf/2403.02694)). Redis enables semantic caching to help cut down on LLM costs quickly.