feat(redis): add redis memory backend and redis memory example #376 #377


Open · wants to merge 2 commits into base: develop
Conversation

briancaffey (Contributor)

Description

Closes #376

By submitting this PR I confirm:

  • I am familiar with the Contributing Guidelines.
  • We require that all contributors "sign-off" on their commits. This certifies that the contribution is your original work, or you have rights to submit it under the same license, or a compatible license.
    • Any contribution which contains commits that are not Signed-Off will not be accepted.
  • When the PR is ready for review, new or existing tests cover these changes.
  • When the PR is ready for review, the documentation is up to date with these changes.

copy-pr-bot bot commented Jun 16, 2025

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@mdemoret-nv mdemoret-nv added feature request New feature or request non-breaking Non-breaking change labels Jun 16, 2025
@mdemoret-nv (Collaborator) left a comment


Great PR! Just a couple of changes needed. I'll enable CI and we can merge once CI is passing and all comments are resolved.

Thank you!

Comment on lines +83 to +92
logger.info("Computing embedding for memory text")
search_vector = self._embedder.embed_query(memory_text)
logger.info(f"Generated embedding vector of length: {len(search_vector)}")
memory_data["embedding"] = search_vector

try:
    # Store as JSON in Redis
    logger.info(f"Attempting to store memory data in Redis for key: {memory_key}")
    await self._client.json().set(memory_key, "$", memory_data)
    logger.info(f"Successfully stored memory data for key: {memory_key}")
Collaborator

These logging statements will all print at the INFO level on every call. Can we change these to logger.debug()?

Collaborator

This should be applied consistently throughout the entire file.


self._client = redis_client
self._key_prefix = key_prefix
self._embedder = embedder
Collaborator

Check for None values for redis_client and embedder. The checks later in the file for `if self._embedder:` can then be removed.

@@ -38,6 +38,7 @@ llms:
nim_llm:
_type: nim
model_name: meta/llama-3.3-70b-instruct
# base_url: http://192.168.5.173:8000/v1
Collaborator

Was this intentional? Can it be removed?

@mdemoret-nv (Collaborator)

/ok to test 7ead134
