[WIP] AI/LLM integration #1325

Draft · wants to merge 149 commits into base: develop

Changes from 1 commit

Commits (149)
e09e15a
start from scratch again
perfectra1n Mar 3, 2025
f2a6f92
hey look, it doesn't crash again
perfectra1n Mar 3, 2025
9f84a84
Merge branch 'develop' into ai-llm-integration
perfectra1n Mar 8, 2025
b248a7a
create embedding services
perfectra1n Mar 8, 2025
b97c8dd
set up DB migrations
perfectra1n Mar 8, 2025
1ff5bc6
set up embedding providers here?
perfectra1n Mar 8, 2025
c442943
add additional AI / LLM options and translations
perfectra1n Mar 8, 2025
1361e4d
set up embedding API endpoints
perfectra1n Mar 8, 2025
ea6f9c8
initialize embeddings if option is enabled
perfectra1n Mar 8, 2025
d3013c9
add additional options for ollama embeddings
perfectra1n Mar 8, 2025
553f7dd
fix the Ollama embedding model setting option breaking
perfectra1n Mar 8, 2025
dc439b2
update schema with our new tables
perfectra1n Mar 8, 2025
6ace4d5
nearly able to process embeddings
perfectra1n Mar 8, 2025
0daa9e7
I can create embeddings now?
perfectra1n Mar 8, 2025
0cd1be5
Show embedding generation stats to user
perfectra1n Mar 8, 2025
1ca98e2
update embedding stats every 5s for user
perfectra1n Mar 8, 2025
51c83bb
show fancier stats
perfectra1n Mar 8, 2025
19bf741
fancier embedding process stats
perfectra1n Mar 8, 2025
7e232d1
Create better relationships between notes, sanitize ridiculous spacin…
perfectra1n Mar 8, 2025
733fdcf
update relationship weights
perfectra1n Mar 8, 2025
adaac46
I'm 100% going to have to destroy this commit later
perfectra1n Mar 9, 2025
cf0e924
try a context approach
perfectra1n Mar 10, 2025
ef6ecdc
it errors, but works
perfectra1n Mar 10, 2025
c1585c7
actually shows useful responses now
perfectra1n Mar 10, 2025
75e18e4
Make the sources section fancier
perfectra1n Mar 10, 2025
bd97d97
this is pretty close to opening a new tab?
perfectra1n Mar 10, 2025
08626c7
when a user clicks on a source, don't swap focus
perfectra1n Mar 10, 2025
c386e34
Update the chat panel theme some
perfectra1n Mar 10, 2025
f482b3b
do a better job of extracting context
perfectra1n Mar 10, 2025
9834e77
fix context logic
perfectra1n Mar 10, 2025
ecc183f
almost completely styled codeblocks in response
perfectra1n Mar 10, 2025
d713f38
Merge branch 'develop' into ai-llm-integration
perfectra1n Mar 10, 2025
d2dc401
add these options as configurable
perfectra1n Mar 11, 2025
d413e60
update checkbox options in settings and update translations
perfectra1n Mar 11, 2025
ff679b0
move providers to their own folder
perfectra1n Mar 11, 2025
56fc720
undo accidental MAX_ALLOWED_FILE_SIZE_MB change
perfectra1n Mar 11, 2025
4160db9
fancier (but longer waiting time) messages
perfectra1n Mar 11, 2025
0985cec
implement chunking and use becca for some functionality
perfectra1n Mar 11, 2025
71b3b04
break up the huge context_extractor into smaller files
perfectra1n Mar 11, 2025
f47b070
I think this works to handle failed embeddings
perfectra1n Mar 11, 2025
1f661e4
make sure to not retry chunks if they fail or something else
perfectra1n Mar 11, 2025
6ce3f1c
better note names to LLM?
perfectra1n Mar 11, 2025
0d2858c
upgrade chunking
perfectra1n Mar 11, 2025
3f37196
add additional options for users
perfectra1n Mar 11, 2025
730d123
create llm index service
perfectra1n Mar 11, 2025
fc55995
allow users to manually request index to be rebuilt
perfectra1n Mar 11, 2025
72b1426
break up large vector_store into smaller files
perfectra1n Mar 12, 2025
eaa947e
"rebuild index" functionality for users
perfectra1n Mar 12, 2025
fcba151
allow for manual index rebuild, and ONLY rebuild the index
perfectra1n Mar 12, 2025
e5afbc6
better manage chunking errors
perfectra1n Mar 12, 2025
46a6533
update chunking management
perfectra1n Mar 12, 2025
73445d9
move chunking to its own folder
perfectra1n Mar 12, 2025
3fee82e
rename files with the same name
perfectra1n Mar 12, 2025
b6df3a7
allow user to select *where* they want to generate embeddings
perfectra1n Mar 12, 2025
a930b79
synchronize embeddings
perfectra1n Mar 12, 2025
39d265a
Merge branch 'develop' into ai-llm-integration
perfectra1n Mar 12, 2025
c914aaa
do a better job of handling failed note embeddings
perfectra1n Mar 12, 2025
ee7b228
correctly style the failed embeddings section
perfectra1n Mar 12, 2025
d4cfc65
yes, now the failed embeddings section at least looks passable
perfectra1n Mar 12, 2025
8d7e5c8
fix the maths for stats
perfectra1n Mar 12, 2025
67766e3
create note_embedding object for becca
perfectra1n Mar 12, 2025
6bb4bbb
specially handle Buffer objects into Base64 and back for Becca
perfectra1n Mar 12, 2025
4796c24
Merge branch 'develop' into ai-llm-integration
perfectra1n Mar 12, 2025
f8d4088
Merge branch 'develop' into ai-llm-integration
perfectra1n Mar 13, 2025
c556989
Merge branch 'develop' into ai-llm-integration
perfectra1n Mar 15, 2025
572a03a
Merge branch 'develop' into ai-llm-integration
perfectra1n Mar 16, 2025
697d348
set up more reasonable context window and dimension sizes
perfectra1n Mar 16, 2025
c315b32
wait for DB init event to emit before starting LLM services
perfectra1n Mar 16, 2025
d8c9d3b
move these settings between db migrations
perfectra1n Mar 16, 2025
0081e6f
fix sql error and add missing options
perfectra1n Mar 16, 2025
781a250
fix embeddings w/ cls.init()
perfectra1n Mar 16, 2025
d2072c2
"lock" notes that are having their embeddings created
perfectra1n Mar 16, 2025
ab3c6b6
remove options from migrations files
perfectra1n Mar 16, 2025
cc85b9a
fix autoupdate name inconsistency
perfectra1n Mar 16, 2025
7b643a7
fix(llm): duplicate launch bar config
eliandoran Mar 16, 2025
d716713
refactor(llm): use dedicated widget for llm chat button
eliandoran Mar 16, 2025
5d0be30
feat(llm): show/hide LLM button based on setting
eliandoran Mar 16, 2025
1dafa65
fix(settings/llm): extra separators
eliandoran Mar 16, 2025
2853b8e
feat(options/llm): use tabs for provider configuration
eliandoran Mar 17, 2025
36a6d75
feat(options/llm): group into sections
eliandoran Mar 17, 2025
5a6d271
feat(options/llm): improve checkboxes
eliandoran Mar 17, 2025
e6cb06b
feat(options/llm): use form text style
eliandoran Mar 17, 2025
1efc923
feat(options/llm): use columns and separators
eliandoran Mar 17, 2025
fa99624
feat(options/llm): move stats at the top
eliandoran Mar 17, 2025
0ea7e10
feat(options/llm): improve style of tabs
eliandoran Mar 17, 2025
1844ad7
fix the isEnabled function
perfectra1n Mar 17, 2025
fe1faf7
show user at the top of settings if there are issues
perfectra1n Mar 17, 2025
79514b8
also add the errors to the top of the chat window
perfectra1n Mar 17, 2025
8d8c34c
fancier LLM/AI chat errors
perfectra1n Mar 17, 2025
5aef80f
fix openai endpoints
perfectra1n Mar 17, 2025
6d146c2
try ollama first, always
perfectra1n Mar 17, 2025
d95fd0b
allow specifying openai embedding models too
perfectra1n Mar 17, 2025
4a4eac6
Allow users to specify OpenAI embedding and chat models
perfectra1n Mar 17, 2025
c40c702
add anthropic options as well
perfectra1n Mar 17, 2025
14acd1c
improve LLM response parsing
perfectra1n Mar 17, 2025
7ee6cf6
add additional options and provider sorting
perfectra1n Mar 17, 2025
37f1dcd
add ability to fetch available models from openai
perfectra1n Mar 17, 2025
3268c43
improve embedding precedence
perfectra1n Mar 17, 2025
ebc5107
add missing options
perfectra1n Mar 17, 2025
43cf33c
make the AI settings even fancier to setting precedence
perfectra1n Mar 17, 2025
ac40fff
draggable options for LLM provider too
perfectra1n Mar 17, 2025
5ad730c
openai finally works, respect embedding precedence
perfectra1n Mar 17, 2025
84a8473
adapt or regenerate embeddings - allows users to decide
perfectra1n Mar 17, 2025
558f6a9
add translations
perfectra1n Mar 17, 2025
c372011
add Voyage AI as Embedding provider
perfectra1n Mar 17, 2025
08f7f19
do a better job with similarity searches
perfectra1n Mar 18, 2025
f05fe3f
set up embedding normalization
perfectra1n Mar 18, 2025
8129f8f
oh my goodness, saving these settings finally works
perfectra1n Mar 18, 2025
1a8ce96
stop the log spam
perfectra1n Mar 18, 2025
f6afb1d
set up agentic thinking
perfectra1n Mar 19, 2025
492c05b
clean up silly chat_widget that was in the wrong place
perfectra1n Mar 19, 2025
352204b
add agentic thinking to chat
perfectra1n Mar 19, 2025
db4dd6d
refactor "context" services
perfectra1n Mar 19, 2025
466b749
yeet deprecated function
perfectra1n Mar 19, 2025
d5efcfe
fix chat_service imports
perfectra1n Mar 19, 2025
2348096
get rid of this unused file too
perfectra1n Mar 19, 2025
024b063
Merge branch 'develop' into ai-llm-integration
perfectra1n Mar 19, 2025
5b81252
fix translation
perfectra1n Mar 19, 2025
0d4b6a7
update agent tools
perfectra1n Mar 19, 2025
90db570
agent tools do something now
perfectra1n Mar 19, 2025
4ff3c5a
agentic thinking really works now 🗿
perfectra1n Mar 19, 2025
e566692
centralize all prompts
perfectra1n Mar 20, 2025
eb1ef36
move the llm_prompt_constants to its own folder
perfectra1n Mar 20, 2025
273dff2
create a better base system prompt
perfectra1n Mar 20, 2025
9c1ab4f
add to base prompt
perfectra1n Mar 20, 2025
1be70f1
do a better job of building the context
perfectra1n Mar 20, 2025
915c95f
more aggressively filter notes out that don't work for us
perfectra1n Mar 20, 2025
c9728e7
also extract Note relationships and send as context
perfectra1n Mar 20, 2025
34940b5
Merge branch 'develop' into ai-llm-integration
perfectra1n Mar 20, 2025
3d70a6c
appropriately show if there are any notes still in the queue
perfectra1n Mar 20, 2025
0707266
reset embedding_queue where objects are "isprocessing"
perfectra1n Mar 20, 2025
150b0f0
remove isEnabled from embedding providers
perfectra1n Mar 24, 2025
567e9e8
Remove the drag-and-drop for settings, kept breaking
perfectra1n Mar 24, 2025
9d29ff4
don't spam the logs if a provider isn't enabled
perfectra1n Mar 24, 2025
b00c20c
Merge branch 'develop' into ai-llm-integration
perfectra1n Mar 24, 2025
f1ecc15
might have to delete this later, fixing the right-pane-container
perfectra1n Mar 24, 2025
3534399
yerp that was it
perfectra1n Mar 24, 2025
654ed47
fix embedding provider precedence settings issue
perfectra1n Mar 24, 2025
44b6734
anthropic works
perfectra1n Mar 26, 2025
c49883f
move constants to their own files and folder
perfectra1n Mar 26, 2025
a50575c
move more prompts to the constants file
perfectra1n Mar 26, 2025
5869eaf
move more constants from files into centralized location
perfectra1n Mar 26, 2025
7138053
move providers.ts into providers folder
perfectra1n Mar 26, 2025
7c519df
fix prompt path import
perfectra1n Mar 26, 2025
15630fb
add swaggerUI docstrings for LLM/AI API routes
perfectra1n Mar 26, 2025
baef5f9
fix updateProvider parameter
perfectra1n Mar 26, 2025
35fbc73
Merge branch 'develop' into ai-llm-integration
eliandoran Mar 26, 2025
a7cafce
more heavily weigh notes with title matches when giving context to LLM
perfectra1n Mar 26, 2025
5456ac3
set up embedding similarity constants and similarity system
perfectra1n Mar 26, 2025
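
The closing commits here (f05fe3f "set up embedding normalization", 08f7f19 "do a better job with similarity searches", 5456ac3 "set up embedding similarity constants and similarity system") revolve around ranking stored note embeddings against a query embedding. As a rough TypeScript sketch of that technique only (none of these names come from the PR), normalization plus cosine similarity usually looks something like this:

// Illustrative sketch, not the PR's code: scale vectors to unit length, then rank by cosine similarity.
function normalize(vec: Float32Array): Float32Array {
    let norm = 0;
    for (const v of vec) norm += v * v;
    norm = Math.sqrt(norm) || 1; // guard against an all-zero vector
    return vec.map((v) => v / norm);
}

// For unit-length vectors, cosine similarity reduces to a plain dot product.
function cosineSimilarity(a: Float32Array, b: Float32Array): number {
    let dot = 0;
    for (let i = 0; i < a.length; i++) dot += a[i] * b[i];
    return dot;
}

// Rank note embeddings against a query embedding and keep the closest matches.
function topMatches(query: Float32Array, notes: { noteId: string; embedding: Float32Array }[], limit = 5) {
    const q = normalize(query);
    return notes
        .map((n) => ({ noteId: n.noteId, score: cosineSimilarity(q, normalize(n.embedding)) }))
        .sort((a, b) => b.score - a.score)
        .slice(0, limit);
}

If the stored embeddings are instead normalized once at write time, the per-query cost drops to a single dot product per note, which is presumably what the "embedding normalization" commit is after.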
remove options from migrations files
perfectra1n (Jon Fuller) committed Mar 16, 2025 · signed with a verified signature
commit ab3c6b6fb1ed318d533d74747541b77825f99b40
42 changes: 0 additions & 42 deletions db/migrations/0229__ai_llm_options.sql

This file was deleted.

File renamed without changes.
2 changes: 1 addition & 1 deletion src/services/app_info.ts
@@ -5,7 +5,7 @@ import build from "./build.js";
 import packageJson from "../../package.json" with { type: "json" };
 import dataDir from "./data_dir.js";

-const APP_DB_VERSION = 230;
+const APP_DB_VERSION = 229;
 const SYNC_VERSION = 35;
 const CLIPPER_PROTOCOL_VERSION = "1.0";

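A side note on the version change (my reading, not stated in the diff): APP_DB_VERSION appears to track the highest numbered migration, so with 0229__ai_llm_options.sql deleted and another migration presumably renamed into its place (the "File renamed without changes." entry above), the value drops from 230 back to 229. A hypothetical sketch of how a migration runner might use such a constant, not Trilium's actual code:

// Hypothetical sketch: apply every migration newer than the stored DB version, up to the app's declared version.
const APP_DB_VERSION = 229;

function pendingMigrations(storedDbVersion: number, availableMigrations: number[]): number[] {
    return availableMigrations
        .filter((n) => n > storedDbVersion && n <= APP_DB_VERSION)
        .sort((a, b) => a - b);
}

// Example (assumed numbers): pendingMigrations(228, [227, 228, 229]) returns [229].
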
2 changes: 1 addition & 1 deletion src/services/options_init.ts
@@ -180,7 +180,7 @@ const defaultOptions: DefaultOption[] = [
 { name: "ollamaDefaultModel", value: "llama3", isSynced: true },
 { name: "ollamaBaseUrl", value: "http://localhost:11434", isSynced: true },
 { name: "ollamaEmbeddingModel", value: "nomic-embed-text", isSynced: true },
-{ name: "embeddingAutoUpdate", value: "true", isSynced: true },
+{ name: "embeddingAutoUpdateEnabled", value: "true", isSynced: true },

 // Adding missing AI options
 { name: "aiTemperature", value: "0.7", isSynced: true },
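
From the options_init.ts hunk, the DefaultOption shape can be inferred as a name/value/isSynced record. Below is a minimal sketch, assuming only that shape, of how startup code might seed the AI defaults that are not already stored; missingDefaults and the existing map are illustrative, not Trilium's actual API:

// Shape inferred from the diff above; everything else is a hypothetical illustration.
interface DefaultOption {
    name: string;
    value: string;
    isSynced: boolean;
}

const aiDefaults: DefaultOption[] = [
    { name: "ollamaEmbeddingModel", value: "nomic-embed-text", isSynced: true },
    { name: "embeddingAutoUpdateEnabled", value: "true", isSynced: true },
    { name: "aiTemperature", value: "0.7", isSynced: true }
];

// Hypothetical helper: keep only the defaults whose option name is not already stored.
function missingDefaults(existing: Map<string, string>, defaults: DefaultOption[]): DefaultOption[] {
    return defaults.filter((opt) => !existing.has(opt.name));
}

// Example: a database that already has aiTemperature set would only receive the first two entries.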