Releases · Coelacanthiform/TALKER-Expanded
alpha-testing-00027g
Hotfix 6
- Added filter for anomaly and radiation damage to the Injury trigger.
- Improved injury trigger event info to account for self-damage and environmental damage
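The filter is conceptually along the lines of the sketch below (simplified, assuming the usual Anomaly `actor_on_before_hit` callback and the engine's `hit` type constants; `should_trigger_injury_dialogue` is just an illustrative name, not the actual script):

```lua
-- Sketch: skip injury dialogue for anomaly/radiation and self-inflicted damage.
-- Hit type constants come from the X-Ray 'hit' class.
local filtered_hit_types = {
    [hit.radiation]     = true,  -- radiation fields
    [hit.chemical_burn] = true,  -- chemical anomalies
    [hit.telepatic]     = true,  -- psy damage
    [hit.shock]         = true,  -- electro anomalies
    [hit.burn]          = true,  -- thermal anomalies
}

-- Hypothetical helper: decide whether a hit should produce an injury callout.
function should_trigger_injury_dialogue(shit)
    if filtered_hit_types[shit.type] then
        return false  -- environmental/anomaly damage: no callout
    end
    if shit.draftsman and shit.draftsman:id() == db.actor:id() then
        return false  -- self-damage: noted in event info instead of a callout
    end
    return true
end
```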
alpha-testing-00027f
Hotfix 5
- Added yet another safety check to the task giver rank check in the Task trigger
- Maybe reduced console spam from weapon_unstrapped
alpha-testing-00027e
Hotfix 4
- Added nil fallbacks to the Task trigger, plus fallbacks against dummy objects
- Added protection against a race condition in the Task trigger that could cause crashes if the task giver was deleted at the exact moment of task completion (roughly the kind of guard sketched below)
- Removed token limits for memory management replies and disabled the long-term memory safety reset. This will stop "out of memory" errors on very long save files.
- Reduced default chance to ask the player a question in Idle Conversation trigger from 50% to 15%
Long-term memory is now effectively uncapped and could theoretically get very long on a really long save file. Will monitor feedback on how this behaves and whether a better long-term solution needs to be implemented.
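For the race-condition guard mentioned above, the idea is roughly the following (a simplified sketch; `get_task_giver_name` and the fallback strings are placeholders, not the actual script):

```lua
-- Sketch: resolve the task giver defensively, so a giver deleted at the exact
-- moment of completion (or a dummy object) can't crash the trigger.
function get_task_giver_name(task_giver_id)
    if not task_giver_id then
        return "an unknown task giver"            -- nil fallback
    end
    local se_obj = alife() and alife():object(task_giver_id)
    if not se_obj then
        return "an unknown task giver"            -- giver already deleted (race condition)
    end
    -- pcall guards against dummy objects that don't expose character_name()
    local ok, name = pcall(function() return se_obj:character_name() end)
    if not ok or not name or name == "" then
        return "an unknown task giver"
    end
    return name
end
```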
alpha-testing-00027d
Hotfix 3
- Idle Conversation script now uses neither actor_binder nor CreateTimeEvent. Should be busyhands-proof.
alpha-testing-00027c
Hotfix 2
- Stricter XML loading for anomaly trigger to prevent misattribution
alpha-testing-00027b
CHANGELOG
- Fixed anti-spam function for repeat callouts (the dynamic event info format broke it, but using flags instead of string matching is probably better anyway; see the sketch below).
- Increased max token limit for responses from 2500 to 3500
- Increased memory reset threshold from 8500 characters to 13500 characters. Hopefully this, combined with the above, gives the AI a little more leeway when it overshoots the long-term memory character limit on very long save files.
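For reference, the flag-based anti-spam check is conceptually along these lines (a simplified sketch; the names and the 10-second window are placeholders, not the actual values):

```lua
-- Sketch: de-duplicate repeat callouts with explicit event flags instead of
-- matching on the (now dynamic) event info strings.
local recent_callouts = {}          -- [event_flag] = time_global() of last callout
local REPEAT_WINDOW_MS = 10000      -- illustrative 10s window

function is_repeat_callout(event_flag)
    local now = time_global()
    local last = recent_callouts[event_flag]
    if last and (now - last) < REPEAT_WINDOW_MS then
        return true                 -- same flag fired recently: suppress dialogue
    end
    recent_callouts[event_flag] = now
    return false
end
```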
alpha-testing-00027
CHANGELOG
- Added MCM option for dialogue message display length
- Added improved weapon context for characters: current weapon status now shows whether their weapon is holstered or drawn
- The current weapon status of all nearby characters is now injected into the prompt's scene context, hopefully reducing confusion about who is using which weapon
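Reading the holstered/drawn status is roughly like the sketch below (simplified; it assumes Anomaly's `active_item()` and the `IsWeapon` helper, and `describe_weapon_state` is just an illustrative name):

```lua
-- Sketch: describe whether an NPC currently has a weapon in hand or holstered,
-- so the scene context can say who is actually holding what.
function describe_weapon_state(npc)
    local item = npc:active_item()   -- item currently in hands, or nil
    if item and IsWeapon(item) then
        return string.format("%s has a %s in hand", npc:character_name(), item:section())
    end
    return string.format("%s has no weapon drawn", npc:character_name())
end
```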
Fixes
- Added missing tonumber in injury trigger
- Added missing initial load of queries in sleep and reload triggers
alpha-testing-00026b
alpha-testing-00026
CHANGELOG
New Features
- Silent events: Added support for events that are registered in event history but don't immediately trigger dialogue. This lets NPCs "see" them happening even without immediately commenting on them.
- Many event triggers can now be set to "Silent" in the MCM. If you want NPCs to notice and remember a type of event but don't want immediate dialogue when it happens, you can set it to silent (e.g., for anomalies).
- Added cooldowns to the Death Trigger, utilizing the new silent events.
- The Death trigger uses silent events whenever the cooldown isn't ready: the cooldown limits how often a killing can produce dialogue, but deaths that occur while it is active are still registered as silent events.
- This should reduce dialogue flood in firefights without making NPCs "blind" to, or forgetful of, the deaths happening around them.
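The mechanism is roughly the sketch below (simplified; `register_event` and the 30-second cooldown are placeholders, not the actual TALKER event API):

```lua
-- Sketch: deaths always go into event history, but only produce dialogue when
-- the cooldown has expired; otherwise they are registered as "silent" events.
local DEATH_DIALOGUE_COOLDOWN_MS = 30000   -- illustrative value
local last_death_dialogue = 0

function on_npc_death(event_info)
    local now = time_global()
    if now - last_death_dialogue >= DEATH_DIALOGUE_COOLDOWN_MS then
        last_death_dialogue = now
        register_event(event_info, { silent = false })  -- normal event: NPCs may comment
    else
        register_event(event_info, { silent = true })   -- cooldown active: remembered, no dialogue
    end
end
```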
- Added support for disguises.
- Event system fully supports disguises. Companions should be aware that you're disguised and play along. If you're disguised as an "organized faction" (i.e., NOT Loner, Bandit or Renegade) and you talk to someone of the same faction, they might be a little suspicious of you.
- Disguise information is stored in event history. I haven't tested disguises over the long term (e.g., talking to somebody while wearing a disguise, then returning to that person without one). The AI should be able to handle this well, but more specific instructions might be needed.
- Support for dynamic world state.
- Certain important events are injected into the prompt as context if they have happened in your save. This includes: Miracle Machine being disabled, Brain Scorcher being disabled, Faction Leaders being dead, and a select list of "important characters" being dead.
- Injection of important character deaths is context-aware, only injecting if they are mentioned in recent events, if their area is mentioned, or if you are currently in the area they were in.
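Conceptually, the world-state injection works like the sketch below (simplified; the infoportion names here are placeholders, not the actual flags the mod checks):

```lua
-- Sketch: build world-state context lines from story flags. The infoportion
-- names are placeholders; the real ones depend on the storyline scripts.
local world_state_checks = {
    { info = "miracle_machine_disabled_info", text = "The Miracle Machine has been shut down." },
    { info = "brain_scorcher_disabled_info",  text = "The Brain Scorcher has been shut down." },
}

function build_world_state_context()
    local lines = {}
    for _, check in ipairs(world_state_checks) do
        if has_alife_info(check.info) then
            table.insert(lines, check.text)
        end
    end
    return table.concat(lines, "\n")
end
```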
- Support for female player characters.
- MCM option for LASS users or anyone else who would like to be addressed with female pronouns.
- Greatly improved logging
- New talker_debug.log file in TALKER/logs (keeping app logging separate from microphone logging in talker.log)
- 4-way toggle in the MCM for style of logging: Off/Console-only/File-only/Both (default: File-only); see the sketch below.
- On File-only, console printing is reduced to only warnings and errors. Full debug logging is available in the file TALKER/logs/talker_debug.log (which is wiped every session).
- Downgraded various log messages from warnings to info or debug to cut console spam
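The logging toggle behaves roughly like this sketch (simplified; the MCM key, the log path handling and the `talker_log` name are placeholders, not the actual implementation):

```lua
-- Sketch: route debug messages according to the MCM logging mode.
local LOG_PATH = "TALKER\\logs\\talker_debug.log"  -- placeholder; real path resolved elsewhere

function talker_log(level, msg)
    local mode = ui_mcm.get("talker/log_mode") or "file"  -- off/console/file/both
    local is_important = (level == "WARN" or level == "ERROR")

    -- File-only still prints warnings/errors to the console
    if mode == "console" or mode == "both" or (mode == "file" and is_important) then
        printf("[TALKER|%s] %s", level, msg)
    end
    if mode == "file" or mode == "both" then
        local f = io.open(LOG_PATH, "a")
        if f then
            f:write(string.format("[%s] %s\n", level, msg))
            f:close()
        end
    end
end
```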
- Improved Game Files Load Check
- The load check now verifies the integrity of the core Lua script files and reports errors more accurately.
- Initialization errors should no longer produce false positives or misleading errors like "TALKER module X not loaded".
- Non-artifact wearables
- Added support to the artifact script for certain non-artifact wearable items such as Camelbaks, backpack frames and armor inserts.
- Added MCM configurable HP threshold for the Injury Trigger
- Added safety trap fallback for long-term memories that have exceeded the character limit due to model glitches or other reasons.
- If a long-term memory text exceeds 8500 characters, it is deleted, triggering an immediate use of the already existing 'bootstrapping' function previously only used for backup cases or migration from old versions/base TALKER.
- This tells the LLM to regenerate a new long-term memory text using the complete list of all raw events that character has witnessed.
- This prevents cases where, due to model glitches or models that are bad at following instructions for non-dialogue output, a character could end up with a long-term memory so long that the "current long-term memory" input exceeded the maximum token limit of the output. This could cause model crashes, or the model simply returning the entire unedited input as output, essentially "freezing" that character's long-term memory.
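Conceptually, the safety trap is along these lines (a simplified sketch; `delete_long_term_memory` and `bootstrap_long_term_memory` are placeholder names for the existing deletion and bootstrapping paths):

```lua
-- Sketch: if a character's long-term memory text has blown past the limit,
-- throw it away and rebuild it from the raw event list via bootstrapping.
local LONG_TERM_MEMORY_CHAR_LIMIT = 8500

function check_long_term_memory(character_id, memory_text)
    if memory_text and string.len(memory_text) > LONG_TERM_MEMORY_CHAR_LIMIT then
        delete_long_term_memory(character_id)      -- discard the runaway text
        bootstrap_long_term_memory(character_id)   -- regenerate from all raw events
        return true
    end
    return false
end
```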
Tweaks
- Clarified prompt with better use of section references to the previously implemented XML tags, instead of carelessly duplicating the tags themselves.
- Slight reworking of the compress memories prompt
- Restructured memory management prompt to differentiate between regular memory maintenance and 'bootstrapping' due to migration or memory reset.
Fixes
- Fixed the Task trigger not working with modded tasks and some other tasks like turning in tools to techs.
- Fixed Task trigger not always extracting the display name of the task properly.
- Fixed Hierarchical Memory system to hopefully actually work as intended in all cases (raw events -> compressed memory -> long-term memory)
- Fixed a poor implementation that caused unintended behaviour for new characters (fewer than ~20 events): it made them trigger memory bootstrapping using the long-term memory management prompt instead of the intended flow (first compression, then direct promotion of the first compressed memory into long-term memory, then memory management on subsequent compression events).
- Improved compatibility with Gemini 2.5 models
- Added output cleaner for the "pick next speaker" function to minimize the chances of malformed output (see the sketch after this list)
- Added message normalizer for Gemini, currently only used for the pick speaker function
- Added a FINAL INSTRUCTION to the pick speaker prompt repeating the enforcement of strict JSON object output.
- Fixed a nil value error in logger.lua
- Better error messaging on failed AI requests in proxy.
- All events now use dynamic event info, removing any edge case where traces of Reputation or Ranks for zombies and monsters might have remained.
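The output cleaner for the pick speaker function is conceptually similar to the sketch below (simplified, plain Lua string handling; `clean_speaker_output` is an illustrative name, not the actual function):

```lua
-- Sketch: strip markdown fences and anything outside the outermost braces so
-- the "pick next speaker" reply parses as a bare JSON object.
function clean_speaker_output(raw)
    if not raw then return nil end
    local text = raw:gsub("```json", ""):gsub("```", "")  -- drop code fences
    local first = text:find("{", 1, true)                 -- first opening brace
    local last = text:match(".*()}")                       -- position of last closing brace
    if first and last and last > first then
        return text:sub(first, last)                       -- keep only {...}
    end
    return text
end
```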
alpha-testing-00025b
Changelog
- Added a reminder about not using action descriptions like *chuckles* or *(scratches head)* to the Final Instructions in the prompt