
Conversation

@Afee2019

@Afee2019 Afee2019 commented Dec 2, 2025

Summary

This PR adds throttling to ChatMessagesStore.emitListeners() to prevent "Maximum update depth exceeded"
warnings during SSE streaming.

Fixes #540
Related to #913

Problem

During SSE streaming (common with LLM integrations), each chunk triggers setMessages() → emitListeners() → React re-render. When chunks arrive rapidly, React detects nested updates exceeding its depth limit:

Warning: Maximum update depth exceeded. This can happen when a component
calls setState inside useEffect, but useEffect either doesn't have a
dependency array, or one of the dependencies changes on every render.
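
For context, a minimal sketch of the kind of streaming handler that produces this pattern; the handler name, store type, and message shape below are illustrative assumptions, not code from this repository:

// Illustrative only: the store type is simplified; the real ChatMessagesStore
// generic and message shape may differ.
type DemoMessage = { id: string; content: string };
type MessageStoreLike = {
  getMessages: () => DemoMessage[];
  setMessages: (messages: DemoMessage[]) => void;
};

function handleSseChunk(store: MessageStoreLike, chunk: string) {
  const messages = store.getMessages();
  const last = messages[messages.length - 1];
  if (!last) return;
  // Each chunk rewrites the list, so every chunk triggers
  // setMessages() -> emitListeners() -> React re-render.
  store.setMessages([
    ...messages.slice(0, -1),
    { ...last, content: last.content + chunk },
  ]);
}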

Solution

Add a 50ms throttle to emitListeners() with:

  • Leading edge: First update triggers immediately for responsive UI
  • Trailing edge: Pending updates are flushed after throttle interval
  • Initialization bypass: No throttle during constructor to avoid startup delay
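
To make the leading/trailing behavior concrete, here is a hedged test sketch (not included in this PR) using Jest fake timers; the import path, constructor arguments, and message shape are assumptions:

// Hedged sketch, not part of this PR. Assumes the public ChatMessagesStore API
// (subscribe/setMessages); the import path and constructor args are assumed.
import { ChatMessagesStore } from '../x-chat/store'; // path assumed

it('coalesces rapid setMessages calls into a leading and a trailing emit', () => {
  jest.useFakeTimers();
  const store = new ChatMessagesStore([{ id: 1, content: 'hi' }]);
  const listener = jest.fn();
  store.subscribe(listener);

  store.setMessages([{ id: 1, content: 'hi!' }]);   // leading edge: emits immediately
  store.setMessages([{ id: 1, content: 'hi!!' }]);  // inside the 50ms window: pending
  store.setMessages([{ id: 1, content: 'hi!!!' }]); // still pending, latest state wins

  expect(listener).toHaveBeenCalledTimes(1);

  jest.advanceTimersByTime(50);                     // trailing edge flushes once
  expect(listener).toHaveBeenCalledTimes(2);

  jest.useRealTimers();
});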

Changes

  • Added throttleTimer, pendingEmit, throttleInterval private properties
  • Added throttledEmitListeners() private method
  • Added setMessagesInternal() private method with throttle control
  • Modified setMessages() to use throttled emit
  • Modified constructor to use non-throttled initialization
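
Taken together, a self-contained sketch of how these pieces fit; the internal field names, constructor signature, and the exact body of setMessagesInternal() are assumptions here, and the real store has more methods and state:

// Sketch only: field names (messages, listeners), the constructor signature, and the
// body of setMessagesInternal are assumptions; the method names and throttle flag
// come from this PR's description.
class ThrottledMessagesStoreSketch<T extends { id: string | number }> {
  private messages: T[] = [];
  private listeners: Array<() => void> = [];
  private throttleTimer: ReturnType<typeof setTimeout> | null = null;
  private pendingEmit = false;
  private readonly throttleInterval = 50;

  constructor(defaultMessages: T[]) {
    // Initialization bypasses the throttle, so there is no startup delay.
    this.setMessagesInternal(defaultMessages, false);
  }

  subscribe = (listener: () => void) => {
    this.listeners.push(listener);
    return () => {
      this.listeners = this.listeners.filter((l) => l !== listener);
    };
  };

  setMessages = (messages: T[]) => {
    this.setMessagesInternal(messages, true); // public path is always throttled
  };

  private setMessagesInternal(messages: T[], throttle: boolean) {
    this.messages = messages;
    if (throttle) {
      this.throttledEmitListeners();
    } else {
      this.emitListeners();
    }
  }

  private throttledEmitListeners() {
    if (!this.throttleTimer) {
      this.emitListeners(); // leading edge: notify immediately
      this.pendingEmit = false;
      this.throttleTimer = setTimeout(() => {
        this.throttleTimer = null;
        if (this.pendingEmit) {
          this.emitListeners(); // trailing edge: flush updates made during the window
          this.pendingEmit = false;
        }
      }, this.throttleInterval);
    } else {
      this.pendingEmit = true; // coalesce calls inside the window
    }
  }

  private emitListeners() {
    this.listeners.forEach((listener) => listener());
  }
}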

Testing

Before

  • Dozens of "Maximum update depth exceeded" warnings during streaming
  • Console flooded with warnings affecting DX

After

  • Zero warnings
  • Smooth streaming output
  • No perceptible delay in UI updates

Compatibility

  • No breaking changes to public API
  • setMessages() signature unchanged
  • All existing tests should pass

Summary by CodeRabbit

Release Notes

  • New Features
    • Added a destroy() method to clean up the message store and release the throttle timer.
  • Bug Fixes
    • Refined the message-stream throttling so that high-frequency updates no longer trigger maximum-update-depth errors, improving streaming stability.


…th warnings

During SSE streaming, rapid setMessages() calls trigger excessive React
re-renders, causing "Maximum update depth exceeded" warnings.

This commit adds a 50ms throttle to emitListeners() with:
- Leading edge execution for responsive UI
- Trailing edge guarantee to prevent lost updates
- Initialization bypass to avoid startup delay

Fixes ant-design#540
@coderabbitai
Contributor

coderabbitai bot commented Dec 2, 2025

Note

Other AI code review bot(s) detected

CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

📝 Walkthrough

A throttled emission mechanism and an internal message-setting method are introduced in x-chat/store.ts to avoid the "Maximum update depth exceeded" error caused by streaming updates; a new destroy() method cleans up the timer and resets state; public API signatures (getMessages, addMessage, and so on) remain unchanged.

Changes

Cohort / File(s)  Change summary
Throttling and lifecycle management
packages/x-sdk/src/x-chat/store.ts
Added throttle state (throttleTimer, pendingEmit, throttleInterval) and a throttledEmitListeners implementation (immediate leading-edge emit, deferred trailing-edge flush); added a private setMessagesInternal(messages, throttle) method to unify throttled and non-throttled event emission; the constructor now initializes via setMessagesInternal(defaultMessages, false); a public destroy() method was added to clear the timer and reset state; existing method signatures (getMessages, addMessage, setMessage, removeMessage, getSnapshot, subscribe) are unchanged.

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

  • The changes are focused and the logic is clear, mostly contained in a single file.
  • Key review points:
    • Whether the timer (throttleTimer) cleanup and destroy() coverage are sufficient to avoid memory leaks.
    • Boundary behavior of throttledEmitListeners under high-frequency updates (leading/trailing consistency).
    • Whether setMessagesInternal is invoked on every message-update path (including external calls and subscription callbacks).

Poem

🐰 Messages surge like a wind-driven tide; the throttle keeps a gentle heartbeat,
The leading strike lands steady as a bowstring; the trailing flush closes without a stir,
The timer falls silent, leaving no dust; reads and writes stay clear and well.


@github-actions
Contributor

github-actions bot commented Dec 2, 2025

Prepare preview

@dosubot dosubot bot added the bug Something isn't working label Dec 2, 2025
@gemini-code-assist
Contributor

Summary of Changes

Hello @Afee2019, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces a crucial performance and stability enhancement to the ChatMessagesStore by implementing a throttling mechanism. The primary goal is to mitigate the 'Maximum update depth exceeded' warnings that arise from rapid, successive updates during real-time data streaming, such as with LLM integrations. By intelligently controlling the frequency of listener emissions, the change ensures a smoother user experience, prevents console clutter, and maintains UI responsiveness without altering the public interface.

Highlights

  • Problem Resolution: This pull request resolves the 'Maximum update depth exceeded' warning that occurs in React during Server-Sent Events (SSE) streaming, particularly with LLM integrations, where rapid data chunks trigger frequent re-renders.
  • Throttling Implementation: A 50ms throttle has been added to the ChatMessagesStore.emitListeners() method. This throttle employs a leading edge for immediate initial updates and a trailing edge to flush any pending updates after the interval, ensuring responsiveness while preventing excessive updates.
  • Initialization Bypass: The throttling mechanism is bypassed during the ChatMessagesStore constructor's initialization phase to avoid any startup delays, ensuring that initial messages are rendered without artificial latency.
  • API Compatibility: The changes introduce no breaking changes to the public API, and the signature of setMessages() remains unchanged, ensuring backward compatibility.

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request introduces a throttling mechanism to the ChatMessagesStore to prevent "Maximum update depth exceeded" errors in React during server-sent event streaming. The changes are well-structured, introducing a throttledEmitListeners method and refactoring setMessages to use an internal, configurable method. The implementation correctly uses a leading and trailing edge throttle to ensure UI responsiveness while batching rapid updates. My main feedback is a suggestion to consider resource cleanup for the newly introduced timer to improve long-term robustness, although it's not a critical issue for this change.

Comment on lines +28 to +30
private throttleTimer: ReturnType<typeof setTimeout> | null = null;
private pendingEmit: boolean = false;
private readonly throttleInterval: number = 50;

medium

The throttling logic using setTimeout is well-implemented to solve the rapid re-render issue. However, it introduces a timer that is not explicitly cleaned up. If a ChatMessagesStore instance is discarded (e.g., when the conversationKey changes in the useChatStore hook), a pending timer could be left dangling.

While the short 50ms duration makes the impact low (it would likely fire against a store with no listeners), it's a minor resource leak. For improved robustness and to prevent potential issues in the future, consider adding a cleanup mechanism.

A cleanup method could be added to the ChatMessagesStore to clear the timer:

public cleanup() {
  if (this.throttleTimer) {
    clearTimeout(this.throttleTimer);
    this.throttleTimer = null;
  }
}

Integrating this into the component lifecycle via useChatStore is complex due to shared stores. A simpler approach might be to enhance chatMessagesStoreHelper.delete to call this cleanup method when a store is explicitly deleted. This is likely outside the immediate scope of this PR, but it's a valuable consideration for maintainability.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (1)
packages/x-sdk/src/x-chat/store.ts (1)

84-86: The delegation preserves API compatibility.

setMessages correctly delegates to setMessagesInternal with throttling enabled. The public API signature is unchanged, in line with the PR goal of "no breaking changes to public API".

Note that addMessage, setMessage, and removeMessage call setMessages internally, so single-message operations are throttled as well. This is reasonable for streaming scenarios, but if non-streaming one-off updates are needed in the future, an optional throttle-control parameter may be worth considering.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 8fbbdbb and c1e0858.

📒 Files selected for processing (1)
  • packages/x-sdk/src/x-chat/store.ts (1 hunks)
🧰 Additional context used
🧠 Learnings (1)
📓 Common learnings
Learnt from: wzc520pyfm
Repo: ant-design/x PR: 833
File: components/use-x-chat/index.ts:256-256
Timestamp: 2025-05-17T05:29:02.977Z
Learning: In the 'use-x-chat' component, messages in the 'updating' state should be treated as normal content and must not be filtered out. Unlike the 'loading' and 'error' states, 'updating' represents an intermediate state where content is still being updated, but it is still valid content.
🔇 Additional comments (3)
packages/x-sdk/src/x-chat/store.ts (3)

27-30: The throttle state property declarations are sound.

The property types are correct, and a 50ms throttle interval is a reasonable choice for streaming updates.


57-63: The constructor correctly bypasses throttling to avoid startup delay.

Initializing via setMessagesInternal(defaultMessages, false) is correct: no listeners are attached at construction time, so throttling is unnecessary. This matches the PR goal that "initialization bypasses throttle to avoid startup delay".


65-82: Verify throttle/non-throttle concurrency behavior in setMessagesInternal.

The concern about a potential race condition between throttled and non-throttled calls to setMessagesInternal requires verification of the throttleTimer implementation and actual call sites. Specifically, confirm whether:

  • An active throttleTimer could exist when setMessagesInternal is called with throttle=false
  • The constructor (which calls with throttle=false) completes before any throttled updates begin
  • Clearing or synchronizing the timer on non-throttled calls is necessary

Without access to the throttleTimer management code and all call sites, this cannot be conclusively verified.

Comment on lines +38 to +55
private throttledEmitListeners() {
  if (!this.throttleTimer) {
    // Leading edge: execute immediately
    this.emitListeners();
    this.pendingEmit = false;

    this.throttleTimer = setTimeout(() => {
      this.throttleTimer = null;
      // Trailing edge: flush pending updates
      if (this.pendingEmit) {
        this.emitListeners();
        this.pendingEmit = false;
      }
    }, this.throttleInterval);
  } else {
    this.pendingEmit = true;
  }
}

⚠️ Potential issue | 🔴 Critical

Missing timer cleanup; risk of a memory leak.

throttleTimer is never cleared when the component unmounts or the store is destroyed. If the component unmounts while a timer is pending, the timer keeps running and may invoke stale listeners, causing a memory leak and potential runtime errors.

Consider adding a cleanup method to ChatMessagesStore:

+  destroy = () => {
+    if (this.throttleTimer) {
+      clearTimeout(this.throttleTimer);
+      this.throttleTimer = null;
+      this.pendingEmit = false;
+    }
+    this.listeners = [];
+    if (this.conversationKey) {
+      chatMessagesStoreHelper.delete(this.conversationKey);
+    }
+  };

Also call this method at an appropriate point, such as a React component's cleanup function; a hedged example follows below.
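
A hedged illustration of that kind of cleanup; the hook name and the way the store instance is obtained are not from this repository:

// Illustrative only: assumes the component owns a non-shared ChatMessagesStore
// instance; with shared stores (chatMessagesStoreHelper), cleanup is more involved.
import { useEffect } from 'react';

function useOwnedStoreCleanup(store: { destroy: () => void }) {
  useEffect(() => {
    return () => {
      store.destroy(); // clear the pending throttle timer and listeners on unmount
    };
  }, [store]);
}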

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In packages/x-sdk/src/x-chat/store.ts around lines 38 to 55, the
throttledEmitListeners uses setTimeout but never clears throttleTimer, risking
memory leaks and stray callbacks after the store/component is destroyed; add a
public cleanup/dispose/destroy method on ChatMessagesStore that clears the
timeout (clearTimeout(this.throttleTimer)), sets throttleTimer to null, clears
pendingEmit and removes or nulls any listeners (or calls existing
removeAllListeners), and ensure React components call this cleanup in their
unmount/cleanup to prevent the timer from firing against torn-down listeners.

Add a destroy() method to ChatMessagesStore for proper resource cleanup:
- Clears pending throttle timer
- Resets pendingEmit state
- Clears listeners array

This ensures no memory leaks when the store is disposed.
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 0

🧹 Nitpick comments (3)
packages/x-sdk/src/x-chat/store.ts (3)

27-31: The throttle state and throttledEmitListeners design is sound overall, but setMessages notification timing changes from "synchronous" to "eventually consistent"

This set of changes implements the typical "leading-edge immediate emit plus trailing-edge merged flush" throttling strategy, which effectively mitigates the Maximum update depth exceeded caused by high-frequency SSE/LLM streaming updates. The implementation is logically self-consistent (it avoids duplicate emits and collapses calls within the throttle window into a single trailing flush), and initializing via setMessagesInternal(defaultMessages, false) avoids unnecessary throttle delay. Overall direction 👍.

One thing to note: in most scenarios setMessages() no longer guarantees, by the time it returns, that all subscribed listeners have run synchronously; calls made within the throttle window are deferred and emitted together in the next timer callback. This differs slightly from the previous semantics, where calling setMessages synchronously invoked emitListeners.

Please confirm whether either of the following usages exists:

  • In non-React scenarios, relying on subscription side effects immediately after calling setMessages() (for example, reading derived results from external state right away).
  • Earlier code or documentation that implied or promised synchronous emission semantics for setMessages.

If such needs exist, consider adding an explicit public API that bypasses the throttle (for example setMessagesImmediate, or exposing something like setMessagesInternal(messages, { throttle: false })) to distinguish "UI-friendly throttled streaming updates" from "strictly synchronous updates". The current implementation has no functional bug; this is only a semantic behavior change, so callers should keep it in mind. A hedged sketch of such an opt-out follows after this comment.

Also applies to: 38-55, 57-58, 65-86
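
A minimal sketch of what such an opt-out could look like; the method name setMessagesImmediate is hypothetical and not part of this PR, and it assumes the private setMessagesInternal(messages, throttle) introduced by this change:

// Hypothetical class member, not part of this PR; assumes the private
// setMessagesInternal(messages, throttle) this change introduces, and reuses
// the store's existing message type T for the parameter.
setMessagesImmediate = (messages: T[]) => {
  // Bypass the throttle so subscribers are notified synchronously,
  // matching the pre-change semantics of setMessages().
  this.setMessagesInternal(messages, false);
};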


136-148: Adding timer cleanup in destroy is right; consider coupling it more closely with chatMessagesStoreHelper.delete

destroy clears throttleTimer, pendingEmit, and listeners, which largely resolves the pending-timer and leftover-listener issues raised in the earlier comment; this is a necessary improvement 👍.

However, it currently only performs internal resource cleanup and does not remove the instance from chatMessagesStoreHelper._chatMessagesStores. If a caller treats destroy() as "destroy this store", fetching the same conversationKey from the helper again will still return the old instance (listeners cleared but messages still present), which is semantically a bit ambiguous.

Consider tightening the contract slightly (this can be handled in a follow-up PR):

  • Either, in scenarios where the store is known not to be shared, have the caller first remove it via chatMessagesStoreHelper.delete(key) and then call destroy() as needed;
  • Or make chatMessagesStoreHelper.delete responsible for calling destroy, unifying the release logic, for example:
-  delete: (key: ConversationKey) => {
-    chatMessagesStoreHelper._chatMessagesStores.delete(key);
-  },
+  delete: (key: ConversationKey) => {
+    const store = chatMessagesStoreHelper._chatMessagesStores.get(key);
+    if (store) {
+      store.destroy();
+    }
+    chatMessagesStoreHelper._chatMessagesStores.delete(key);
+  },

The prerequisite is that callers do not call both destroy() and delete() on the same instance, to avoid double cleanup. As the current file stands, destroy is not yet used anywhere, so this adjustment can be a small follow-up encapsulation improvement.


116-124: The id type of removeMessage is inconsistent with the generic constraint; consider widening it to string | number

Here T extends { id: number | string }, but removeMessage is typed as id: string, which is inconsistent with getMessage(id: string | number): if a consumer actually uses numeric ids, removeMessage cannot be called directly at the type level without a cast or an extra wrapper.

Consider aligning the signature with the generic constraint to avoid future pitfalls:

-  removeMessage = (id: string) => {
+  removeMessage = (id: string | number) => {

The implementation already compares with item.id === id, so this change introduces no runtime behavior change; it only brings the type definition in line with the actual capability.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c1e0858 and 0a8738b.

📒 Files selected for processing (1)
  • packages/x-sdk/src/x-chat/store.ts (2 hunks)
🧰 Additional context used
🧠 Learnings (1)
📓 Common learnings
Learnt from: wzc520pyfm
Repo: ant-design/x PR: 833
File: components/use-x-chat/index.ts:256-256
Timestamp: 2025-05-17T05:29:02.977Z
Learning: In the 'use-x-chat' component, messages in the 'updating' state should be treated as normal content and must not be filtered out. Unlike the 'loading' and 'error' states, 'updating' represents an intermediate state where content is still being updated, but it is still valid content.

* Clean up resources (throttle timer) when the store is no longer needed.
* Should be called when the component unmounts or the store is disposed.
*/
destroy = () => {
Contributor

The destroy method is not called anywhere.

Author

You're right.

The destroy() method is currently exposed as a public API for consumers to call manually when needed (e.g., in React component cleanup). This follows the pattern of providing capability without forcing usage.

However, I can integrate it into chatMessagesStoreHelper.delete() to ensure automatic cleanup when a store is explicitly deleted:

delete: (key: ConversationKey) => {
  const store = chatMessagesStoreHelper._chatMessagesStores.get(key);
  if (store) {
    store.destroy();
  }
  chatMessagesStoreHelper._chatMessagesStores.delete(key);
}

Would you prefer this approach? Or is the current "opt-in" design acceptable?


Labels

bug Something isn't working

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Bubble.List throws "Maximum update depth exceeded" when the data set is too large

3 participants