
[Bug] Multiple deadlocks were detected during the client shutdown process. #8366

Closed
3 tasks done
YanYunyang opened this issue Jul 5, 2024 · 0 comments · Fixed by #8367

Comments

@YanYunyang
Contributor

YanYunyang commented Jul 5, 2024

Before Creating the Bug Report

  • I found a bug, not just asking a question, which should be created in GitHub Discussions.

  • I have searched the GitHub Issues and GitHub Discussions of this repository and believe that this is not a duplicate.

  • I have confirmed that this bug belongs to the current repository, not other repositories of RocketMQ.

Runtime platform environment

4.19.90-23-42.v2101.ky10.x86_64

RocketMQ version

5.2.0

JDK Version

openjdk 1.8

Describe the Bug

During the client shutdown process, deadlocks were detected, which resolved after a period of time.
This phenomenon was observed using jvisualvm and jconsole.
The deadlock occurred between the client shutdown thread and the Netty worker threads.
Code analysis:

  1. NettyRemotingClient holds the lockChannelTables lock, which guards access to channelTables. channelTables caches all channels, each encapsulated in a ChannelWrapper.
  2. The inner class ChannelWrapper within NettyRemotingClient holds a read-write lock `lock`, which guards concurrent access to the channel.
  3. The inner class NettyConnectManageHandler in NettyRemotingClient handles events such as close, connect, and channelInactive. When a channel becomes unavailable, it executes the close or channelInactive methods to remove the channel from channelTables. A sketch of the resulting lock-ordering inversion appears after the screenshots below.
[Screenshots (2024-07-05): thread dumps and code excerpts showing the execution path where the deadlock occurs.]
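
To make the lock-ordering inversion concrete, here is a minimal, self-contained sketch of the pattern described above. It is not RocketMQ's actual code: the lock names follow the issue text, but the surrounding class, method names, and which path takes which lock first are illustrative assumptions.

```java
import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

// Minimal stand-in for the two locks involved. Thread A (shutdown path) takes
// lockChannelTables first and then the wrapper's read lock; thread B (event
// handling path) takes the wrapper's write lock first and then lockChannelTables.
// If both run concurrently, each ends up waiting for the lock the other holds.
public class LockOrderInversionSketch {

    private final ReentrantLock lockChannelTables = new ReentrantLock();        // guards channelTables
    private final ReadWriteLock wrapperLock = new ReentrantReadWriteLock();     // ChannelWrapper's `lock`

    // Shutdown-style path: channel-table lock first, then the wrapper's read lock.
    void closeChannelDuringShutdown() {
        lockChannelTables.lock();
        try {
            wrapperLock.readLock().lock();   // blocks while a writer holds the wrapper lock
            try {
                // inspect / close the wrapped channel
            } finally {
                wrapperLock.readLock().unlock();
            }
        } finally {
            lockChannelTables.unlock();
        }
    }

    // Event-handling-style path: the wrapper's write lock first, then the channel-table lock.
    void handleChannelInactive() {
        wrapperLock.writeLock().lock();
        try {
            lockChannelTables.lock();        // blocks while the shutdown thread holds it
            try {
                // remove the corresponding entry from channelTables
            } finally {
                lockChannelTables.unlock();
            }
        } finally {
            wrapperLock.writeLock().unlock();
        }
    }
}
```

Whichever thread acquires its first lock earlier then waits indefinitely for the lock the other thread holds, which is exactly the cycle jconsole reports as a deadlock.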

Steps to Reproduce

1. Create multiple consumer clients and shut them down one by one. The more clients are created, the more likely a deadlock is to be triggered.
2. Open jconsole and repeatedly run deadlock detection. (A reproduction sketch follows these steps.)
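
A reproduction sketch along these lines is shown below. The consumer group, topic, and name server address are placeholders, not values from the original report; any RocketMQ setup that starts many consumers and shuts them down in sequence should exercise the same path.

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.rocketmq.client.consumer.DefaultMQPushConsumer;
import org.apache.rocketmq.client.consumer.listener.ConsumeConcurrentlyStatus;
import org.apache.rocketmq.client.consumer.listener.MessageListenerConcurrently;

public class ShutdownDeadlockRepro {
    public static void main(String[] args) throws Exception {
        List<DefaultMQPushConsumer> consumers = new ArrayList<>();

        // Step 1: create and start many consumers; the more clients, the more
        // likely the deadlock is to show up during shutdown.
        for (int i = 0; i < 50; i++) {
            DefaultMQPushConsumer consumer = new DefaultMQPushConsumer("repro_group_" + i);
            consumer.setNamesrvAddr("127.0.0.1:9876");   // assumed local name server
            consumer.subscribe("ReproTopic", "*");        // assumed test topic
            consumer.registerMessageListener((MessageListenerConcurrently) (msgs, context) ->
                ConsumeConcurrentlyStatus.CONSUME_SUCCESS);
            consumer.start();
            consumers.add(consumer);
        }

        // Step 2: shut the consumers down one by one while jconsole's deadlock
        // detection is triggered repeatedly from the Threads tab.
        for (DefaultMQPushConsumer consumer : consumers) {
            consumer.shutdown();
        }
    }
}
```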

What Did You Expect to See?

No deadlocks should be detected during the client shutdown process.

What Did You See Instead?

Deadlocks were detected during the client shutdown process.

Additional Context

No response

YanYunyang added a commit to YanYunyang/rocketmq that referenced this issue Jul 5, 2024
[ISSUE #8366] When determining if `ChannelWrapper` is the wrapper for a channel, no longer acquire a read lock.
YanYunyang added a commit to YanYunyang/rocketmq that referenced this issue Jul 11, 2024
RongtongJin pushed a commit that referenced this issue Jul 18, 2024
[ISSUE #8366] When determining if `ChannelWrapper` is the wrapper for a channel, no longer acquire a read lock (#8367)

* [ISSUE #8366] When determining if `ChannelWrapper` is the wrapper for a channel, no longer acquire a read lock.

* [ISSUE #8366] Compare channels for equality using `isWrapperOf`.
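
For context, the direction of the fix described by these commit messages can be sketched roughly as follows. Only the `isWrapperOf` name comes from the commit messages; the field and its use are assumptions about how such a check could avoid the read lock, not the actual patch.

```java
import io.netty.channel.Channel;
import io.netty.channel.ChannelFuture;

// Rough sketch: decide whether this wrapper wraps a given channel by comparing
// references directly, instead of taking the wrapper's read lock first.
class ChannelWrapperSketch {
    private volatile ChannelFuture channelFuture;   // assumed field holding the wrapped channel

    public boolean isWrapperOf(Channel channel) {
        ChannelFuture cf = this.channelFuture;
        return cf != null && cf.channel() == channel;
    }
}
```

With a lock-free check of this kind, the shutdown path no longer needs the wrapper's read lock while it already holds lockChannelTables, which breaks the lock-ordering cycle described above.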