
Performance degrades severely when the number of producers increases. #1119

Open

bigKoki opened this issue Jun 14, 2024 · 2 comments

Comments

@bigKoki

bigKoki commented Jun 14, 2024

In an MPSC performance test, with 2 and 4 producers Crossbeam performs much better than the Java Disruptor, but with 8 and 16 producers Crossbeam is worse than the Java Disruptor. My test machine has 16 cores and 64 GB of RAM. What could be causing this?

Here are the results of two experiments, the first with 2 producers and the second with 8 producers. The x-axis is the buffer size and the y-axis is the time consumption.

[Two attached charts: time consumption vs. buffer size, for 2 producers and for 8 producers]
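
For context, here is a minimal sketch of how such an MPSC benchmark might look with a crossbeam bounded channel; the producer counts, buffer sizes, and message count are illustrative assumptions, not the exact setup used for the plots above.

```rust
use std::thread;
use std::time::Instant;

use crossbeam_channel::bounded;

// Hypothetical benchmark: N producers, one consumer, bounded channel.
fn run(producers: usize, buffer: usize, msgs_per_producer: u64) {
    let (tx, rx) = bounded::<u64>(buffer);
    let start = Instant::now();

    let handles: Vec<_> = (0..producers)
        .map(|_| {
            let tx = tx.clone();
            thread::spawn(move || {
                for i in 0..msgs_per_producer {
                    tx.send(i).unwrap();
                }
            })
        })
        .collect();
    drop(tx); // drop the original sender so recv() fails once all producers finish

    // Single consumer drains the channel until every sender is gone.
    let consumer = thread::spawn(move || {
        let mut received = 0u64;
        while rx.recv().is_ok() {
            received += 1;
        }
        received
    });

    for h in handles {
        h.join().unwrap();
    }
    let received = consumer.join().unwrap();
    println!(
        "{producers} producers, buffer {buffer}: {received} msgs in {:?}",
        start.elapsed()
    );
}

fn main() {
    for &producers in &[2, 4, 8, 16] {
        for &buffer in &[64, 256, 1024, 4096] {
            run(producers, buffer, 1_000_000);
        }
    }
}
```

Results from a harness like this depend heavily on core pinning, payload size, and whether producers batch their sends, so treat it only as a starting point for comparison.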

@al8n

al8n commented Sep 4, 2024

My answer may not be correct. AFAIK, if you are using the unbounded channel, it is backed by a linked list of blocks, where each block holds 32 elements. For example:

[block1(32 elements)] -> [block2(32 elements)]

Say your channel currently holds 32 elements, so the first block is full. When you send one more element, a new block has to be created (an allocation), and if the consumer consumes that element right away, the second block is destroyed (a deallocation). If this happens repeatedly, the frequent allocation and deallocation, which Rust has no garbage collector to amortize, may lead to slow performance.
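
As a rough illustration (not an authoritative reproduction), a single-threaded snippet like the one below keeps an unbounded channel hovering around that block boundary; the 32-element block size is the internal detail described above, not a public constant.

```rust
use crossbeam_channel::unbounded;

fn main() {
    let (tx, rx) = unbounded::<u64>();

    // Pre-fill so the current block is full (assuming the 32-element blocks
    // described above; this is an internal detail, not part of the public API).
    for i in 0..32 {
        tx.send(i).unwrap();
    }

    // Each send now lands one element past a full block, so the channel has to
    // grab a fresh block roughly once every 32 sends, and the matching recv()
    // lets an old block be freed roughly once every 32 receives. A workload
    // that hovers around this boundary keeps paying for that alloc/free churn.
    for i in 0..1_000_000u64 {
        tx.send(32 + i).unwrap();
        rx.recv().unwrap();
    }
}
```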

@bigKoki
Author

bigKoki commented Sep 5, 2024

My answer may not be correct. AFAIK, if you are using the unbounded channel, it is backed by a linked list of blocks, where each block holds 32 elements. For example:

[block1(32 elements)] -> [block2(32 elements)]

Say your channel currently holds 32 elements, so the first block is full. When you send one more element, a new block has to be created (an allocation), and if the consumer consumes that element right away, the second block is destroyed (a deallocation). If this happens repeatedly, the frequent allocation and deallocation, which Rust has no garbage collector to amortize, may lead to slow performance.

Thanks for your answer. I'm using a bounded channel, but I still learned a lot from your explanation.
