
[Serve/Doc] Add combine nodes with same input in parallel pattern #24760

Conversation

sihanwang41
Contributor

@sihanwang41 sihanwang41 commented May 13, 2022

Why are these changes needed?

Adds one more pattern to the deployment graph docs: combining nodes that take the same input in parallel.
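For context, the pattern this PR documents fans the same input out to several nodes in parallel and then combines their outputs. A minimal plain-Python sketch of that dataflow (function names are hypothetical; in the actual docs each function would be a Serve deployment wired together with the deployment graph API):

```python
# Plain-Python stand-in for the "combine nodes with same input in parallel"
# pattern. In Ray Serve, each function below would be a @serve.deployment
# node in a deployment graph; names here are hypothetical.

def branch_a(x):
    # First parallel branch; both branches receive the SAME input x.
    return x + 1

def branch_b(x):
    # Second parallel branch.
    return x * 2

def combine(values):
    # Combine node: aggregates the outputs of all parallel branches.
    return sum(values)

def graph(x):
    # Fan the same input out to both branches, then combine.
    return combine([branch_a(x), branch_b(x)])

print(graph(3))  # (3 + 1) + (3 * 2) -> 10
```

In the real deployment graph, the two branch calls execute concurrently as separate deployments rather than sequentially as above.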

Related issue number

Checks

  • I've run scripts/format.sh to lint the changes in this PR.
  • I've included any doc changes needed for https://docs.ray.io/en/master/.
  • I've made sure the tests are passing. Note that there might be a few flaky tests, see the recent failures at https://flakey-tests.ray.io/
  • Testing Strategy
    • Unit tests
    • Release tests
    • This PR is not tested :(

@jiaodong
Member

(Screenshot of the docs index, May 13, 2022)

I didn't see this new section show up on the index.


@jiaodong jiaodong left a comment


Content looks good to me, and it seems like you're up to speed with adding more docs :)


@jiaodong jiaodong left a comment


lgtm!

@sihanwang41 force-pushed the cookbook_wide_fanout_send_data_in_parallel branch from 54c8f35 to 623d226 on May 16, 2022 at 16:54

import ray
from ray import serve

@serve.deployment
def combine(value_refs):
    return sum(ray.get(value_refs))
Member


As a follow-up, we should run a simple benchmark on the wide fan-out pattern with a larger intermediate data size (say 50 MB), check that the added latency makes sense, and confirm that we are:

  1. Not making redundant copies of data
  2. Not leaking memory over time with repeated execution
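A rough sketch of what such a benchmark could look like, using a plain-Python payload and stand-in functions in place of real Serve deployments (the 50 MB size, the function names, and the measurement approach are assumptions, not an existing Ray benchmark):

```python
import time
import tracemalloc

PAYLOAD_MB = 50

def combine(values):
    # Stand-in for the combine deployment: touch every payload once.
    return sum(len(v) for v in values)

def run_once(payload):
    # Fan the same payload out to two branches, then combine.
    return combine([payload, payload])

payload = b"x" * (PAYLOAD_MB * 1024 * 1024)

# Latency check over repeated executions.
start = time.perf_counter()
for _ in range(10):
    run_once(payload)
elapsed = time.perf_counter() - start
print(f"10 runs: {elapsed:.3f}s")

# Memory check: repeated execution should not grow allocations, since the
# payload is passed by reference here rather than copied per branch.
tracemalloc.start()
for _ in range(100):
    run_once(payload)
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()
print(f"peak traced allocations: {peak / 1024:.1f} KiB")
```

In the real Serve graph, the interesting question is whether passing `ObjectRef`s through the fan-out avoids serializing the 50 MB payload once per downstream node; the sketch above only illustrates the shape of the measurement.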

Contributor Author


Sounds good! I will take it as a separate task; not going to follow up in this PR.

@simon-mo
Contributor

@simon-mo simon-mo merged commit 830af1f into ray-project:master May 16, 2022