
Benchmarks page #12

Closed
TuanaCelik opened this issue Oct 12, 2022 · 4 comments · Fixed by #54
@TuanaCelik (Member)

On the (old) Haystack website we have a Benchmarks page: https://haystack.deepset.ai/benchmarks/latest
You can select between versions of Haystack and look at their benchmarks, but the page wasn't regularly updated.

@carlosgauci we now have updated benchmarks that are all here: https://github.com/deepset-ai/haystack/tree/main/docs/_src/benchmarks

Do you think it would be possible as a first iteration:

  1. Pull these benchmarks over to haystack-home and recreate essentially the same benchmarks page as before, for 'latest'

As a later iteration we would:

  1. Add versions back
  2. Automate it so that we push new benchmarks from haystack to haystack-home

To give you an idea of how this benchmarks page was built on the previous website, here is the source: https://github.com/deepset-ai/haystack-website/blob/source/pages/benchmarks/%5B...slug%5D.tsx

@TuanaCelik TuanaCelik added the enhancement New feature or request label Oct 12, 2022
@carlosgauci (Collaborator)

@TuanaCelik I did a quick test here, so it's definitely possible to use the same type of charts/data. I can work on it after I finish the community changes :)

@TuanaCelik (Member, Author)

Thank you @carlosgauci - I merged the benchmarks page you created :) - let's leave this issue open until the previous versions of benchmarks are added too.

@TuanaCelik (Member, Author)

@carlosgauci I've determined which previous versions we should add benchmarks for; they are listed below. Could we set this up so that the latest benchmark is always accessible both via a version tag and simply through https://haystack.deepset.ai/benchmarks?

You can see how it was done in the previous website here: https://haystack-website-seven.vercel.app/benchmarks/latest

So I'm imagining another version dropdown somewhere at the top of the page. If it makes sense to include a 'latest' tag too, we can keep that as well. 'latest' can just be https://haystack.deepset.ai/benchmarks, a version could be https://haystack.deepset.ai/benchmarks/v1.9.0, and so on :)
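As a rough illustration of that URL scheme, here is a minimal sketch; the helper name is made up for illustration, not taken from the actual site code:

```typescript
// Sketch of the version-to-URL mapping described above: "latest" (or an
// empty string) resolves to the bare /benchmarks path, while any explicit
// version tag gets its own sub-path. Hypothetical helper, not real site code.
function benchmarksPath(version: string): string {
  const base = "https://haystack.deepset.ai/benchmarks";
  if (version === "" || version === "latest") {
    return base;
  }
  return `${base}/${version}`;
}
```

So `benchmarksPath("v1.9.0")` would yield the versioned URL, and `benchmarksPath("latest")` the bare one.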

So the versions with benchmarks are listed below (with links to the same JSON files I sent before). The current benchmarks are from v1.9.0:
v0.10.0
v0.9.0
v0.8.0
v0.7.0
v0.6.0
v0.5.0

For all of them, the files to look at are the same:

reader_performance.json
retriever_map.json
retriever_performance.json
retriever_speed.json

cc: @masci
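For what it's worth, building the URLs for those four files per version could be sketched like this. This is a hypothetical helper; the raw-GitHub path layout for tagged versions is an assumption based on the docs/_src/benchmarks directory linked earlier, and has not been verified for the older tags:

```typescript
// The four benchmark JSON files that exist for every listed version.
const BENCHMARK_FILES = [
  "reader_performance.json",
  "retriever_map.json",
  "retriever_performance.json",
  "retriever_speed.json",
];

// Build raw-GitHub URLs for a given version tag (e.g. "v1.9.0").
// The per-tag directory layout is an assumption, not confirmed for old tags.
function benchmarkFileUrls(version: string): string[] {
  const base = `https://raw.githubusercontent.com/deepset-ai/haystack/${version}/docs/_src/benchmarks`;
  return BENCHMARK_FILES.map((file) => `${base}/${file}`);
}
```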

@carlosgauci carlosgauci linked a pull request Nov 20, 2022 that will close this issue
@carlosgauci (Collaborator)

@TuanaCelik I've added the versions in the linked PR. I left the latest benchmarks on /benchmarks, but I can put them on /benchmarks/latest if you prefer that. Let me know if I misunderstood anything :)
