Describe the feature or improvement you're requesting
A benchmark could be created to evaluate a model's ability to extract key information from the abstracts of scientific articles, for example:
Object of study (what was studied in the article)
Methodology (what methods were used to conduct the study)
Results (what were the main results of the study)
Conclusions (what conclusions were drawn by the authors based on the results)
Abstracts from real scientific articles can serve as input, with pre-prepared reference answers that accurately capture the information to be extracted serving as the ideal answers. This would make it possible to assess how well the model understands and analyzes scientific texts.
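A minimal sketch of what one such benchmark sample and a naive scorer might look like. All field names, the abstract text, and the scoring logic below are illustrative assumptions, not part of any existing schema or evaluation framework:

```python
# Hypothetical benchmark sample: the abstract is the input, and the
# "ideal" answer lists the key information the model should extract.
# Field names and the abstract itself are made up for illustration.
sample = {
    "input": [
        {"role": "system", "content": (
            "Extract from the abstract: the object of study, the "
            "methodology, the main results, and the authors' conclusions."
        )},
        {"role": "user", "content": (
            "We studied the thermal tolerance of reef corals using "
            "controlled heat-stress experiments; bleaching onset occurred "
            "1.5 degrees C above ambient, suggesting limited resilience "
            "to warming."
        )},
    ],
    "ideal": {
        "object_of_study": "thermal tolerance of reef corals",
        "methodology": "controlled heat-stress experiments",
        "results": "bleaching onset occurred 1.5 degrees C above ambient",
        "conclusions": "limited resilience to warming",
    },
}

def score(model_answer: dict, ideal: dict) -> float:
    """Naive per-field exact-match score in [0, 1]."""
    matched = sum(
        model_answer.get(key, "").strip().lower() == value.lower()
        for key, value in ideal.items()
    )
    return matched / len(ideal)

# An answer identical to the ideal scores 1.0.
print(score(sample["ideal"], sample["ideal"]))  # → 1.0
```

In practice, exact-match grading is too brittle for free-form extraction; a fuzzy-match or model-graded comparison against the ideal answer would likely be needed.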
Additional context
No response