# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_102
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_102](https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_102) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_102",
"harness_winogrande_5",
split="train")
```
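The aggregated metrics mentioned above live in the "results" configuration. As a minimal sketch (the `split="latest"` argument is an assumption based on how this repository names its splits), you could load them like this:

```python
from datasets import load_dataset

# Sketch: load the aggregated "results" configuration described above.
# split="latest" is assumed from this card's split-naming convention.
results = load_dataset(
    "open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_102",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for this run
```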
## Latest results
These are the [latest results from run 2024-01-19T20:51:30.174111](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_102/blob/main/results_2024-01-19T20-51-30.174111.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each per-task configuration):
```json
{
"all": {
"acc": 0.5942231513843124,
"acc_stderr": 0.03336228056397976,
"acc_norm": 0.6000481834929137,
"acc_norm_stderr": 0.03405843191633746,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.41558392287111307,
"mc2_stderr": 0.01541776331607284
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230916,
"acc_norm": 0.5921501706484642,
"acc_norm_stderr": 0.014361097288449703
},
"harness|hellaswag|10": {
"acc": 0.6238797052380004,
"acc_stderr": 0.00483420796406132,
"acc_norm": 0.8245369448317068,
"acc_norm_stderr": 0.0037958533012439908
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.02501074911613759,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.02501074911613759
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.02500732988246121,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.02500732988246121
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203624,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.0384985609879409,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.0384985609879409
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.024904439098918242,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.024904439098918242
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.025131000233647904,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.025131000233647904
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.016232826818678502,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.016232826818678502
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.027184498909941616,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.027184498909941616
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893934,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893934
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026229649178821167,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026229649178821167
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284073,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284073
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.029722152099280065,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.029722152099280065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618065,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618065
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547735,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547735
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.02796267760476892,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.02796267760476892
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.41558392287111307,
"mc2_stderr": 0.01541776331607284
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.01182164560183824
},
"harness|gsm8k|5": {
"acc": 0.3032600454890068,
"acc_stderr": 0.0126615026634187
}
}
```
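Since successive evals may add tasks, it can help to enumerate the available configurations before loading one. Below is a short sketch using the `datasets` library; the example config name `harness_gsm8k_5` is taken from this repository's metadata:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_102"

# Enumerate every evaluation configuration stored in this details repository.
configs = get_dataset_config_names(repo)
print(len(configs), "configurations available")

# Load the "latest" split of one of them, e.g. GSM8K (5-shot).
gsm8k = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k)
```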
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[email protected]
"region:us"
] | 2024-01-19T20:53:51+00:00 | {"pretty_name": "Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_102", "dataset_summary": "Dataset automatically created during the evaluation run of model [ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_102](https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_102) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_102\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T20:51:30.174111](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_102/blob/main/results_2024-01-19T20-51-30.174111.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5942231513843124,\n \"acc_stderr\": 0.03336228056397976,\n \"acc_norm\": 0.6000481834929137,\n \"acc_norm_stderr\": 0.03405843191633746,\n \"mc1\": 0.28151774785801714,\n \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.41558392287111307,\n \"mc2_stderr\": 0.01541776331607284\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230916,\n \"acc_norm\": 0.5921501706484642,\n \"acc_norm_stderr\": 0.014361097288449703\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6238797052380004,\n \"acc_stderr\": 0.00483420796406132,\n \"acc_norm\": 0.8245369448317068,\n \"acc_norm_stderr\": 0.0037958533012439908\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644826,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644826\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 
0.6666666666666666,\n \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.02501074911613759,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.02501074911613759\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7823834196891192,\n \"acc_stderr\": 
0.029778663037752954,\n \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.029778663037752954\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.02500732988246121,\n \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.02500732988246121\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203624,\n \"acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203624\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.0384985609879409,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.0384985609879409\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n \"acc_stderr\": 0.024904439098918242,\n \"acc_norm\": 0.8247863247863247,\n \"acc_norm_stderr\": 0.024904439098918242\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n 
\"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647904,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647904\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n \"acc_stderr\": 0.016232826818678502,\n \"acc_norm\": 0.37988826815642457,\n \"acc_norm_stderr\": 0.016232826818678502\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.027184498909941616,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.027184498909941616\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n \"acc_stderr\": 0.026795422327893934,\n \"acc_norm\": 0.6655948553054662,\n \"acc_norm_stderr\": 0.026795422327893934\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026229649178821167,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026229649178821167\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284073,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284073\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618065,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618065\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547735,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547735\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.02796267760476892,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.02796267760476892\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.41558392287111307,\n \"mc2_stderr\": 0.01541776331607284\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.01182164560183824\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.3032600454890068,\n \"acc_stderr\": 0.0126615026634187\n }\n}\n```", "repo_url": "https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_102", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|arc:challenge|25_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|gsm8k|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hellaswag|10_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T20-51-30.174111.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T20-51-30.174111.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T20-51-30.174111.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T20-51-30.174111.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T20-51-30.174111.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["**/details_harness|winogrande|5_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-19T20-51-30.174111.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T20_51_30.174111", "path": ["results_2024-01-19T20-51-30.174111.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T20-51-30.174111.parquet"]}]}]} | 2024-01-19T20:54:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_102
Dataset automatically created during the evaluation run of model ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_102 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
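A minimal sketch (the repo id below is inferred from the model name, following the naming pattern the other leaderboard detail datasets in this dump use, so it may need checking; "harness_winogrande_5" is one of the task configs listed in the metadata):

```python
from datasets import load_dataset

# Load the 5-shot Winogrande details; per the card, the "train"
# split always points to the latest evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_102",
    "harness_winogrande_5",
    split="train",
)
```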
## Latest results
These are the latest results from run 2024-01-19T20:51:30.174111 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
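The per-run aggregates referenced above live in the dedicated "results" configuration declared in the metadata; a sketch for loading them, under the same inferred-repo-id assumption as the snippet above:

```python
from datasets import load_dataset

# "latest" points at results_2024-01-19T20-51-30.174111.parquet,
# the newest aggregated-results file for this run.
results = load_dataset(
    "open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_102",
    "results",
    split="latest",
)
```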
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_102\n\n\n\nDataset automatically created during the evaluation run of model ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_102 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-19T20:51:30.174111(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_102\n\n\n\nDataset automatically created during the evaluation run of model ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_102 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-19T20:51:30.174111(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7ac118fb7d4a6bd7b89dc23774db7c419ce079c9 |
Filtered data from the following subreddits:
"AskAcademia",
"AskComputerScience",
"AskEconomics",
"AskProgramming",
"AskScienceFiction",
"AskSocialScience",
"AskStatistics",
"AskTechnology",
"askmath",
"askphilosophy",
"askpsychology",
"askscience",
"changemyview",
"explainlikeimfive" | euclaise/reddit-instruct | [
"license:mit",
"region:us"
] | 2024-01-19T20:56:20+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "post_title", "dtype": "string"}, {"name": "post_text", "dtype": "string"}, {"name": "post_scores", "dtype": "int64"}, {"name": "comment_text", "dtype": "string"}, {"name": "comment_score", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 126565640.88161694, "num_examples": 84784}, {"name": "test", "num_bytes": 2985602.021174206, "num_examples": 2000}], "download_size": 67560005, "dataset_size": 129551242.90279114}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-19T20:59:05+00:00 | [] | [] | TAGS
#license-mit #region-us
|
Filtered data from the following subreddits:
"AskAcademia",
"AskComputerScience",
"AskEconomics",
"AskProgramming",
"AskScienceFiction",
"AskSocialScience",
"AskStatistics",
"AskTechnology",
"askmath",
"askphilosophy",
"askpsychology",
"askscience",
"changemyview",
"explainlikeimfive" | [] | [
"TAGS\n#license-mit #region-us \n"
] |
178f2be4bb1ef40b4f66ab484b0eabda1d772f01 | # Dataset Card for "fashion_image_caption-100-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ArmandoGG/fashion_image_caption-100-v2 | [
"region:us"
] | 2024-01-19T21:09:50+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22820471.0, "num_examples": 100}], "download_size": 22820373, "dataset_size": 22820471.0}} | 2024-01-19T21:09:51+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "fashion_image_caption-100-v2"
More Information needed | [
"# Dataset Card for \"fashion_image_caption-100-v2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"fashion_image_caption-100-v2\"\n\nMore Information needed"
] |
c900cbc7e70f857d67b978e69f2c877049dae3a7 |
# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2](https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
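# "harness_winogrande_5" below is one of the 63 task configs; the
# "train" split always points to the latest evaluation run.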
data = load_dataset("open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2",
"harness_winogrande_5",
split="train")
```
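
The aggregated metrics quoted in the next section are also stored in the dedicated "results" configuration. A minimal sketch for pulling them, assuming the same split layout ("latest" pointing at the newest results parquet) as the task configs:

```python
from datasets import load_dataset

# "results" holds the aggregated run metrics; "latest" resolves to
# the most recent results parquet for this evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2",
    "results",
    split="latest",
)
print(results[0])
```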
## Latest results
These are the [latest results from run 2024-01-19T21:12:05.940031](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2/blob/main/results_2024-01-19T21-12-05.940031.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5959098573921959,
"acc_stderr": 0.03332692183681072,
"acc_norm": 0.6019409558870633,
"acc_norm_stderr": 0.034019817201103926,
"mc1": 0.2998776009791922,
"mc1_stderr": 0.016040352966713623,
"mc2": 0.42358153067078547,
"mc2_stderr": 0.015672254683217784
},
"harness|arc:challenge|25": {
"acc": 0.5648464163822525,
"acc_stderr": 0.014487986197186043,
"acc_norm": 0.6032423208191127,
"acc_norm_stderr": 0.014296513020180644
},
"harness|hellaswag|10": {
"acc": 0.6366261700856403,
"acc_stderr": 0.004799882248494813,
"acc_norm": 0.8288189603664609,
"acc_norm_stderr": 0.0037589728166275913
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.0250437573185202,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.0250437573185202
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.02951928261681723,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.02951928261681723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.024985354923102353,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.024985354923102353
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131157,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131157
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.01684767640009109,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.01684767640009109
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.0384985609879409,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.0384985609879409
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371155,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371155
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3743016759776536,
"acc_stderr": 0.016185444179457168,
"acc_norm": 0.3743016759776536,
"acc_norm_stderr": 0.016185444179457168
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.027184498909941616,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.027184498909941616
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42242503259452413,
"acc_stderr": 0.012615600475734921,
"acc_norm": 0.42242503259452413,
"acc_norm_stderr": 0.012615600475734921
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540606,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786862,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786862
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2998776009791922,
"mc1_stderr": 0.016040352966713623,
"mc2": 0.42358153067078547,
"mc2_stderr": 0.015672254683217784
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237986
},
"harness|gsm8k|5": {
"acc": 0.3009855951478393,
"acc_stderr": 0.0126345044652112
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2 | [
"region:us"
] | 2024-01-19T21:14:31+00:00 | {"pretty_name": "Evaluation run of ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2", "dataset_summary": "Dataset automatically created during the evaluation run of model [ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2](https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T21:12:05.940031](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2/blob/main/results_2024-01-19T21-12-05.940031.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5959098573921959,\n \"acc_stderr\": 0.03332692183681072,\n \"acc_norm\": 0.6019409558870633,\n \"acc_norm_stderr\": 0.034019817201103926,\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.016040352966713623,\n \"mc2\": 0.42358153067078547,\n \"mc2_stderr\": 0.015672254683217784\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5648464163822525,\n \"acc_stderr\": 0.014487986197186043,\n \"acc_norm\": 0.6032423208191127,\n \"acc_norm_stderr\": 0.014296513020180644\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6366261700856403,\n \"acc_stderr\": 0.004799882248494813,\n \"acc_norm\": 0.8288189603664609,\n \"acc_norm_stderr\": 0.0037589728166275913\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 
0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.0250437573185202,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.0250437573185202\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 
0.7875647668393783,\n \"acc_stderr\": 0.02951928261681723,\n \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.02951928261681723\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.024985354923102353,\n \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.024985354923102353\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131157,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131157\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8091743119266055,\n \"acc_stderr\": 0.01684767640009109,\n \"acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.01684767640009109\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.0384985609879409,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.0384985609879409\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 
0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n \"acc_stderr\": 0.014805384478371155,\n \"acc_norm\": 0.7803320561941252,\n \"acc_norm_stderr\": 0.014805384478371155\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3743016759776536,\n \"acc_stderr\": 0.016185444179457168,\n \"acc_norm\": 0.3743016759776536,\n \"acc_norm_stderr\": 0.016185444179457168\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.027184498909941616,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.027184498909941616\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n \"acc_stderr\": 0.012615600475734921,\n \"acc_norm\": 0.42242503259452413,\n \"acc_norm_stderr\": 0.012615600475734921\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540606,\n \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540606\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786862,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786862\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.016040352966713623,\n \"mc2\": 0.42358153067078547,\n \"mc2_stderr\": 0.015672254683217784\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7655880031570639,\n 
\"acc_stderr\": 0.011906130106237986\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3009855951478393,\n \"acc_stderr\": 0.0126345044652112\n }\n}\n```", "repo_url": "https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|arc:challenge|25_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|gsm8k|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hellaswag|10_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-12-05.940031.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-12-05.940031.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-12-05.940031.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T21-12-05.940031.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-12-05.940031.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": 
["**/details_harness|truthfulqa:mc|0_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["**/details_harness|winogrande|5_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T21-12-05.940031.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T21_12_05.940031", "path": ["results_2024-01-19T21-12-05.940031.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T21-12-05.940031.parquet"]}]}]} | 2024-01-19T21:14:57+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2
Dataset automatically created during the evaluation run of model ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
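A minimal sketch, assuming the repository id follows the usual `details_{org}__{model}` naming pattern for these leaderboard detail datasets:

```python
from datasets import load_dataset

# Repository id is an assumption inferred from the details_{org}__{model}
# naming pattern; verify it against the actual repository before use.
data = load_dataset(
    "open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2",
    "harness_winogrande_5",
    split="train",
)
```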
## Latest results
These are the latest results from run 2024-01-19T21:12:05.940031 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
e8aba9086aa85acdde36ef60075f6c46c47709bb |
# Dataset Card for Evaluation run of AA051610/A0120
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051610/A0120](https://huggingface.co/AA051610/A0120) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__A0120",
"harness_winogrande_5",
split="train")
```
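The aggregated metrics described above live in the "results" configuration; a minimal sketch, assuming the `latest` split name follows the same convention as the other configurations listed in this card's metadata:

```python
from datasets import load_dataset

# "results" holds one row of aggregated metrics per evaluation run; the
# "latest" split (an assumed convention, mirroring the other configs)
# points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_AA051610__A0120",
    "results",
    split="latest",
)
print(results[0])
```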
## Latest results
These are the [latest results from run 2024-01-19T21:18:32.527803](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A0120/blob/main/results_2024-01-19T21-18-32.527803.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7417714462373405,
"acc_stderr": 0.028978132404413697,
"acc_norm": 0.7454834976249006,
"acc_norm_stderr": 0.029530861672701678,
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023518,
"mc2": 0.5748062315132791,
"mc2_stderr": 0.015431295873654757
},
"harness|arc:challenge|25": {
"acc": 0.6407849829351536,
"acc_stderr": 0.014020224155839159,
"acc_norm": 0.6706484641638225,
"acc_norm_stderr": 0.013734057652635474
},
"harness|hellaswag|10": {
"acc": 0.6612228639713205,
"acc_stderr": 0.004723266971563396,
"acc_norm": 0.8515236008763195,
"acc_norm_stderr": 0.0035484490542860105
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8618421052631579,
"acc_stderr": 0.028081042939576552,
"acc_norm": 0.8618421052631579,
"acc_norm_stderr": 0.028081042939576552
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7310344827586207,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.7310344827586207,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6640211640211641,
"acc_stderr": 0.02432631052914914,
"acc_norm": 0.6640211640211641,
"acc_norm_stderr": 0.02432631052914914
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8709677419354839,
"acc_stderr": 0.019070889254792747,
"acc_norm": 0.8709677419354839,
"acc_norm_stderr": 0.019070889254792747
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5763546798029556,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.5763546798029556,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781657,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781657
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.01996022556317289,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.01996022556317289
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.014385432857476442,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.014385432857476442
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.019982347208637303,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.019982347208637303
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.023274255898707946,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.023274255898707946
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944216,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944216
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9100917431192661,
"acc_stderr": 0.012264304540230435,
"acc_norm": 0.9100917431192661,
"acc_norm_stderr": 0.012264304540230435
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.020871118455552097,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.020871118455552097
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.019995560723758535,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.019995560723758535
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.02799153425851952,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.02799153425851952
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.02728524631275896,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.02728524631275896
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.03247224389917948,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.03247224389917948
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553838,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553838
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.047184714852195865,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.047184714852195865
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.031766839486404054,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.031766839486404054
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.01604626163167314,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.01604626163167314
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9067688378033205,
"acc_stderr": 0.010397417087292847,
"acc_norm": 0.9067688378033205,
"acc_norm_stderr": 0.010397417087292847
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8005780346820809,
"acc_stderr": 0.021511900654252555,
"acc_norm": 0.8005780346820809,
"acc_norm_stderr": 0.021511900654252555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6770949720670391,
"acc_stderr": 0.015638440380241474,
"acc_norm": 0.6770949720670391,
"acc_norm_stderr": 0.015638440380241474
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.02150538312123138,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.02150538312123138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8135048231511254,
"acc_stderr": 0.022122439772480774,
"acc_norm": 0.8135048231511254,
"acc_norm_stderr": 0.022122439772480774
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8179012345679012,
"acc_stderr": 0.021473491834808355,
"acc_norm": 0.8179012345679012,
"acc_norm_stderr": 0.021473491834808355
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6028368794326241,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.6028368794326241,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5645371577574967,
"acc_stderr": 0.012663412101248345,
"acc_norm": 0.5645371577574967,
"acc_norm_stderr": 0.012663412101248345
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8345588235294118,
"acc_stderr": 0.022571771025494757,
"acc_norm": 0.8345588235294118,
"acc_norm_stderr": 0.022571771025494757
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.016819028375736383,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.016819028375736383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8204081632653061,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.8204081632653061,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101716,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101716
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9005847953216374,
"acc_stderr": 0.022949025579355027,
"acc_norm": 0.9005847953216374,
"acc_norm_stderr": 0.022949025579355027
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023518,
"mc2": 0.5748062315132791,
"mc2_stderr": 0.015431295873654757
},
"harness|winogrande|5": {
"acc": 0.813733228097869,
"acc_stderr": 0.01094187795567621
},
"harness|gsm8k|5": {
"acc": 0.6709628506444276,
"acc_stderr": 0.012942375603679368
}
}
```
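The same JSON file can be fetched directly with `huggingface_hub`; a sketch, with the filename taken from the results link above (inspect the top-level keys first, since only an excerpt of the file is shown here):

```python
import json

from huggingface_hub import hf_hub_download

# repo_type="dataset" because the details live in a dataset repository;
# the filename is copied from the results link in this card.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_AA051610__A0120",
    filename="results_2024-01-19T21-18-32.527803.json",
    repo_type="dataset",
)
with open(path) as f:
    payload = json.load(f)
print(sorted(payload))  # check the top-level keys before digging in
```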
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AA051610__A0120 | [
"region:us"
] | 2024-01-19T21:20:44+00:00 | {"pretty_name": "Evaluation run of AA051610/A0120", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051610/A0120](https://huggingface.co/AA051610/A0120) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__A0120\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T21:18:32.527803](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A0120/blob/main/results_2024-01-19T21-18-32.527803.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7417714462373405,\n \"acc_stderr\": 0.028978132404413697,\n \"acc_norm\": 0.7454834976249006,\n \"acc_norm_stderr\": 0.029530861672701678,\n \"mc1\": 0.39657282741738065,\n \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5748062315132791,\n \"mc2_stderr\": 0.015431295873654757\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6407849829351536,\n \"acc_stderr\": 0.014020224155839159,\n \"acc_norm\": 0.6706484641638225,\n \"acc_norm_stderr\": 0.013734057652635474\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6612228639713205,\n \"acc_stderr\": 0.004723266971563396,\n \"acc_norm\": 0.8515236008763195,\n \"acc_norm_stderr\": 0.0035484490542860105\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8618421052631579,\n \"acc_stderr\": 0.028081042939576552,\n \"acc_norm\": 0.8618421052631579,\n \"acc_norm_stderr\": 0.028081042939576552\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n 
\"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.04974229460422817,\n \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.04974229460422817\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387536,\n \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387536\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7310344827586207,\n \"acc_stderr\": 0.036951833116502325,\n \"acc_norm\": 0.7310344827586207,\n \"acc_norm_stderr\": 0.036951833116502325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6640211640211641,\n \"acc_stderr\": 0.02432631052914914,\n \"acc_norm\": 0.6640211640211641,\n \"acc_norm_stderr\": 0.02432631052914914\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8709677419354839,\n \"acc_stderr\": 0.019070889254792747,\n \"acc_norm\": 0.8709677419354839,\n \"acc_norm_stderr\": 0.019070889254792747\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5763546798029556,\n \"acc_stderr\": 0.03476725747649037,\n \"acc_norm\": 0.5763546798029556,\n \"acc_norm_stderr\": 0.03476725747649037\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781657,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781657\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9141414141414141,\n \"acc_stderr\": 0.01996022556317289,\n \"acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.01996022556317289\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.014385432857476442,\n \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.014385432857476442\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.8076923076923077,\n \"acc_stderr\": 0.019982347208637303,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.019982347208637303\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707946,\n \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707946\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944216,\n \"acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944216\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9100917431192661,\n \"acc_stderr\": 0.012264304540230435,\n \"acc_norm\": 0.9100917431192661,\n \"acc_norm_stderr\": 0.012264304540230435\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03214952147802749,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03214952147802749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9019607843137255,\n \"acc_stderr\": 0.020871118455552097,\n \"acc_norm\": 0.9019607843137255,\n \"acc_norm_stderr\": 0.020871118455552097\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758535,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758535\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n \"acc_stderr\": 0.02799153425851952,\n \"acc_norm\": 0.7757847533632287,\n \"acc_norm_stderr\": 0.02799153425851952\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.02728524631275896,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.02728524631275896\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.03247224389917948,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.03247224389917948\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553838,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553838\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n \"acc_stderr\": 0.047184714852195865,\n \"acc_norm\": 0.5535714285714286,\n \"acc_norm_stderr\": 0.047184714852195865\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.031766839486404054,\n \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.031766839486404054\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.01604626163167314,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.01604626163167314\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263714,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263714\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9067688378033205,\n \"acc_stderr\": 0.010397417087292847,\n 
\"acc_norm\": 0.9067688378033205,\n \"acc_norm_stderr\": 0.010397417087292847\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8005780346820809,\n \"acc_stderr\": 0.021511900654252555,\n \"acc_norm\": 0.8005780346820809,\n \"acc_norm_stderr\": 0.021511900654252555\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6770949720670391,\n \"acc_stderr\": 0.015638440380241474,\n \"acc_norm\": 0.6770949720670391,\n \"acc_norm_stderr\": 0.015638440380241474\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.02150538312123138,\n \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.02150538312123138\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8135048231511254,\n \"acc_stderr\": 0.022122439772480774,\n \"acc_norm\": 0.8135048231511254,\n \"acc_norm_stderr\": 0.022122439772480774\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8179012345679012,\n \"acc_stderr\": 0.021473491834808355,\n \"acc_norm\": 0.8179012345679012,\n \"acc_norm_stderr\": 0.021473491834808355\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6028368794326241,\n \"acc_stderr\": 0.0291898056735871,\n \"acc_norm\": 0.6028368794326241,\n \"acc_norm_stderr\": 0.0291898056735871\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5645371577574967,\n \"acc_stderr\": 0.012663412101248345,\n \"acc_norm\": 0.5645371577574967,\n \"acc_norm_stderr\": 0.012663412101248345\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8345588235294118,\n \"acc_stderr\": 0.022571771025494757,\n \"acc_norm\": 0.8345588235294118,\n \"acc_norm_stderr\": 0.022571771025494757\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.016819028375736383,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.016819028375736383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101716,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101716\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.022949025579355027,\n \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.022949025579355027\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39657282741738065,\n \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5748062315132791,\n \"mc2_stderr\": 0.015431295873654757\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.01094187795567621\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6709628506444276,\n \"acc_stderr\": 0.012942375603679368\n }\n}\n```", "repo_url": "https://huggingface.co/AA051610/A0120", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|arc:challenge|25_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|gsm8k|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hellaswag|10_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-18-32.527803.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-18-32.527803.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-18-32.527803.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T21-18-32.527803.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-18-32.527803.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-18-32.527803.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["**/details_harness|winogrande|5_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T21-18-32.527803.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T21_18_32.527803", "path": ["results_2024-01-19T21-18-32.527803.parquet"]}, {"split": "latest", "path": 
["results_2024-01-19T21-18-32.527803.parquet"]}]}]} | 2024-01-19T21:21:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AA051610/A0120
Dataset automatically created during the evaluation run of model AA051610/A0120 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
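A minimal sketch (the repository id below is assumed from the leaderboard's usual `details_<org>__<model>` naming convention and is not confirmed by this card; any task configuration can be substituted for the one shown):

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard naming convention for AA051610/A0120.
data = load_dataset("open-llm-leaderboard/details_AA051610__A0120",
                    "harness_winogrande_5",
                    split="train")
```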
## Latest results
These are the latest results from run 2024-01-19T21:18:32.527803 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AA051610/A0120\n\n\n\nDataset automatically created during the evaluation run of model AA051610/A0120 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-19T21:18:32.527803(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AA051610/A0120\n\n\n\nDataset automatically created during the evaluation run of model AA051610/A0120 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-19T21:18:32.527803(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
757877b1abbdcd407cfd672fdda563dc6d058241 | # lilac/Capybara
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/LDJnr/Capybara](https://huggingface.co/datasets/LDJnr/Capybara)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-Capybara
```
or from python with:
```py
import lilac as ll  # the Lilac package is conventionally imported as ll

ll.download("lilacai/lilac-Capybara")
```
| lilacai/lilac-Capybara | [
"Lilac",
"region:us"
] | 2024-01-19T21:35:17+00:00 | {"tags": ["Lilac"]} | 2024-01-20T13:32:31+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/Capybara
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/Capybara\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/Capybara\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
798ebc6d9987909c59455a20ae392ac127d3ce44 |
# Anki-Global Voices English-French Translation Dataset
## Description
The Anki-Global Voices English-French Translation Dataset is a comprehensive collection of over 500,000 translation pairs, merging the Anki English to French dataset with the Global Voices English to French dataset. This unique dataset provides a wide range of sentences, suitable for training and evaluating machine translation models in both informal and formal language contexts.
### Languages
This dataset includes sentence pairs in English (ISO 639-1: EN) and French (ISO 639-1: FR).
## Dataset Structure
### Data Instances
Each data instance consists of an English sentence and its French translation. Example:
```json
{
"source": "He is a good boy.",
"target": "C'est un bon garçon."
}
```
### Data Fields
- `source`: The English sentence.
- `target`: The corresponding French translation.
### Data Splits
The dataset is divided into training, validation, and test sets. The distribution is as follows (illustrative numbers):
- Train: 439,000 pairs
- Validation: 54,900 pairs
- Test: 54,900 pairs
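
As a quick usage example, the dataset can be loaded with the `datasets` library. This is a minimal sketch, assuming the `train` split and the `source`/`target` fields described above are exposed as-is:

```python
from datasets import load_dataset

# Split name and field names assumed from this card's description.
dataset = load_dataset("arielogg/anki_globalvoices_en_fr", split="train")

pair = dataset[0]
print(pair["source"], "->", pair["target"])
# e.g. He is a good boy. -> C'est un bon garçon.
```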
## Dataset Creation
### Curation Rationale
This dataset aims to aid the development of machine translation models by providing a rich source of linguistic variations and expressions in English and French, covering both everyday language and formal contexts.
### Source Data
#### Initial Data Collection and Normalization
The dataset combines two sources:
1. **Anki English to French Dataset**: Composed of curated translation pairs submitted to the Tatoeba Project, mostly short and written in casual language. [More information](https://www.manythings.org/anki/)
2. **Global Voices English to French Dataset**: Derived from news and cultural articles on the Global Voices websites, featuring more formal, complicated language. [More information](https://data.europa.eu/data/datasets/elrc_2006?locale=en)
Approximately 21,000 pairs were removed from the Global Voices dataset before merging with the Anki Dataset due to severe misalignment. Samples have already been shuffled.
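
For illustration only, here is a minimal sketch of one way severely misaligned pairs can be flagged. The length-ratio heuristic and its threshold are assumptions made for this example, not the procedure actually used during curation:

```python
def is_misaligned(source: str, target: str, max_ratio: float = 2.5) -> bool:
    """Flag a pair whose character lengths differ too much (simple heuristic)."""
    shorter = max(min(len(source), len(target)), 1)  # guard against empty strings
    longer = max(len(source), len(target))
    return longer / shorter > max_ratio

pairs = [("He is a good boy.", "C'est un bon garçon.")]
clean = [(s, t) for s, t in pairs if not is_misaligned(s, t)]  # keeps the aligned pair
```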
### Annotations
The dataset contains translations but no additional annotations.
#### Annotation Process
Translations in the Anki dataset were community-contributed, and those in the Global Voices dataset were done by professional translators and journalists.
### Personal and Sensitive Information
The dataset does not contain personal or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset can help improve the accuracy of machine translation systems, facilitating communication across different language speakers.
### Discussion of Biases
The dataset covers both informal and formal language contexts, providing a balanced linguistic range. However, source datasets' biases might be present.
### Other Known Limitations
The dataset focuses exclusively on English-French translations, which may not generalize to other language pairs or cultural contexts.
## Additional Information
### Dataset Curators
Curated by Ariel Guerra-Adames, this dataset combines the Anki and Global Voices datasets into a comprehensive resource for machine translation.
### Licensing Information
See LICENSE
| arielogg/anki_globalvoices_en_fr | [
"task_categories:translation",
"size_categories:100K<n<1M",
"language:fr",
"language:en",
"license:mit",
"region:us"
] | 2024-01-19T21:46:58+00:00 | {"language": ["fr", "en"], "license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["translation"]} | 2024-01-24T09:50:35+00:00 | [] | [
"fr",
"en"
] | TAGS
#task_categories-translation #size_categories-100K<n<1M #language-French #language-English #license-mit #region-us
|
# Anki-Global Voices English-French Translation Dataset
## Description
The Anki-Global Voices English-French Translation Dataset is a comprehensive collection of over 500,000 translation pairs, merging the Anki English to French dataset with the Global Voices English to French dataset. This unique dataset provides a wide range of sentences, suitable for training and evaluating machine translation models in both informal and formal language contexts.
### Languages
This dataset includes sentence pairs in English (ISO 639-1: EN) and French (ISO 639-1: FR).
## Dataset Structure
### Data Instances
Each data instance consists of an English sentence and its French translation. Example:
### Data Fields
- 'source': The English sentence.
- 'target': The corresponding French translation.
### Data Splits
The dataset is divided into training, validation, and test sets. The distribution is as follows (illustrative numbers):
- Train: 439,000 pairs
- Validation: 54,900 pairs
- Test: 54,900 pairs
## Dataset Creation
### Curation Rationale
This dataset aims to aid the development of machine translation models by providing a rich source of linguistic variations and expressions in English and French, covering both everyday language and formal contexts.
### Source Data
#### Initial Data Collection and Normalization
The dataset combines two sources:
1. Anki English to French Dataset: Composed of curated translation pairs submitted to the Tatoeba Project, mostly short and written in casual language. More information
2. Global Voices English to French Dataset: Derived from news and cultural articles on the Global Voices websites, featuring more formal, complicated language. More information
Approximately 21,000 pairs were removed from the Global Voices dataset before merging with the Anki Dataset due to severe misalignment. Samples have already been shuffled.
### Annotations
The dataset contains translations but no additional annotations.
#### Annotation Process
Translations in the Anki dataset were community-contributed, and those in the Global Voices dataset were done by professional translators and journalists.
### Personal and Sensitive Information
The dataset does not contain personal or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset can help improve the accuracy of machine translation systems, facilitating communication across different language speakers.
### Discussion of Biases
The dataset covers both informal and formal language contexts, providing a balanced linguistic range. However, source datasets' biases might be present.
### Other Known Limitations
The dataset focuses exclusively on English-French translations, which may not generalize to other language pairs or cultural contexts.
## Additional Information
### Dataset Curators
Curated by Ariel Guerra-Adames, this dataset combines the Anki and Global Voices datasets into a comprehensive resource for machine translation.
### Licensing Information
See LICENSE
| [
"# Anki-Global Voices English-French Translation Dataset",
"## Description\n\nThe Anki-Global Voices English-French Translation Dataset is a comprehensive collection of over 500,000 translation pairs, merging the Anki English to French dataset with the Global Voices English to French dataset. This unique dataset provides a wide range of sentences, suitable for training and evaluating machine translation models in both informal and formal language contexts.",
"### Languages\n\nThis dataset includes sentence pairs in English (ISO 639-1: EN) and French (ISO 639-1: FR).",
"## Dataset Structure",
"### Data Instances\n\nEach data instance consists of an English sentence and its French translation. Example:",
"### Data Fields\n\n- 'source': The English sentence.\n- 'target': The corresponding French translation.",
"### Data Splits\n\nThe dataset is divided into training, validation, and test sets. The distribution is as follows (illustrative numbers):\n\n- Train: 439,000 pairs\n- Validation: 54,900 pairs\n- Test: 54,900 pairs",
"## Dataset Creation",
"### Curation Rationale\n\nThis dataset aims to aid the development of machine translation models by providing a rich source of linguistic variations and expressions in English and French, covering both everyday language and formal contexts.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nThe dataset combines two sources:\n\n1. Anki English to French Dataset: Comprised of curated translation pairs submitted to the Tatoeba Project, mostly short and with casual language. More information\n2. Global Voices English to French Dataset: Derived from news and cultural articles on the Global Voices websites, featuring more formal, complicated language. More information\n\nApproximately 21,000 pairs were removed from the Global Voices dataset before merging with the Anki Dataset due to severe misalignment. Samples have already been shuffled.",
"### Annotations\n\nThe dataset contains translations but no additional annotations.",
"#### Annotation Process\n\nTranslations in the Anki dataset were community-contributed, and those in the Global Voices dataset were done by professional translators and journalists.",
"### Personal and Sensitive Information\n\nThe dataset does not contain personal or sensitive information.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThis dataset can help improve the accuracy of machine translation systems, facilitating communication across different language speakers.",
"### Discussion of Biases\n\nThe dataset covers both informal and formal language contexts, providing a balanced linguistic range. However, source datasets' biases might be present.",
"### Other Known Limitations\n\nThe dataset focuses exclusively on English-French translations, which may not generalize to other language pairs or cultural contexts.",
"## Additional Information",
"### Dataset Curators\n\nCurated by Ariel Guerra-Adames, this dataset combines the Anki and Global Voices datasets into a comprehensive resource for machine translation.",
"### Licensing Information\n\nSee LICENSE"
] | [
"TAGS\n#task_categories-translation #size_categories-100K<n<1M #language-French #language-English #license-mit #region-us \n",
"# Anki-Global Voices English-French Translation Dataset",
"## Description\n\nThe Anki-Global Voices English-French Translation Dataset is a comprehensive collection of over 500,000 translation pairs, merging the Anki English to French dataset with the Global Voices English to French dataset. This unique dataset provides a wide range of sentences, suitable for training and evaluating machine translation models in both informal and formal language contexts.",
"### Languages\n\nThis dataset includes sentence pairs in English (ISO 639-1: EN) and French (ISO 639-1: FR).",
"## Dataset Structure",
"### Data Instances\n\nEach data instance consists of an English sentence and its French translation. Example:",
"### Data Fields\n\n- 'source': The English sentence.\n- 'target': The corresponding French translation.",
"### Data Splits\n\nThe dataset is divided into training, validation, and test sets. The distribution is as follows (illustrative numbers):\n\n- Train: 439,000 pairs\n- Validation: 54,900 pairs\n- Test: 54,900 pairs",
"## Dataset Creation",
"### Curation Rationale\n\nThis dataset aims to aid the development of machine translation models by providing a rich source of linguistic variations and expressions in English and French, covering both everyday language and formal contexts.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nThe dataset combines two sources:\n\n1. Anki English to French Dataset: Comprised of curated translation pairs submitted to the Tatoeba Project, mostly short and with casual language. More information\n2. Global Voices English to French Dataset: Derived from news and cultural articles on the Global Voices websites, featuring more formal, complicated language. More information\n\nApproximately 21,000 pairs were removed from the Global Voices dataset before merging with the Anki Dataset due to severe misalignment. Samples have already been shuffled.",
"### Annotations\n\nThe dataset contains translations but no additional annotations.",
"#### Annotation Process\n\nTranslations in the Anki dataset were community-contributed, and those in the Global Voices dataset were done by professional translators and journalists.",
"### Personal and Sensitive Information\n\nThe dataset does not contain personal or sensitive information.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThis dataset can help improve the accuracy of machine translation systems, facilitating communication across different language speakers.",
"### Discussion of Biases\n\nThe dataset covers both informal and formal language contexts, providing a balanced linguistic range. However, source datasets' biases might be present.",
"### Other Known Limitations\n\nThe dataset focuses exclusively on English-French translations, which may not generalize to other language pairs or cultural contexts.",
"## Additional Information",
"### Dataset Curators\n\nCurated by Ariel Guerra-Adames, this dataset combines the Anki and Global Voices datasets into a comprehensive resource for machine translation.",
"### Licensing Information\n\nSee LICENSE"
] |
2425ed74f71a9c536cb68969aa1cfda9e3ccb5e6 |
# Dataset Card for Evaluation run of freecs/ThetaWave-7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [freecs/ThetaWave-7B-v1](https://huggingface.co/freecs/ThetaWave-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_freecs__ThetaWave-7B-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-19T21:50:23.328123](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-7B-v1/blob/main/results_2024-01-19T21-50-23.328123.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6183532771987861,
"acc_stderr": 0.03276673514568147,
"acc_norm": 0.6210579003605273,
"acc_norm_stderr": 0.03342744429879846,
"mc1": 0.386780905752754,
"mc1_stderr": 0.01704885701051511,
"mc2": 0.5596233363143864,
"mc2_stderr": 0.015539275276985304
},
"harness|arc:challenge|25": {
"acc": 0.6177474402730375,
"acc_stderr": 0.014200454049979282,
"acc_norm": 0.6689419795221843,
"acc_norm_stderr": 0.013752062419817834
},
"harness|hellaswag|10": {
"acc": 0.6462856004779924,
"acc_stderr": 0.004771447244095127,
"acc_norm": 0.849133638717387,
"acc_norm_stderr": 0.003571870848731712
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319617,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319617
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467383,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467383
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055263,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055263
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5806451612903226,
"acc_stderr": 0.028071588901091852,
"acc_norm": 0.5806451612903226,
"acc_norm_stderr": 0.028071588901091852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592174,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592174
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.0242831405294673,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.0242831405294673
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.01626567563201035,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.01626567563201035
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917669,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.69,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247323,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.016251139711570772,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.016251139711570772
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.01272570165695364,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.01272570165695364
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988637,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988637
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5920398009950248,
"acc_stderr": 0.03475116365194092,
"acc_norm": 0.5920398009950248,
"acc_norm_stderr": 0.03475116365194092
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.386780905752754,
"mc1_stderr": 0.01704885701051511,
"mc2": 0.5596233363143864,
"mc2_stderr": 0.015539275276985304
},
"harness|winogrande|5": {
"acc": 0.8042620363062352,
"acc_stderr": 0.011151145042218327
},
"harness|gsm8k|5": {
"acc": 0.5269143290371494,
"acc_stderr": 0.013752517189717458
}
}
```
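These figures can also be retrieved programmatically. A minimal sketch using `huggingface_hub` (the filename follows the results link for this run; the exact nesting of the JSON is an assumption):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_freecs__ThetaWave-7B-v1",
    filename="results_2024-01-19T21-50-23.328123.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# The per-task dict shown above is assumed to sit under a "results" key;
# fall back to the top level otherwise.
metrics = raw.get("results", raw)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```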
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_freecs__ThetaWave-7B-v1 | [
"region:us"
] | 2024-01-19T21:52:45+00:00 | {"pretty_name": "Evaluation run of freecs/ThetaWave-7B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [freecs/ThetaWave-7B-v1](https://huggingface.co/freecs/ThetaWave-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freecs__ThetaWave-7B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T21:50:23.328123](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-7B-v1/blob/main/results_2024-01-19T21-50-23.328123.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6183532771987861,\n \"acc_stderr\": 0.03276673514568147,\n \"acc_norm\": 0.6210579003605273,\n \"acc_norm_stderr\": 0.03342744429879846,\n \"mc1\": 0.386780905752754,\n \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.5596233363143864,\n \"mc2_stderr\": 0.015539275276985304\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6177474402730375,\n \"acc_stderr\": 0.014200454049979282,\n \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.013752062419817834\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6462856004779924,\n \"acc_stderr\": 0.004771447244095127,\n \"acc_norm\": 0.849133638717387,\n \"acc_norm_stderr\": 0.003571870848731712\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n 
\"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319617,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319617\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467383,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467383\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055263,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055263\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5806451612903226,\n \"acc_stderr\": 0.028071588901091852,\n \"acc_norm\": 0.5806451612903226,\n \"acc_norm_stderr\": 0.028071588901091852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592174,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592174\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6435897435897436,\n \"acc_stderr\": 0.0242831405294673,\n \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.0242831405294673\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8256880733944955,\n \"acc_stderr\": 0.01626567563201035,\n \"acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.01626567563201035\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917669,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 
0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247323,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n \"acc_stderr\": 0.016251139711570772,\n \"acc_norm\": 0.38212290502793295,\n \"acc_norm_stderr\": 0.016251139711570772\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n \"acc_stderr\": 0.01272570165695364,\n \"acc_norm\": 0.45827900912646674,\n \"acc_norm_stderr\": 0.01272570165695364\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988637,\n \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988637\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5920398009950248,\n \"acc_stderr\": 0.03475116365194092,\n \"acc_norm\": 0.5920398009950248,\n \"acc_norm_stderr\": 0.03475116365194092\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.386780905752754,\n \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.5596233363143864,\n \"mc2_stderr\": 0.015539275276985304\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218327\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5269143290371494,\n \"acc_stderr\": 0.013752517189717458\n }\n}\n```", "repo_url": 
"https://huggingface.co/freecs/ThetaWave-7B-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|arc:challenge|25_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|gsm8k|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hellaswag|10_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-50-23.328123.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-50-23.328123.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-50-23.328123.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T21-50-23.328123.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-50-23.328123.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T21_50_23.328123", "path": ["**/details_harness|winogrande|5_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T21-50-23.328123.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_19T21_50_23.328123", "path": ["results_2024-01-19T21-50-23.328123.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T21-50-23.328123.parquet"]}]}]} | 2024-01-19T21:53:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of freecs/ThetaWave-7B-v1
Dataset automatically created during the evaluation run of model freecs/ThetaWave-7B-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
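The config list can be enumerated without downloading any data; a small sketch (the count should match the 63 configurations mentioned above):

```python
from datasets import get_dataset_config_names

# Enumerate all per-task configs plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_freecs__ThetaWave-7B-v1")
print(len(configs), configs[:3])
```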
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
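A minimal sketch (`harness_winogrande_5` is one of the per-task configs listed in this repo's metadata; the `latest` split always points at the most recent run):

```python
from datasets import load_dataset

# Load the per-sample details for one task of the evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_freecs__ThetaWave-7B-v1",
    "harness_winogrande_5",
    split="latest",
)
```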
## Latest results
These are the latest results from run 2024-01-19T21:50:23.328123 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).
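A hedged sketch for pulling those aggregated metrics via the `results` config (the config and its `latest` split appear in the repo metadata):

```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics for each run;
# "latest" points at the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_freecs__ThetaWave-7B-v1",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the 2024-01-19 run
```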
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of freecs/ThetaWave-7B-v1\n\n\n\nDataset automatically created during the evaluation run of model freecs/ThetaWave-7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-19T21:50:23.328123(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of freecs/ThetaWave-7B-v1\n\n\n\nDataset automatically created during the evaluation run of model freecs/ThetaWave-7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-19T21:50:23.328123(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
5b217458be81582d1834b5e16bd7550a8b1901cd |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | ChloeZeng/PromptLitTrainData | [
"size_categories:n<1K",
"doi:10.57967/hf/1672",
"region:us"
] | 2024-01-19T21:53:50+00:00 | {"size_categories": ["n<1K"]} | 2024-02-05T16:33:55+00:00 | [] | [] | TAGS
#size_categories-n<1K #doi-10.57967/hf/1672 #region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#size_categories-n<1K #doi-10.57967/hf/1672 #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f0801b3e201d4f195960b554d2a9c4ed3a68da71 | # Dataset Card for "Instance_Seg_DB"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | izou3/Instance_Seg_DB | [
"region:us"
] | 2024-01-19T22:07:55+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "annotation", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 8716961.0, "num_examples": 453}, {"name": "test", "num_bytes": 3899703.0, "num_examples": 133}], "download_size": 12403234, "dataset_size": 12616664.0}} | 2024-01-19T22:32:08+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Instance_Seg_DB"
More Information needed | [
"# Dataset Card for \"Instance_Seg_DB\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Instance_Seg_DB\"\n\nMore Information needed"
] |
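The Instance_Seg_DB metadata above declares `image` and `annotation` image features over `train`/`test` splits; here is a minimal inspection sketch under those assumptions (everything beyond the declared names is illustrative):

```python
from datasets import load_dataset

# Feature names ("image", "annotation") and split names ("train", "test")
# are taken from the metadata above; both features decode to PIL images.
ds = load_dataset("izou3/Instance_Seg_DB", split="train")
sample = ds[0]
print(len(ds))                                         # 453 rows per the metadata
print(sample["image"].size, sample["annotation"].size)
```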
89f3b06a02e71e7e9dfec11f9221fa7c9d875d1a | # Dataset Card for "New_Instance_Seg_DB"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | izou3/New_Instance_Seg_DB | [
"region:us"
] | 2024-01-19T22:39:28+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "annotation", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 8716961.0, "num_examples": 453}, {"name": "test", "num_bytes": 3899703.0, "num_examples": 133}], "download_size": 12403234, "dataset_size": 12616664.0}} | 2024-01-19T22:39:30+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "New_Instance_Seg_DB"
More Information needed | [
"# Dataset Card for \"New_Instance_Seg_DB\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"New_Instance_Seg_DB\"\n\nMore Information needed"
] |
6b82709d4feaf3397ec0fda2a2b16734208dd2df |
# Dataset Card for Evaluation run of soniox/Soniox-7B-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [soniox/Soniox-7B-v1.0](https://huggingface.co/soniox/Soniox-7B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_soniox__Soniox-7B-v1.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-19T23:00:10.357078](https://huggingface.co/datasets/open-llm-leaderboard/details_soniox__Soniox-7B-v1.0/blob/main/results_2024-01-19T23-00-10.357078.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results files and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6439461862864996,
"acc_stderr": 0.032256074282620416,
"acc_norm": 0.6467921625949077,
"acc_norm_stderr": 0.032899600410563404,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5384060653321814,
"mc2_stderr": 0.015406940325739558
},
"harness|arc:challenge|25": {
"acc": 0.6092150170648464,
"acc_stderr": 0.014258563880513785,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175452
},
"harness|hellaswag|10": {
"acc": 0.632244572794264,
"acc_stderr": 0.004812088620277182,
"acc_norm": 0.8255327623979287,
"acc_norm_stderr": 0.0037873515193708137
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268528,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.02407869658063548,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.02407869658063548
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8270042194092827,
"acc_stderr": 0.024621562866768427,
"acc_norm": 0.8270042194092827,
"acc_norm_stderr": 0.024621562866768427
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.0349260647662379,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.0349260647662379
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001505,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001505
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.02468531686725781,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.02468531686725781
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.016384638410380823,
"acc_norm": 0.4,
"acc_norm_stderr": 0.016384638410380823
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079067,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578327,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5384060653321814,
"mc2_stderr": 0.015406940325739558
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.5625473843821076,
"acc_stderr": 0.013664299060751915
}
}
```
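The same aggregated figures are stored in the "results" configuration of this dataset; a short sketch of reading them back follows (the config and split names come from this card's own config listing, while the exact column layout of the parquet file is not spelled out here and may vary by harness version):

```python
from datasets import load_dataset

# The "results" config and "latest" split are declared in this dataset's
# config list; the repository path is confirmed by the load example above.
results = load_dataset("open-llm-leaderboard/details_soniox__Soniox-7B-v1.0",
	"results",
	split="latest")
row = results[0]
print(row)  # one row carrying the aggregated metrics shown in the JSON above
```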
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_soniox__Soniox-7B-v1.0 | [
"region:us"
] | 2024-01-19T23:02:28+00:00 | {"pretty_name": "Evaluation run of soniox/Soniox-7B-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [soniox/Soniox-7B-v1.0](https://huggingface.co/soniox/Soniox-7B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_soniox__Soniox-7B-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T23:00:10.357078](https://huggingface.co/datasets/open-llm-leaderboard/details_soniox__Soniox-7B-v1.0/blob/main/results_2024-01-19T23-00-10.357078.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6439461862864996,\n \"acc_stderr\": 0.032256074282620416,\n \"acc_norm\": 0.6467921625949077,\n \"acc_norm_stderr\": 0.032899600410563404,\n \"mc1\": 0.37209302325581395,\n \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5384060653321814,\n \"mc2_stderr\": 0.015406940325739558\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6092150170648464,\n \"acc_stderr\": 0.014258563880513785,\n \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.632244572794264,\n \"acc_stderr\": 0.004812088620277182,\n \"acc_norm\": 0.8255327623979287,\n \"acc_norm_stderr\": 0.0037873515193708137\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268528,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.02407869658063548,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.02407869658063548\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8270042194092827,\n \"acc_stderr\": 0.024621562866768427,\n \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.024621562866768427\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.0349260647662379,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.0349260647662379\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8135376756066411,\n \"acc_stderr\": 0.013927751372001505,\n \"acc_norm\": 0.8135376756066411,\n \"acc_norm_stderr\": 0.013927751372001505\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.02468531686725781,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.02468531686725781\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.016384638410380823,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.016384638410380823\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079067,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079067\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578327,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5384060653321814,\n \"mc2_stderr\": 0.015406940325739558\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5625473843821076,\n \"acc_stderr\": 0.013664299060751915\n }\n}\n```", "repo_url": 
"https://huggingface.co/soniox/Soniox-7B-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|arc:challenge|25_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|gsm8k|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hellaswag|10_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T23-00-10.357078.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T23-00-10.357078.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T23-00-10.357078.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T23-00-10.357078.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T23-00-10.357078.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T23_00_10.357078", "path": ["**/details_harness|winogrande|5_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T23-00-10.357078.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_19T23_00_10.357078", "path": ["results_2024-01-19T23-00-10.357078.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T23-00-10.357078.parquet"]}]}]} | 2024-01-19T23:03:11+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of soniox/Soniox-7B-v1.0
Dataset automatically created during the evaluation run of model soniox/Soniox-7B-v1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
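A minimal sketch, assuming the repo id follows the leaderboard's `details_<org>__<model>` naming pattern (the config name is one of those listed in this card's metadata):

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard's "details_<org>__<model>"
# naming pattern -- an assumption, not stated explicitly in this card
data = load_dataset("open-llm-leaderboard/details_soniox__Soniox-7B-v1.0",
                    "harness_winogrande_5",
                    split="train")
```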
## Latest results
These are the latest results from run 2024-01-19T23:00:10.357078 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of soniox/Soniox-7B-v1.0\n\n\n\nDataset automatically created during the evaluation run of model soniox/Soniox-7B-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-19T23:00:10.357078(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of soniox/Soniox-7B-v1.0\n\n\n\nDataset automatically created during the evaluation run of model soniox/Soniox-7B-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-19T23:00:10.357078(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d223a227ee21fca3dbab9605d3fbcdcc2e33abc5 |
# Dataset Card for "Global Wheat Head Dataset 2021" 😊
For any updates on the Global Wheat Dataset Community, visit https://www.global-wheat.com/
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Composition](#dataset-composition)
- [Usage](#usage)
- [Citation](#citation)
- [Acknowledgements](#acknowledgements)
## Dataset Description
- **Creators**: Etienne David and others
- **Published**: July 12, 2021 | Version 1.0
- **Availability**: [Zenodo Link](https://doi.org/10.5281/zenodo.5092309)
- **Keywords**: Deep Learning, Wheat Counting, Plant Phenotyping
### Introduction
Wheat is essential for a large part of humanity. The "Global Wheat Head Dataset 2021" aims to support the development of deep learning models for wheat head detection. This dataset addresses challenges like overlapping plants and varying conditions across global wheat fields. It's a step towards automating plant phenotyping and enhancing agricultural practices. 🌾
### Dataset Composition
- **Images**: Over 6000, Resolution - 1024x1024 pixels
- **Annotations**: 300k+ unique wheat heads with bounding boxes
- **Geographic Coverage**: Images from 11 countries
- **Domains**: Various, including sensor types and locations
- **Splits**: Training (Europe & Canada), Test (Other regions)
## Dataset Composition
### Files and Structure
- **Images**: Folder containing all images (`.png`)
- **CSV Files**: `competition_train.csv`, `competition_val.csv`, `competition_test.csv` for different dataset splits
- **Metadata**: `Metadata.csv` with additional details
### Labels
- **Format**: CSV with columns - image_name, BoxesString, domain
- **BoxesString**: `[x_min,y_min, x_max,y_max]` format for bounding boxes
- **Domain**: Specifies the image domain
## Usage
### Tutorials and Resources
- Tutorials available at [AIcrowd Challenge Page](https://www.aicrowd.com/challenges/global-wheat-challenge-2021)
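For a quick start, here is a minimal loading sketch (repo id and field names are taken from this dataset's Hub metadata; `train` is one of the three published splits):

```python
from datasets import load_dataset

# Features include: image, domain, country, location, development_stage,
# and objects (with "boxes" and "categories" sequences)
dataset = load_dataset("Etienne-David/GlobalWheatHeadDataset2021", split="train")

example = dataset[0]
print(example["domain"], example["country"])
print(example["objects"]["boxes"][:3])  # [x_min, y_min, x_max, y_max] boxes
```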
### License
- **Type**: Creative Commons Attribution 4.0 International (cc-by-4.0)
- **Details**: Free to use with attribution
## Citation
If you use this dataset in your research, please cite the following:
```bibtex
@article{david2020global,
title={Global Wheat Head Detection (GWHD) dataset: a large and diverse dataset of high-resolution RGB-labelled images to develop and benchmark wheat head detection methods},
author={David, Etienne and others},
journal={Plant Phenomics},
volume={2020},
year={2020},
publisher={Science Partner Journal}
}
@misc{david2021global,
title={Global Wheat Head Dataset 2021: more diversity to improve the benchmarking of wheat head localization methods},
author={Etienne David and others},
year={2021},
eprint={2105.07660},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
## Acknowledgements
Special thanks to all the contributors, researchers, and institutions that played a pivotal role in the creation of this dataset. Your efforts are helping to advance the field of agricultural sciences and technology. 👏
| Etienne-David/GlobalWheatHeadDataset2021 | [
"task_categories:object-detection",
"language:en",
"license:cc-by-4.0",
"agriculture",
"biology",
"arxiv:2105.07660",
"region:us"
] | 2024-01-19T23:23:31+00:00 | {"language": ["en"], "license": "cc-by-4.0", "task_categories": ["object-detection"], "pretty_name": "Global Wheat Head", "tags": ["agriculture", "biology"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "domain", "dtype": "string"}, {"name": "country", "dtype": "string"}, {"name": "location", "dtype": "string"}, {"name": "development_stage", "dtype": "string"}, {"name": "objects", "struct": [{"name": "boxes", "sequence": {"sequence": "int64"}}, {"name": "categories", "sequence": "int64"}]}], "splits": [{"name": "train", "num_bytes": 701105106.93, "num_examples": 3655}, {"name": "validation", "num_bytes": 264366740.324, "num_examples": 1476}, {"name": "test", "num_bytes": 301053063.17, "num_examples": 1381}], "download_size": 1260938177, "dataset_size": 1266524910.424}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-19T23:27:41+00:00 | [
"2105.07660"
] | [
"en"
] | TAGS
#task_categories-object-detection #language-English #license-cc-by-4.0 #agriculture #biology #arxiv-2105.07660 #region-us
|
# Dataset Card for "Global Wheat Head Dataset 2021"
For any updates on the Global Wheat Dataset Community, visit URL
## Table of Contents
- Dataset Description
- Dataset Composition
- Usage
- Citation
- Acknowledgements
## Dataset Description
- Creators: Etienne David and others
- Published: July 12, 2021 | Version 1.0
- Availability: Zenodo Link
- Keywords: Deep Learning, Wheat Counting, Plant Phenotyping
### Introduction
Wheat is essential for a large part of humanity. The "Global Wheat Head Dataset 2021" aims to support the development of deep learning models for wheat head detection. This dataset addresses challenges like overlapping plants and varying conditions across global wheat fields. It's a step towards automating plant phenotyping and enhancing agricultural practices.
### Dataset Composition
- Images: Over 6000, Resolution - 1024x1024 pixels
- Annotations: 300k+ unique wheat heads with bounding boxes
- Geographic Coverage: Images from 11 countries
- Domains: Various, including sensor types and locations
- Splits: Training (Europe & Canada), Test (Other regions)
## Dataset Composition
### Files and Structure
- Images: Folder containing all images ('.png')
- CSV Files: 'competition_train.csv', 'competition_val.csv', 'competition_test.csv' for different dataset splits
- Metadata: 'URL' with additional details
### Labels
- Format: CSV with columns - image_name, BoxesString, domain
- BoxesString: '[x_min,y_min, x_max,y_max]' format for bounding boxes
- Domain: Specifies the image domain
## Usage
### Tutorials and Resources
- Tutorials available at AIcrowd Challenge Page
### License
- Type: Creative Commons Attribution 4.0 International (cc-by-4.0)
- Details: Free to use with attribution
If you use this dataset in your research, please cite the following:
## Acknowledgements
Special thanks to all the contributors, researchers, and institutions that played a pivotal role in the creation of this dataset. Your efforts are helping to advance the field of agricultural sciences and technology.
| [
"# Dataset Card for \"Global Wheat Head Dataset 2021\" \n\nIf you want any update on the Global Wheat Dataset Community, go on URL",
"## Table of Contents\n- Dataset Description\n- Dataset Composition\n- Usage\n- Citation\n- Acknowledgements",
"## Dataset Description\n\n- Creators: Etienne David and others\n- Published: July 12, 2021 | Version 1.0\n- Availability: Zenodo Link\n- Keywords: Deep Learning, Wheat Counting, Plant Phenotyping",
"### Introduction\nWheat is essential for a large part of humanity. The \"Global Wheat Head Dataset 2021\" aims to support the development of deep learning models for wheat head detection. This dataset addresses challenges like overlapping plants and varying conditions across global wheat fields. It's a step towards automating plant phenotyping and enhancing agricultural practices.",
"### Dataset Composition\n- Images: Over 6000, Resolution - 1024x1024 pixels\n- Annotations: 300k+ unique wheat heads with bounding boxes\n- Geographic Coverage: Images from 11 countries\n- Domains: Various, including sensor types and locations\n- Splits: Training (Europe & Canada), Test (Other regions)",
"## Dataset Composition",
"### Files and Structure\n- Images: Folder containing all images ('.png')\n- CSV Files: 'competition_train.csv', 'competition_val.csv', 'competition_test.csv' for different dataset splits\n- Metadata: 'URL' with additional details",
"### Labels\n- Format: CSV with columns - image_name, BoxesString, domain\n- BoxesString: '[x_min,y_min, x_max,y_max]' format for bounding boxes\n- Domain: Specifies the image domain",
"## Usage",
"### Tutorials and Resources\n- Tutorials available at AIcrowd Challenge Page",
"### License\n- Type: Creative Commons Attribution 4.0 International (cc-by-4.0)\n- Details: Free to use with attribution\n\nIf you use this dataset in your research, please cite the following:",
"## Acknowledgements\n\nSpecial thanks to all the contributors, researchers, and institutions that played a pivotal role in the creation of this dataset. Your efforts are helping to advance the field of agricultural sciences and technology."
] | [
"TAGS\n#task_categories-object-detection #language-English #license-cc-by-4.0 #agriculture #biology #arxiv-2105.07660 #region-us \n",
"# Dataset Card for \"Global Wheat Head Dataset 2021\" \n\nIf you want any update on the Global Wheat Dataset Community, go on URL",
"## Table of Contents\n- Dataset Description\n- Dataset Composition\n- Usage\n- Citation\n- Acknowledgements",
"## Dataset Description\n\n- Creators: Etienne David and others\n- Published: July 12, 2021 | Version 1.0\n- Availability: Zenodo Link\n- Keywords: Deep Learning, Wheat Counting, Plant Phenotyping",
"### Introduction\nWheat is essential for a large part of humanity. The \"Global Wheat Head Dataset 2021\" aims to support the development of deep learning models for wheat head detection. This dataset addresses challenges like overlapping plants and varying conditions across global wheat fields. It's a step towards automating plant phenotyping and enhancing agricultural practices.",
"### Dataset Composition\n- Images: Over 6000, Resolution - 1024x1024 pixels\n- Annotations: 300k+ unique wheat heads with bounding boxes\n- Geographic Coverage: Images from 11 countries\n- Domains: Various, including sensor types and locations\n- Splits: Training (Europe & Canada), Test (Other regions)",
"## Dataset Composition",
"### Files and Structure\n- Images: Folder containing all images ('.png')\n- CSV Files: 'competition_train.csv', 'competition_val.csv', 'competition_test.csv' for different dataset splits\n- Metadata: 'URL' with additional details",
"### Labels\n- Format: CSV with columns - image_name, BoxesString, domain\n- BoxesString: '[x_min,y_min, x_max,y_max]' format for bounding boxes\n- Domain: Specifies the image domain",
"## Usage",
"### Tutorials and Resources\n- Tutorials available at AIcrowd Challenge Page",
"### License\n- Type: Creative Commons Attribution 4.0 International (cc-by-4.0)\n- Details: Free to use with attribution\n\nIf you use this dataset in your research, please cite the following:",
"## Acknowledgements\n\nSpecial thanks to all the contributors, researchers, and institutions that played a pivotal role in the creation of this dataset. Your efforts are helping to advance the field of agricultural sciences and technology."
] |
a1bd4ebf9f5c6227eb4c8f13b02bda89c65c56c7 |
Conversion of the [starfishmedical/webGPT_x_dolly](https://huggingface.co/datasets/starfishmedical/webGPT_x_dolly) dataset for use in pretraining.
Python code used for conversion:
```python
from datasets import load_dataset
import pandas

# Load the source instruction/answer pairs from the Hugging Face Hub
dataset = load_dataset("starfishmedical/webGPT_x_dolly", split="train")

# Merge each instruction and its answer into a single plain-text sample
def format(columns):
    question = columns["instruction"].strip()
    answer = columns["output"].strip()
    return f"{question}\n\n{answer}"

# Write the one-column corpus to CSV for use in pretraining
pandas.DataFrame({"text": [format(columns) for columns in dataset]}).to_csv("train.csv", index=False)
```
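The resulting `train.csv` can then be loaded back as a plain-text corpus, for instance:

```python
from datasets import load_dataset

pretrain_dataset = load_dataset("csv", data_files="train.csv", split="train")
print(pretrain_dataset[0]["text"])
```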
| Felladrin/pretrain-webGPT_x_dolly | [
"source_datasets:starfishmedical/webGPT_x_dolly",
"license:cc-by-sa-3.0",
"region:us"
] | 2024-01-19T23:35:12+00:00 | {"license": "cc-by-sa-3.0", "source_datasets": ["starfishmedical/webGPT_x_dolly"]} | 2024-01-20T19:02:19+00:00 | [] | [] | TAGS
#source_datasets-starfishmedical/webGPT_x_dolly #license-cc-by-sa-3.0 #region-us
|
Conversion of the starfishmedical/webGPT_x_dolly dataset for use in pretraining.
Python code used for conversion:
| [] | [
"TAGS\n#source_datasets-starfishmedical/webGPT_x_dolly #license-cc-by-sa-3.0 #region-us \n"
] |
dd35505d38e7b1f1c7a410ae792983dad35626f7 |
[Natural Instructions v2](https://instructions.apps.allenai.org/) processed for preference learning, using the negatives from the example outputs | euclaise/naturalinstructions2_preferences | [
"region:us"
] | 2024-01-20T00:01:45+00:00 | {"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "positive", "dtype": "string"}, {"name": "negative", "dtype": "string"}, {"name": "positive_explanation", "dtype": "string"}, {"name": "negative_explanation", "dtype": "string"}, {"name": "source", "sequence": "string"}, {"name": "instruction", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 1577137, "num_examples": 2040}], "download_size": 430836, "dataset_size": 1577137}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-20T00:13:16+00:00 | [] | [] | TAGS
#region-us
|
Natural Instructions v2 processed for preference learning, using the negatives from the example outputs | [] | [
"TAGS\n#region-us \n"
] |
edf82557e770d733843a6cd9eb4780f4ba7b6e74 |
Conversion of the [euclaise/reddit-instruct](https://huggingface.co/datasets/euclaise/reddit-instruct) dataset for use in pretraining.
Python code used for conversion:
```python
from datasets import load_dataset
import pandas
import html

# Load the source Reddit instruction dataset from the Hugging Face Hub
dataset = load_dataset("euclaise/reddit-instruct", split="train")

# Keep only the comment text, unescaping HTML entities such as "&amp;"
def format(columns):
    return html.unescape(columns["comment_text"].strip())

# Write the one-column corpus to CSV for use in pretraining
pandas.DataFrame({"text": [format(columns) for columns in dataset]}).to_csv("train.csv", index=False)
```
| Felladrin/pretrain-reddit-instruct | [
"source_datasets:euclaise/reddit-instruct",
"license:mit",
"region:us"
] | 2024-01-20T00:09:19+00:00 | {"license": "mit", "source_datasets": ["euclaise/reddit-instruct"]} | 2024-01-20T19:01:51+00:00 | [] | [] | TAGS
#source_datasets-euclaise/reddit-instruct #license-mit #region-us
|
Conversion of the euclaise/reddit-instruct dataset for use in pretraining.
Python code used for conversion:
| [] | [
"TAGS\n#source_datasets-euclaise/reddit-instruct #license-mit #region-us \n"
] |
2d5009523581f1143c10276591698df4be8071a3 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | DharamArora/GAMSAT-Essays | [
"region:us"
] | 2024-01-20T00:19:29+00:00 | {} | 2024-01-20T01:24:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
39e7b4326c31ab0b9d3fbbbd832c7e3d184f4fee | This dataset is based on the Japanese Wikipedia dataset, converted into a multi-turn conversation format using llama2Pro8B.
Since it is released under the Llama 2 license, it can be used commercially in services.
Some strange dialogue may be included, as the data has not been screened by humans.
We generated 60,000 conversations over 18 days on an A100 80GB x7 machine and automatically screened them.
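A minimal loading sketch (repo id as published on the Hub; the `train` split name is an assumption):

```python
from datasets import load_dataset

# Split name is assumed -- check the repo's dataset viewer for the actual splits
dataset = load_dataset("shi3z/ja_conv_wikipedia_llama2pro8b_20k", split="train")
print(dataset[0])
```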
# Model
https://huggingface.co/spaces/TencentARC/LLaMA-Pro-8B-Instruct-Chat
# Dataset
https://huggingface.co/datasets/izumi-lab/wikipedia-ja-20230720
# Compute by
Tsuginosuke AI SuperComputer
FreeAI Ltd.
https://free-ai.ltd | shi3z/ja_conv_wikipedia_llama2pro8b_20k | [
"task_categories:conversational",
"size_categories:10K<n<100K",
"language:ja",
"license:llama2",
"region:us"
] | 2024-01-20T01:15:26+00:00 | {"language": ["ja"], "license": "llama2", "size_categories": ["10K<n<100K"], "task_categories": ["conversational"]} | 2024-01-20T01:17:11+00:00 | [] | [
"ja"
] | TAGS
#task_categories-conversational #size_categories-10K<n<100K #language-Japanese #license-llama2 #region-us
| This dataset is based on the Japanese Wikipedia dataset, converted into a multi-turn conversation format using llama2Pro8B.
Since it is released under the Llama 2 license, it can be used commercially in services.
Some strange dialogue may be included, as the data has not been screened by humans.
We generated 60,000 conversations over 18 days on an A100 80GB x7 machine and automatically screened them.
# Model
URL
# Dataset
URL
# Compute by
Tsuginosuke AI SuperComputer
FreeAI Ltd.
URL | [
"# Model\nURL",
"# Dataset\nURL",
"# Compute by\nTsuginosuke AI SuperComputer\nFreeAI Ltd.\n\nURL"
] | [
"TAGS\n#task_categories-conversational #size_categories-10K<n<100K #language-Japanese #license-llama2 #region-us \n",
"# Model\nURL",
"# Dataset\nURL",
"# Compute by\nTsuginosuke AI SuperComputer\nFreeAI Ltd.\n\nURL"
] |
1ed1229b97478c41ba1a5b21acad20fc32aa9a74 |
# Dataset Card for Evaluation run of sonthenguyen/NeuralHermes-2.5-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sonthenguyen/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/sonthenguyen/NeuralHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sonthenguyen__NeuralHermes-2.5-Mistral-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-20T01:20:50.595952](https://huggingface.co/datasets/open-llm-leaderboard/details_sonthenguyen__NeuralHermes-2.5-Mistral-7B/blob/main/results_2024-01-20T01-20-50.595952.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.634329258699238,
"acc_stderr": 0.032396506106842936,
"acc_norm": 0.6387564807018344,
"acc_norm_stderr": 0.033037752578380465,
"mc1": 0.386780905752754,
"mc1_stderr": 0.017048857010515107,
"mc2": 0.5597610850445724,
"mc2_stderr": 0.015477994996792073
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.01406326027988242,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.013678810399518824
},
"harness|hellaswag|10": {
"acc": 0.6674965146385182,
"acc_stderr": 0.004701474865207031,
"acc_norm": 0.8569010157339175,
"acc_norm_stderr": 0.0034945810763985403
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477518,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477518
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993457,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32849162011173183,
"acc_stderr": 0.015707935398496447,
"acc_norm": 0.32849162011173183,
"acc_norm_stderr": 0.015707935398496447
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824782,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.01274307294265334,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.01274307294265334
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406762,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406762
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.386780905752754,
"mc1_stderr": 0.017048857010515107,
"mc2": 0.5597610850445724,
"mc2_stderr": 0.015477994996792073
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089688
},
"harness|gsm8k|5": {
"acc": 0.45716451857467777,
"acc_stderr": 0.013721849968709723
}
}
```
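Because the results above are plain JSON, they can be post-processed without any special tooling. The minimal sketch below assumes the block has been saved to a local file named `results.json` (the filename is an illustrative assumption); it extracts the per-task `acc_norm` scores for the MMLU (`hendrycksTest`) subtasks and averages them.

```python
import json

# Load the aggregated results shown above (the filename is an assumption;
# adjust it to wherever you saved the JSON block).
with open("results.json") as f:
    results = json.load(f)

# Collect acc_norm for every MMLU ("hendrycksTest") subtask.
mmlu_scores = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest")
}

average = sum(mmlu_scores.values()) / len(mmlu_scores)
print(f"{len(mmlu_scores)} MMLU subtasks, mean acc_norm = {average:.4f}")
```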
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-20T01:23:09+00:00 | {"pretty_name": "Evaluation run of sonthenguyen/NeuralHermes-2.5-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [sonthenguyen/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/sonthenguyen/NeuralHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sonthenguyen__NeuralHermes-2.5-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T01:20:50.595952](https://huggingface.co/datasets/open-llm-leaderboard/details_sonthenguyen__NeuralHermes-2.5-Mistral-7B/blob/main/results_2024-01-20T01-20-50.595952.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.634329258699238,\n \"acc_stderr\": 0.032396506106842936,\n \"acc_norm\": 0.6387564807018344,\n \"acc_norm_stderr\": 0.033037752578380465,\n \"mc1\": 0.386780905752754,\n \"mc1_stderr\": 0.017048857010515107,\n \"mc2\": 0.5597610850445724,\n \"mc2_stderr\": 0.015477994996792073\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.01406326027988242,\n \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518824\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6674965146385182,\n \"acc_stderr\": 0.004701474865207031,\n \"acc_norm\": 0.8569010157339175,\n \"acc_norm_stderr\": 0.0034945810763985403\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 
0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n 
\"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32849162011173183,\n \"acc_stderr\": 0.015707935398496447,\n \"acc_norm\": 0.32849162011173183,\n \"acc_norm_stderr\": 0.015707935398496447\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824782,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824782\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.01274307294265334,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.01274307294265334\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406762,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406762\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.386780905752754,\n \"mc1_stderr\": 0.017048857010515107,\n \"mc2\": 0.5597610850445724,\n \"mc2_stderr\": 0.015477994996792073\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089688\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.45716451857467777,\n \"acc_stderr\": 0.013721849968709723\n }\n}\n```", "repo_url": "https://huggingface.co/sonthenguyen/NeuralHermes-2.5-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|arc:challenge|25_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|gsm8k|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hellaswag|10_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T01-20-50.595952.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T01-20-50.595952.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T01-20-50.595952.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T01-20-50.595952.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T01-20-50.595952.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["**/details_harness|winogrande|5_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-20T01-20-50.595952.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T01_20_50.595952", "path": ["results_2024-01-20T01-20-50.595952.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T01-20-50.595952.parquet"]}]}]} | 2024-01-20T01:23:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sonthenguyen/NeuralHermes-2.5-Mistral-7B
Dataset automatically created during the evaluation run of model sonthenguyen/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-20T01:20:50.595952(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of sonthenguyen/NeuralHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model sonthenguyen/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T01:20:50.595952(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of sonthenguyen/NeuralHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model sonthenguyen/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T01:20:50.595952(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6f54da5a189c0d2da97e38c72cb8a55b7ade4f54 |
## Python Copilot Large Coding Dataset
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based on the code), returns (ordered based on the code), arguments (ordered based on the code), and more.
- Rows: 2350782
- Size: 3.1 GB
- Data type: text
- Format: Extracted code using python AST
### Schema
```json
{
"args": "string",
"class_bases": "string",
"class_docstr": "string",
"class_docstr_tok": "string",
"class_name": "string",
"code": "string",
"code_tok": "string",
"docstr": "string",
"docstr_tok": "string",
"file_path": "string",
"filename": "string",
"imports": "string",
"is_member": "bool",
"label_desc": "string",
"label_desc_len": "int64",
"label_id": "string",
"lend": "int64",
"lstart": "int64",
"name": "string",
"num_all_bases": "float64",
"num_bases": "float64",
"num_classes": "float64",
"num_functions": "int64",
"num_imports": "int64",
"num_methods": "float64",
"raises": "string",
"returns": "string",
"total_objects": "int64"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-copilot-training-from-many-repos-large", data_dir="files")
```
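As a follow-up, here is a minimal sketch of inspecting a single record without downloading the full ~3.1 GB up front; `streaming=True` and the default `train` split name are assumptions about this repository's layout, not guarantees.

```python
from datasets import load_dataset

# Stream the parquet files instead of downloading all ~3.1 GB up front.
ds = load_dataset(
    "matlok/python-copilot-training-from-many-repos-large",
    data_dir="files",
    streaming=True,
)

# Assuming the default split is named "train", peek at one record.
row = next(iter(ds["train"]))
print(row["name"])        # method or function name
print(row["file_path"])   # source file identifier
print(row["code"][:200])  # first 200 characters of the extracted code
```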
| matlok/python-copilot-training-from-many-repos-large | [
"task_categories:text-generation",
"task_ids:parsing",
"size_categories:100K<n<1M",
"size_categories:1M<n<10M",
"license:other",
"python-copilot",
"python-coding",
"fine-tuning",
"training",
"alpaca",
"text",
"coding",
"region:us"
] | 2024-01-20T02:02:03+00:00 | {"license": ["other"], "size_categories": ["100K<n<1M", "1M<n<10M"], "task_categories": ["text-generation"], "task_ids": ["parsing"], "pretty_name": "python copilot large coding dataset", "dataset_info": [{"config_name": "view_schema", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "view_schema", "data_files": [{"split": "view_schema", "path": "files/lok-python-code-large-v1_00000013.parquet"}]}], "tags": ["python-copilot", "python-coding", "fine-tuning", "training", "alpaca", "text", "coding"]} | 2024-01-25T21:08:22+00:00 | [] | [] | TAGS
#task_categories-text-generation #task_ids-parsing #size_categories-100K<n<1M #size_categories-1M<n<10M #license-other #python-copilot #python-coding #fine-tuning #training #alpaca #text #coding #region-us
|
## Python Copilot Large Coding Dataset
This dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.
### Details
Each row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based on the code), returns (ordered based on the code), arguments (ordered based on the code), and more.
- Rows: 2350782
- Size: 3.1 GB
- Data type: text
- Format: Extracted code using python AST
### Schema
### How to use the dataset
| [
"## Python Copilot Large Coding Dataset\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more.\n\n- Rows: 2350782\n- Size: 3.1 GB\n- Data type: text\n- Format: Extracted code using python AST",
"### Schema",
"### How to use the dataset"
] | [
"TAGS\n#task_categories-text-generation #task_ids-parsing #size_categories-100K<n<1M #size_categories-1M<n<10M #license-other #python-copilot #python-coding #fine-tuning #training #alpaca #text #coding #region-us \n",
"## Python Copilot Large Coding Dataset\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more.\n\n- Rows: 2350782\n- Size: 3.1 GB\n- Data type: text\n- Format: Extracted code using python AST",
"### Schema",
"### How to use the dataset"
] |
be8216a952126c050c8ab501918b7cdab2e0f479 |
## Python Copilot Image Training using Class Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each row contains a png file in the **dbytes** column.
- Rows: 312277
- Size: 304.3 GB
- Data type: png
- Format: Knowledge graph using NetworkX with alpaca text box
### Schema
The png is in the **dbytes** column:
```
{
"dbytes": "binary",
"dbytes_len": "int64",
"dbytes_mb": "float64",
"filename": "string",
"path": "string",
"repo": "string",
"type": "string"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-image-copilot-training-using-class-knowledge-graphs", data_dir="files")
```
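A minimal sketch of rendering one knowledge-graph image from the **dbytes** column follows; it assumes the column holds raw PNG bytes (consistent with the schema above) and that the default split is named `train`.

```python
import io

from datasets import load_dataset
from PIL import Image

ds = load_dataset(
    "matlok/python-image-copilot-training-using-class-knowledge-graphs",
    data_dir="files",
    streaming=True,  # avoid downloading ~304 GB up front
)

row = next(iter(ds["train"]))                  # assuming the default split is "train"
image = Image.open(io.BytesIO(row["dbytes"]))  # dbytes holds the raw png bytes
image.save("knowledge_graph.png")
```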
| matlok/python-image-copilot-training-using-class-knowledge-graphs | [
"task_categories:text-to-image",
"task_categories:image-to-image",
"task_categories:question-answering",
"task_ids:parsing",
"size_categories:100K<n<1M",
"license:other",
"python-copilot",
"python-coding",
"python-architecture",
"knowledge-graphs",
"multimodal",
"text-image-audio",
"fine-tuning",
"training",
"question-answering",
"image-knowledge-graph",
"alpaca",
"mp3",
"png",
"text",
"instruct",
"class",
"classes",
"region:us"
] | 2024-01-20T02:03:43+00:00 | {"license": ["other"], "size_categories": ["100K<n<1M"], "task_categories": ["text-to-image", "image-to-image", "question-answering"], "task_ids": ["parsing"], "pretty_name": "python copilot image training using class knowledge graphs", "dataset_info": [{"config_name": "view_schema", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "view_schema", "data_files": [{"split": "view_schema", "path": "files/lok-python-copilot-img.class-v1-00003130.parquet"}]}], "tags": ["python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "class", "classes"]} | 2024-01-25T18:49:09+00:00 | [] | [] | TAGS
#task_categories-text-to-image #task_categories-image-to-image #task_categories-question-answering #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #class #classes #region-us
|
## Python Copilot Image Training using Class Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.
### Details
Each row contains a png file in the dbytes column.
- Rows: 312277
- Size: 304.3 GB
- Data type: png
- Format: Knowledge graph using NetworkX with alpaca text box
### Schema
The png is in the dbytes column:
### How to use the dataset
| [
"## Python Copilot Image Training using Class Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains a png file in the dbytes column.\n\n- Rows: 312277\n- Size: 304.3 GB\n- Data type: png\n- Format: Knowledge graph using NetworkX with alpaca text box",
"### Schema\n\nThe png is in the dbytes column:",
"### How to use the dataset"
] | [
"TAGS\n#task_categories-text-to-image #task_categories-image-to-image #task_categories-question-answering #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #class #classes #region-us \n",
"## Python Copilot Image Training using Class Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains a png file in the dbytes column.\n\n- Rows: 312277\n- Size: 304.3 GB\n- Data type: png\n- Format: Knowledge graph using NetworkX with alpaca text box",
"### Schema\n\nThe png is in the dbytes column:",
"### How to use the dataset"
] |
6ab9f30422de2a8ae8a35890938af57ff0de852f |
## Python Copilot Image Training using Function Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each row contains a png file in the **dbytes** column.
- Rows: 134357
- Size: 130.5 GB
- Data type: png
- Format: Knowledge graph using NetworkX with alpaca text box
### Schema
The png is in the **dbytes** column:
```
{
"dbytes": "binary",
"dbytes_len": "int64",
"dbytes_mb": "float64",
"filename": "string",
"path": "string",
"repo": "string",
"type": "string"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-image-copilot-training-using-function-knowledge-graphs", data_dir="files")
```
| matlok/python-image-copilot-training-using-function-knowledge-graphs | [
"task_categories:text-to-image",
"task_categories:image-to-image",
"task_categories:question-answering",
"task_ids:parsing",
"size_categories:100K<n<1M",
"license:other",
"python-copilot",
"python-coding",
"python-architecture",
"knowledge-graphs",
"multimodal",
"text-image-audio",
"fine-tuning",
"training",
"question-answering",
"image-knowledge-graph",
"alpaca",
"mp3",
"png",
"text",
"instruct",
"function",
"functions",
"region:us"
] | 2024-01-20T02:06:17+00:00 | {"license": ["other"], "size_categories": ["100K<n<1M"], "task_categories": ["text-to-image", "image-to-image", "question-answering"], "task_ids": ["parsing"], "pretty_name": "python copilot image training using function knowledge graphs", "dataset_info": [{"config_name": "view_schema", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "view_schema", "data_files": [{"split": "view_schema", "path": "files/lok-python-copilot-img.func-v1_00001364.parquet"}]}], "tags": ["python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "function", "functions"]} | 2024-01-25T18:51:56+00:00 | [] | [] | TAGS
#task_categories-text-to-image #task_categories-image-to-image #task_categories-question-answering #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #function #functions #region-us
|
## Python Copilot Image Training using Function Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.
### Details
Each row contains a png file in the dbytes column.
- Rows: 134357
- Size: 130.5 GB
- Data type: png
- Format: Knowledge graph using NetworkX with alpaca text box
### Schema
The png is in the dbytes column:
### How to use the dataset
| [
"## Python Copilot Image Training using Function Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains a png file in the dbytes column.\n\n- Rows: 134357\n- Size: 130.5 GB\n- Data type: png\n- Format: Knowledge graph using NetworkX with alpaca text box",
"### Schema\n\nThe png is in the dbytes column:",
"### How to use the dataset"
] | [
"TAGS\n#task_categories-text-to-image #task_categories-image-to-image #task_categories-question-answering #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #function #functions #region-us \n",
"## Python Copilot Image Training using Function Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains a png file in the dbytes column.\n\n- Rows: 134357\n- Size: 130.5 GB\n- Data type: png\n- Format: Knowledge graph using NetworkX with alpaca text box",
"### Schema\n\nThe png is in the dbytes column:",
"### How to use the dataset"
] |
6da5011859ce06240d4eda2234358545f3b59a86 |
## Python Copilot Image Training using Inheritance and Polymorphism Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each row contains a png file in the **dbytes** column.
- Rows: 259017
- Size: 135.2 GB
- Data type: png
- Format: Knowledge graph using NetworkX with alpaca text box
### Schema
The png is in the **dbytes** column:
```
{
"dbytes": "binary",
"dbytes_len": "int64",
"dbytes_mb": "float64",
"filename": "string",
"path": "string",
"repo": "string",
"type": "string"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-image-copilot-training-using-inheritance-knowledge-graphs", data_dir="files")
```
| matlok/python-image-copilot-training-using-inheritance-knowledge-graphs | [
"task_categories:text-to-image",
"task_categories:image-to-image",
"task_categories:question-answering",
"task_ids:parsing",
"size_categories:100K<n<1M",
"license:other",
"python-copilot",
"python-coding",
"python-architecture",
"knowledge-graphs",
"multimodal",
"text-image-audio",
"fine-tuning",
"training",
"question-answering",
"image-knowledge-graph",
"alpaca",
"mp3",
"png",
"text",
"instruct",
"base_class",
"base_classes",
"inheritance",
"polymorphism",
"region:us"
] | 2024-01-20T02:11:17+00:00 | {"license": ["other"], "size_categories": ["100K<n<1M"], "task_categories": ["text-to-image", "image-to-image", "question-answering"], "task_ids": ["parsing"], "pretty_name": "python copilot image training using inheritance and polymorphism knowledge graphs", "dataset_info": [{"config_name": "view_schema", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "view_schema", "data_files": [{"split": "view_schema", "path": "files/lok-python-copilot-img.base-v1-00000610.parquet"}]}], "tags": ["python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "base_class", "base_classes", "inheritance", "polymorphism"]} | 2024-01-25T18:52:50+00:00 | [] | [] | TAGS
#task_categories-text-to-image #task_categories-image-to-image #task_categories-question-answering #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #base_class #base_classes #inheritance #polymorphism #region-us
|
## Python Copilot Image Training using Inheritance and Polymorphism Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.
### Details
Each row contains a png file in the dbytes column.
- Rows: 259017
- Size: 135.2 GB
- Data type: png
- Format: Knowledge graph using NetworkX with alpaca text box
### Schema
The png is in the dbytes column:
### How to use the dataset
| [
"## Python Copilot Image Training using Inheritance and Polymorphism Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains a png file in the dbytes column.\n\n- Rows: 259017\n- Size: 135.2 GB\n- Data type: png\n- Format: Knowledge graph using NetworkX with alpaca text box",
"### Schema\n\nThe png is in the dbytes column:",
"### How to use the dataset"
] | [
"TAGS\n#task_categories-text-to-image #task_categories-image-to-image #task_categories-question-answering #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #base_class #base_classes #inheritance #polymorphism #region-us \n",
"## Python Copilot Image Training using Inheritance and Polymorphism Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains a png file in the dbytes column.\n\n- Rows: 259017\n- Size: 135.2 GB\n- Data type: png\n- Format: Knowledge graph using NetworkX with alpaca text box",
"### Schema\n\nThe png is in the dbytes column:",
"### How to use the dataset"
] |
27b4bbfb803537fb4009d4b9908030b47b92b490 |
## Python Copilot Audio Training using Class with Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each class method has a question and an answer mp3, where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet **dbytes** column, alongside the associated source code **file_path** identifier.
- Rows: 211020
- Size: 95.3 GB
- Data type: mp3
- Format: narrated alpaca question and answers using two voices
### Schema
```
{
"audio_path": "string",
"audio_type": "string",
"dbytes": "string",
"dbytes_len": "int64",
"file_path": "string",
"file_path_len": "int64",
"lang": "string",
"lang_len": "int64",
"recsize": "int64"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-audio-copilot-training-using-class-knowledge-graphs", data_dir="files")
```
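A minimal sketch of writing one narrated mp3 to disk follows; note that the schema above types `dbytes` as a string, so the payload may be base64-encoded rather than raw bytes, and the decode step below is an assumption.

```python
import base64

from datasets import load_dataset

ds = load_dataset(
    "matlok/python-audio-copilot-training-using-class-knowledge-graphs",
    data_dir="files",
    streaming=True,  # avoid downloading ~95 GB up front
)

row = next(iter(ds["train"]))  # assuming the default split is "train"
payload = row["dbytes"]

# The schema types dbytes as a string, so the mp3 may be base64-encoded;
# fall back to treating it as raw bytes if it is already binary.
audio_bytes = base64.b64decode(payload) if isinstance(payload, str) else payload

with open("question_and_answer.mp3", "wb") as f:
    f.write(audio_bytes)
```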
| matlok/python-audio-copilot-training-using-class-knowledge-graphs | [
"task_categories:text-to-audio",
"task_categories:audio-to-audio",
"task_categories:question-answering",
"task_ids:parsing",
"size_categories:100K<n<1M",
"license:other",
"python-copilot",
"python-coding",
"python-architecture",
"knowledge-graphs",
"multimodal",
"text-image-audio",
"fine-tuning",
"training",
"question-answering",
"image-knowledge-graph",
"alpaca",
"mp3",
"png",
"text",
"instruct",
"class",
"classes",
"region:us"
] | 2024-01-20T02:15:55+00:00 | {"license": ["other"], "size_categories": ["100K<n<1M"], "task_categories": ["text-to-audio", "audio-to-audio", "question-answering"], "task_ids": ["parsing"], "pretty_name": "python copilot audio training using class with knowledge graphs", "dataset_info": [{"config_name": "view_schema", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "view_schema", "data_files": [{"split": "view_schema", "path": "files/lok-python-copilot-audio.class-v1_00000717.parquet"}]}], "tags": ["python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "class", "classes"]} | 2024-01-25T18:56:02+00:00 | [] | [] | TAGS
#task_categories-text-to-audio #task_categories-audio-to-audio #task_categories-question-answering #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #class #classes #region-us
|
## Python Copilot Audio Training using Class with Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.
### Details
Each class method has a question and an answer mp3, where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet dbytes column, alongside the associated source code file_path identifier.
- Rows: 211020
- Size: 95.3 GB
- Data type: mp3
- Format: narrated alpaca question and answers using two voices
### Schema
### How to use the dataset
| [
"## Python Copilot Audio Training using Class with Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach class method has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet dbytes column and the associated source code file_path identifier.\n\n- Rows: 211020\n- Size: 95.3 GB\n- Data type: mp3\n- Format: narrated alpaca question and answers using two voices",
"### Schema",
"### How to use the dataset"
] | [
"TAGS\n#task_categories-text-to-audio #task_categories-audio-to-audio #task_categories-question-answering #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #class #classes #region-us \n",
"## Python Copilot Audio Training using Class with Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach class method has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet dbytes column and the associated source code file_path identifier.\n\n- Rows: 211020\n- Size: 95.3 GB\n- Data type: mp3\n- Format: narrated alpaca question and answers using two voices",
"### Schema",
"### How to use the dataset"
] |
8600295bd691e036f1a7c16c10290b33ca426ad8 |
# Dataset Card for Evaluation run of KnutJaegersberg/Qwen-1_8B-Chat-llama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Qwen-1_8B-Chat-llama](https://huggingface.co/KnutJaegersberg/Qwen-1_8B-Chat-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8B-Chat-llama",
"harness_winogrande_5",
split="train")
```
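The aggregated metrics mentioned above live in the "results" configuration; a minimal sketch, assuming it exposes the same "latest" split naming as the per-task configs:

```python
from datasets import load_dataset

results = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8B-Chat-llama",
	"results",
	split="latest")
```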
## Latest results
These are the [latest results from run 2024-01-20T02:47:07.832828](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8B-Chat-llama/blob/main/results_2024-01-20T02-47-07.832828.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4417307007712396,
"acc_stderr": 0.03457643291788475,
"acc_norm": 0.4458531507999814,
"acc_norm_stderr": 0.03533462860998811,
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219371,
"mc2": 0.436959909496514,
"mc2_stderr": 0.01509621411098862
},
"harness|arc:challenge|25": {
"acc": 0.34215017064846415,
"acc_stderr": 0.01386415215917728,
"acc_norm": 0.36945392491467577,
"acc_norm_stderr": 0.014104578366491899
},
"harness|hellaswag|10": {
"acc": 0.42959569806811393,
"acc_stderr": 0.004940067402031043,
"acc_norm": 0.5434176458872735,
"acc_norm_stderr": 0.004970933420231931
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353229,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353229
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.04068590050224971,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.04068590050224971
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.45660377358490567,
"acc_stderr": 0.030656748696739435,
"acc_norm": 0.45660377358490567,
"acc_norm_stderr": 0.030656748696739435
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835363,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835363
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.039325376803928704,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.039325376803928704
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.028438677998909558,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.028438677998909558
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998574,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.038154943086889305,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.038154943086889305
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03521224908841586,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03521224908841586
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.538860103626943,
"acc_stderr": 0.035975244117345775,
"acc_norm": 0.538860103626943,
"acc_norm_stderr": 0.035975244117345775
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36923076923076925,
"acc_stderr": 0.024468615241478905,
"acc_norm": 0.36923076923076925,
"acc_norm_stderr": 0.024468615241478905
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871927,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478466,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478466
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5357798165137615,
"acc_stderr": 0.021382364775701893,
"acc_norm": 0.5357798165137615,
"acc_norm_stderr": 0.021382364775701893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.031280390843298825,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.031280390843298825
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03507793834791324,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03507793834791324
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6118143459915611,
"acc_stderr": 0.031722950043323296,
"acc_norm": 0.6118143459915611,
"acc_norm_stderr": 0.031722950043323296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.515695067264574,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.515695067264574,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.045454545454545484,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.045454545454545484
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437056,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437056
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4785276073619632,
"acc_stderr": 0.0392474687675113,
"acc_norm": 0.4785276073619632,
"acc_norm_stderr": 0.0392474687675113
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977238,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977238
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.029343114798094472,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.029343114798094472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5325670498084292,
"acc_stderr": 0.017841995750520874,
"acc_norm": 0.5325670498084292,
"acc_norm_stderr": 0.017841995750520874
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.026915047355369818,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.026915047355369818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4662379421221865,
"acc_stderr": 0.028333277109562793,
"acc_norm": 0.4662379421221865,
"acc_norm_stderr": 0.028333277109562793
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4104938271604938,
"acc_stderr": 0.027371350925124768,
"acc_norm": 0.4104938271604938,
"acc_norm_stderr": 0.027371350925124768
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.028406627809590947,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.028406627809590947
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3474576271186441,
"acc_stderr": 0.012161417729749798,
"acc_norm": 0.3474576271186441,
"acc_norm_stderr": 0.012161417729749798
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.02972215209928007,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.02972215209928007
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.019910377463105932,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.019910377463105932
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4816326530612245,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.4816326530612245,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.572139303482587,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.572139303482587,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5029239766081871,
"acc_stderr": 0.03834759370936839,
"acc_norm": 0.5029239766081871,
"acc_norm_stderr": 0.03834759370936839
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219371,
"mc2": 0.436959909496514,
"mc2_stderr": 0.01509621411098862
},
"harness|winogrande|5": {
"acc": 0.5887924230465666,
"acc_stderr": 0.013829128358676878
},
"harness|gsm8k|5": {
"acc": 0.19257012888551933,
"acc_stderr": 0.010861483868509925
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8B-Chat-llama | [
"region:us"
] | 2024-01-20T02:49:15+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/Qwen-1_8B-Chat-llama", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/Qwen-1_8B-Chat-llama](https://huggingface.co/KnutJaegersberg/Qwen-1_8B-Chat-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8B-Chat-llama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T02:47:07.832828](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8B-Chat-llama/blob/main/results_2024-01-20T02-47-07.832828.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4417307007712396,\n \"acc_stderr\": 0.03457643291788475,\n \"acc_norm\": 0.4458531507999814,\n \"acc_norm_stderr\": 0.03533462860998811,\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219371,\n \"mc2\": 0.436959909496514,\n \"mc2_stderr\": 0.01509621411098862\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.34215017064846415,\n \"acc_stderr\": 0.01386415215917728,\n \"acc_norm\": 0.36945392491467577,\n \"acc_norm_stderr\": 0.014104578366491899\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.42959569806811393,\n \"acc_stderr\": 0.004940067402031043,\n \"acc_norm\": 0.5434176458872735,\n \"acc_norm_stderr\": 0.004970933420231931\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.04244633238353229,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.04244633238353229\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.04068590050224971,\n \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.04068590050224971\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.45660377358490567,\n \"acc_stderr\": 0.030656748696739435,\n \"acc_norm\": 0.45660377358490567,\n \"acc_norm_stderr\": 0.030656748696739435\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.3680555555555556,\n \"acc_norm_stderr\": 0.04032999053960719\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835363,\n \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835363\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.039325376803928704,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.039325376803928704\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n \"acc_stderr\": 0.028438677998909558,\n \"acc_norm\": 0.49032258064516127,\n \"acc_norm_stderr\": 0.028438677998909558\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998574,\n \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.038154943086889305,\n \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.038154943086889305\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.03521224908841586,\n \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03521224908841586\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.538860103626943,\n \"acc_stderr\": 0.035975244117345775,\n \"acc_norm\": 0.538860103626943,\n 
\"acc_norm_stderr\": 0.035975244117345775\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.36923076923076925,\n \"acc_stderr\": 0.024468615241478905,\n \"acc_norm\": 0.36923076923076925,\n \"acc_norm_stderr\": 0.024468615241478905\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871927,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871927\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478466,\n \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478466\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5357798165137615,\n \"acc_stderr\": 0.021382364775701893,\n \"acc_norm\": 0.5357798165137615,\n \"acc_norm_stderr\": 0.021382364775701893\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298825,\n \"acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298825\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03507793834791324,\n \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03507793834791324\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6118143459915611,\n \"acc_stderr\": 0.031722950043323296,\n \"acc_norm\": 0.6118143459915611,\n \"acc_norm_stderr\": 0.031722950043323296\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.515695067264574,\n \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.515695067264574,\n \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.04374928560599738,\n \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.04374928560599738\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.045454545454545484,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.045454545454545484\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.04832853553437056,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.04832853553437056\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.0392474687675113,\n \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.0392474687675113\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977238,\n \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977238\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.029343114798094472,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.029343114798094472\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5325670498084292,\n \"acc_stderr\": 0.017841995750520874,\n \"acc_norm\": 0.5325670498084292,\n \"acc_norm_stderr\": 0.017841995750520874\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.026915047355369818,\n \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.026915047355369818\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4662379421221865,\n \"acc_stderr\": 0.028333277109562793,\n \"acc_norm\": 0.4662379421221865,\n \"acc_norm_stderr\": 0.028333277109562793\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4104938271604938,\n \"acc_stderr\": 0.027371350925124768,\n \"acc_norm\": 0.4104938271604938,\n \"acc_norm_stderr\": 0.027371350925124768\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3475177304964539,\n \"acc_stderr\": 0.028406627809590947,\n \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.028406627809590947\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3474576271186441,\n \"acc_stderr\": 0.012161417729749798,\n \"acc_norm\": 0.3474576271186441,\n \"acc_norm_stderr\": 0.012161417729749798\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.02972215209928007,\n \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.02972215209928007\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.019910377463105932,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.019910377463105932\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4816326530612245,\n \"acc_stderr\": 0.031987615467631264,\n \"acc_norm\": 0.4816326530612245,\n \"acc_norm_stderr\": 0.031987615467631264\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.572139303482587,\n \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.572139303482587,\n \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5029239766081871,\n \"acc_stderr\": 0.03834759370936839,\n \"acc_norm\": 0.5029239766081871,\n \"acc_norm_stderr\": 0.03834759370936839\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219371,\n \"mc2\": 0.436959909496514,\n \"mc2_stderr\": 0.01509621411098862\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5887924230465666,\n \"acc_stderr\": 0.013829128358676878\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19257012888551933,\n 
\"acc_stderr\": 0.010861483868509925\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/Qwen-1_8B-Chat-llama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|arc:challenge|25_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|gsm8k|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hellaswag|10_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-47-07.832828.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-47-07.832828.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-47-07.832828.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T02-47-07.832828.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-47-07.832828.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T02_47_07.832828", "path": ["**/details_harness|winogrande|5_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T02-47-07.832828.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_20T02_47_07.832828", "path": ["results_2024-01-20T02-47-07.832828.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T02-47-07.832828.parquet"]}]}]} | 2024-01-20T02:49:35+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KnutJaegersberg/Qwen-1_8B-Chat-llama
Dataset automatically created during the evaluation run of model KnutJaegersberg/Qwen-1_8B-Chat-llama on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
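A minimal sketch, assuming the dataset id follows the leaderboard's standard `details_<org>__<model>` naming for this model:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8B-Chat-llama",
	"harness_winogrande_5",
	split="train")
```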
## Latest results
These are the latest results from run 2024-01-20T02:47:07.832828 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of KnutJaegersberg/Qwen-1_8B-Chat-llama\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Qwen-1_8B-Chat-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T02:47:07.832828(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KnutJaegersberg/Qwen-1_8B-Chat-llama\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Qwen-1_8B-Chat-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T02:47:07.832828(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
13c86312fb9fd8db3c6701c7f79442599c587d66 |
# Dataset Card for Evaluation run of RatanRohith/NeuralMathChat-7B-V0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RatanRohith/NeuralMathChat-7B-V0.2](https://huggingface.co/RatanRohith/NeuralMathChat-7B-V0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2",
"harness_winogrande_5",
split="train")
```
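The same repository also exposes one configuration per task plus the aggregated "results" configuration, and every run is additionally available under its timestamped split. A short sketch of exploring these, assuming only the `datasets` library:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2"

# List every per-task configuration plus the aggregated "results" config.
print(get_dataset_config_names(repo))

# "latest" always mirrors the most recent timestamped split.
results = load_dataset(repo, "results", split="latest")
```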
## Latest results
These are the [latest results from run 2024-01-20T02:57:06.316295](https://huggingface.co/datasets/open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2/blob/main/results_2024-01-20T02-57-06.316295.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6545807950439831,
"acc_stderr": 0.03196607532013957,
"acc_norm": 0.6547967745208924,
"acc_norm_stderr": 0.032622292763018784,
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863124,
"mc2": 0.5908768103853224,
"mc2_stderr": 0.015352173627750884
},
"harness|arc:challenge|25": {
"acc": 0.6493174061433447,
"acc_stderr": 0.013944635930726092,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.01369743246669325
},
"harness|hellaswag|10": {
"acc": 0.6734714200358495,
"acc_stderr": 0.004679847503411347,
"acc_norm": 0.8577972515435173,
"acc_norm_stderr": 0.0034854418127129535
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754406,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055256,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055256
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646507,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4100558659217877,
"acc_stderr": 0.01644970820902608,
"acc_norm": 0.4100558659217877,
"acc_norm_stderr": 0.01644970820902608
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729477,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729477
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959617,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959617
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291467,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291467
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.0127397115540457,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.0127397115540457
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.01887568293806945,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.01887568293806945
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.0287951855742913,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.0287951855742913
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090083,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090083
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863124,
"mc2": 0.5908768103853224,
"mc2_stderr": 0.015352173627750884
},
"harness|winogrande|5": {
"acc": 0.8026835043409629,
"acc_stderr": 0.011185026389050374
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624184
}
}
```
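To work with these aggregated numbers programmatically rather than reading the JSON above, the "results" configuration can be flattened into a DataFrame; a minimal sketch (the exact column layout of the underlying parquet file is an assumption and may differ):

```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2",
    "results",
    split="latest",
)
df = results.to_pandas()  # one row per run, with the metric blobs as columns
print(df.head())
```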
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2 | [
"region:us"
] | 2024-01-20T02:59:29+00:00 | {"pretty_name": "Evaluation run of RatanRohith/NeuralMathChat-7B-V0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [RatanRohith/NeuralMathChat-7B-V0.2](https://huggingface.co/RatanRohith/NeuralMathChat-7B-V0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T02:57:06.316295](https://huggingface.co/datasets/open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2/blob/main/results_2024-01-20T02-57-06.316295.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6545807950439831,\n \"acc_stderr\": 0.03196607532013957,\n \"acc_norm\": 0.6547967745208924,\n \"acc_norm_stderr\": 0.032622292763018784,\n \"mc1\": 0.4357405140758874,\n \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.5908768103853224,\n \"mc2_stderr\": 0.015352173627750884\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6493174061433447,\n \"acc_stderr\": 0.013944635930726092,\n \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.01369743246669325\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6734714200358495,\n \"acc_stderr\": 0.004679847503411347,\n \"acc_norm\": 0.8577972515435173,\n \"acc_norm_stderr\": 0.0034854418127129535\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754406,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754406\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055256,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055256\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n 
\"acc_norm_stderr\": 0.022935144053919436\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4100558659217877,\n \"acc_stderr\": 0.01644970820902608,\n \"acc_norm\": 0.4100558659217877,\n \"acc_norm_stderr\": 0.01644970820902608\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729477,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729477\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959617,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959617\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291467,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291467\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.0127397115540457,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.0127397115540457\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.01887568293806945,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.01887568293806945\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.0287951855742913,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.0287951855742913\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n \"acc_stderr\": 0.02411267824090083,\n \"acc_norm\": 0.8656716417910447,\n \"acc_norm_stderr\": 0.02411267824090083\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4357405140758874,\n \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.5908768103853224,\n \"mc2_stderr\": 0.015352173627750884\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8026835043409629,\n \"acc_stderr\": 0.011185026389050374\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.7028051554207733,\n \"acc_stderr\": 0.012588685966624184\n }\n}\n```", "repo_url": "https://huggingface.co/RatanRohith/NeuralMathChat-7B-V0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|arc:challenge|25_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|gsm8k|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hellaswag|10_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-57-06.316295.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-57-06.316295.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-57-06.316295.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T02-57-06.316295.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-57-06.316295.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["**/details_harness|winogrande|5_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-20T02-57-06.316295.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T02_57_06.316295", "path": ["results_2024-01-20T02-57-06.316295.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T02-57-06.316295.parquet"]}]}]} | 2024-01-20T02:59:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of RatanRohith/NeuralMathChat-7B-V0.2
Dataset automatically created during the evaluation run of model RatanRohith/NeuralMathChat-7B-V0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
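A minimal sketch of that loader (the repo id is taken from this card's metadata):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2",
	"harness_winogrande_5",
	split="train")
```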
## Latest results
These are the latest results from run 2024-01-20T02:57:06.316295 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
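The results table itself is not reproduced in this rendering; the raw JSON for the run can be fetched directly. A minimal sketch using `huggingface_hub`, with the filename taken from this card's metadata:

```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated-results file for the 2024-01-20T02:57:06.316295 run
# (the filename comes from this card's metadata, not from the text above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2",
    filename="results_2024-01-20T02-57-06.316295.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
```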
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of RatanRohith/NeuralMathChat-7B-V0.2\n\n\n\nDataset automatically created during the evaluation run of model RatanRohith/NeuralMathChat-7B-V0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T02:57:06.316295(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of RatanRohith/NeuralMathChat-7B-V0.2\n\n\n\nDataset automatically created during the evaluation run of model RatanRohith/NeuralMathChat-7B-V0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T02:57:06.316295(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
eeb997009ae40ca43dbaa93a37491cc2b65e7373 |
# Dataset Card for Evaluation run of 222gate/BrurryDog-7b-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [222gate/BrurryDog-7b-v0.1](https://huggingface.co/222gate/BrurryDog-7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_222gate__BrurryDog-7b-v0.1",
"harness_winogrande_5",
split="train")
```
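Beyond a single task, you can enumerate the available configurations and pull the aggregated metrics from the "results" configuration, whose "latest" split always points at the most recent run. A minimal sketch; the config and split names come from this dataset's metadata:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_222gate__BrurryDog-7b-v0.1"

# The 63 per-task configurations plus the aggregated "results" one.
configs = get_dataset_config_names(repo)

# Aggregated metrics for the most recent evaluation run.
results = load_dataset(repo, "results", split="latest")
```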
## Latest results
These are the [latest results from run 2024-01-20T03:26:36.549937](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__BrurryDog-7b-v0.1/blob/main/results_2024-01-20T03-26-36.549937.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6527842705257364,
"acc_stderr": 0.03215067286738653,
"acc_norm": 0.652698073988589,
"acc_norm_stderr": 0.03281396676337054,
"mc1": 0.5777233782129743,
"mc1_stderr": 0.017290733254248177,
"mc2": 0.7004617737856811,
"mc2_stderr": 0.01511981164818303
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.01338502163731357,
"acc_norm": 0.7252559726962458,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.7216689902409879,
"acc_stderr": 0.004472613148508909,
"acc_norm": 0.8836885082652858,
"acc_norm_stderr": 0.0031994286759858682
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033477,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033477
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.0127397115540457,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.0127397115540457
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487046,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487046
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.02752963744017493,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.02752963744017493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5777233782129743,
"mc1_stderr": 0.017290733254248177,
"mc2": 0.7004617737856811,
"mc2_stderr": 0.01511981164818303
},
"harness|winogrande|5": {
"acc": 0.8287292817679558,
"acc_stderr": 0.010588417294962524
},
"harness|gsm8k|5": {
"acc": 0.66868840030326,
"acc_stderr": 0.012964999679688664
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-20T03:28:56+00:00 | {"pretty_name": "Evaluation run of 222gate/BrurryDog-7b-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [222gate/BrurryDog-7b-v0.1](https://huggingface.co/222gate/BrurryDog-7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_222gate__BrurryDog-7b-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T03:26:36.549937](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__BrurryDog-7b-v0.1/blob/main/results_2024-01-20T03-26-36.549937.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527842705257364,\n \"acc_stderr\": 0.03215067286738653,\n \"acc_norm\": 0.652698073988589,\n \"acc_norm_stderr\": 0.03281396676337054,\n \"mc1\": 0.5777233782129743,\n \"mc1_stderr\": 0.017290733254248177,\n \"mc2\": 0.7004617737856811,\n \"mc2_stderr\": 0.01511981164818303\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.01338502163731357,\n \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7216689902409879,\n \"acc_stderr\": 0.004472613148508909,\n \"acc_norm\": 0.8836885082652858,\n \"acc_norm_stderr\": 0.0031994286759858682\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033477,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033477\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 0.013306478243066302,\n 
\"acc_norm\": 0.8339719029374202,\n \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.0127397115540457,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.0127397115540457\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487046,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487046\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017493,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5777233782129743,\n \"mc1_stderr\": 0.017290733254248177,\n \"mc2\": 0.7004617737856811,\n \"mc2_stderr\": 0.01511981164818303\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8287292817679558,\n \"acc_stderr\": 0.010588417294962524\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.66868840030326,\n \"acc_stderr\": 0.012964999679688664\n }\n}\n```", "repo_url": "https://huggingface.co/222gate/BrurryDog-7b-v0.1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|arc:challenge|25_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|gsm8k|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hellaswag|10_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T03-26-36.549937.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T03-26-36.549937.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T03-26-36.549937.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T03-26-36.549937.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T03-26-36.549937.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T03-26-36.549937.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["**/details_harness|winogrande|5_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T03-26-36.549937.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T03_26_36.549937", "path": ["results_2024-01-20T03-26-36.549937.parquet"]}, {"split": "latest", "path": 
["results_2024-01-20T03-26-36.549937.parquet"]}]}]} | 2024-01-20T03:29:17+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of 222gate/BrurryDog-7b-v0.1
Dataset automatically created during the evaluation run of model 222gate/BrurryDog-7b-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
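A minimal sketch, following the usual pattern for these leaderboard detail datasets; the repository id is taken from this card's metadata, and "harness_winogrande_5" can be swapped for any of the 63 configurations:

```python
from datasets import load_dataset

# Load the WinoGrande details split for the latest evaluation run.
data = load_dataset("open-llm-leaderboard/details_222gate__BrurryDog-7b-v0.1",
	"harness_winogrande_5",
	split="train")
```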
## Latest results
These are the latest results from run 2024-01-20T03:26:36.549937 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
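Only the aggregate and headline task blocks from that run are reproduced below; the values are recovered verbatim from this card's row metadata, and the 57 per-task hendrycksTest entries (omitted here) follow the same schema as in the card above:

```python
{
    "all": {
        "acc": 0.6527842705257364,
        "acc_stderr": 0.03215067286738653,
        "acc_norm": 0.652698073988589,
        "acc_norm_stderr": 0.03281396676337054,
        "mc1": 0.5777233782129743,
        "mc1_stderr": 0.017290733254248177,
        "mc2": 0.7004617737856811,
        "mc2_stderr": 0.01511981164818303
    },
    "harness|arc:challenge|25": {
        "acc": 0.7005119453924915,
        "acc_stderr": 0.01338502163731357,
        "acc_norm": 0.7252559726962458,
        "acc_norm_stderr": 0.013044617212771227
    },
    "harness|hellaswag|10": {
        "acc": 0.7216689902409879,
        "acc_stderr": 0.004472613148508909,
        "acc_norm": 0.8836885082652858,
        "acc_norm_stderr": 0.0031994286759858682
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.5777233782129743,
        "mc1_stderr": 0.017290733254248177,
        "mc2": 0.7004617737856811,
        "mc2_stderr": 0.01511981164818303
    },
    "harness|winogrande|5": {
        "acc": 0.8287292817679558,
        "acc_stderr": 0.010588417294962524
    },
    "harness|gsm8k|5": {
        "acc": 0.66868840030326,
        "acc_stderr": 0.012964999679688664
    }
}
```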
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of 222gate/BrurryDog-7b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model 222gate/BrurryDog-7b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T03:26:36.549937(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of 222gate/BrurryDog-7b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model 222gate/BrurryDog-7b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T03:26:36.549937(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
43bb8339b6a0d74698b71db9526e2f1403179136 | # Dataset Card for "chatbot-arena-ja-calm2-7b-chat"
## Chatbot Arena Conversations JA (calm2) Dataset
Chatbot Arena Conversations JA (calm2) is a Japanese instruction dataset for RLHF.
It was created to test whether a dataset published in English can be repurposed for Japanese using only open-source tools and models, and still be useful for training Japanese LLMs.
The prompts are Japanese translations of the user inputs (CC-BY 4.0) from [lmsys/chatbot_arena_conversations](https://huggingface.co/datasets/lmsys/chatbot_arena_conversations). These are instructions written by humans through [Chatbot Arena](https://chat.lmsys.org/) and released under CC-BY 4.0. For multi-turn conversations, only the first user input is used (so every example in this dataset is a single-turn conversation).
The translations were produced with [Facebook's translation model](https://huggingface.co/facebook/wmt21-dense-24-wide-en-x) (MIT License).
The responses (chosen, rejected) are outputs of [calm2-7b-chat](https://huggingface.co/cyberagent/calm2-7b-chat) (Apache 2.0) for the prompts above. The model outputs included in [lmsys/chatbot_arena_conversations](https://huggingface.co/datasets/lmsys/chatbot_arena_conversations) are not used, so this dataset contains no outputs from GPT-4 or similar models.
Preferences were assigned with the reward model [OASST](https://huggingface.co/OpenAssistant/reward-model-deberta-v3-large-v2) (MIT License): the response with the higher reward is labeled chosen. Inputs were passed to OASST in Japanese as-is.
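As an illustration of this labeling step, the sketch below shows how a chosen/rejected pair could be derived with the OASST reward model. The prompt and responses are hypothetical placeholders, and this is not the exact script used to build the dataset:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

reward_name = "OpenAssistant/reward-model-deberta-v3-large-v2"
reward_model = AutoModelForSequenceClassification.from_pretrained(reward_name)
tokenizer = AutoTokenizer.from_pretrained(reward_name)

def reward(prompt: str, response: str) -> float:
    # The model scores a (question, answer) pair; a higher logit means
    # the response is preferred.
    inputs = tokenizer(prompt, response, return_tensors="pt")
    return reward_model(**inputs).logits[0].item()

# Hypothetical pair of calm2-7b-chat samples for the same Japanese prompt.
prompt = "日本の首都はどこですか?"
response_a = "日本の首都は東京です。"
response_b = "わかりません。"
chosen, rejected = (
    (response_a, response_b)
    if reward(prompt, response_a) >= reward(prompt, response_b)
    else (response_b, response_a)
)
```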
## Usage
```python
import datasets

# Placeholder: your Hugging Face access token with read permission.
HF_READ_TOKEN = "hf_..."
dataset = datasets.load_dataset("cyberagent/chatbot-arena-ja-calm2-7b-chat-experimental", use_auth_token=HF_READ_TOKEN)
```
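The token is needed because the dataset is gated behind the disclaimers and terms below: accept them on the dataset page first, then pass a read-scoped access token.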
## Why did we build this dataset?
Both now and in the future, English datasets will likely remain superior to Japanese ones in both quantity and quality.
To build and evaluate LLMs as fluent in Japanese as in English, it would be ideal to have Japanese datasets on par with the English ones.
One way to get there is to secure methods for repurposing not only Japanese datasets but also English datasets and models for Japanese training and evaluation.
One way to build a Japanese dataset from an English one is to machine-translate both the prompts and the responses into Japanese.
That approach has the advantage of reusing the responses of strong English LLMs.
Its drawback, however, is that the responses end up as translated Japanese ([Translationese](https://arxiv.org/abs/2004.06063)). Japanese LLMs rarely produce Translationese-like responses, so the data distribution diverges from the LLM's output distribution.
This dataset was built under the hypothesis that RLHF training remains effective even when the prompts are Translationese, as long as the responses are natural Japanese.
Moreover, if only the prompts are translated, the translation quality does not have to be especially high: even when the translated Japanese prompt deviates from the original English one, translation quality is not a major problem as long as the translated prompt and the response still correspond in meaning.
Looking at the user prompts in Chatbot Arena, they are not always well-formed, and many are not even complete sentences. If so, training a model to respond properly even to low-quality prompts may also matter for users.
## Experimental results
Using this dataset, we applied [Direct Preference Optimization (DPO)](https://arxiv.org/abs/2305.18290) to [calm2-7b-chat](https://huggingface.co/cyberagent/calm2-7b-chat) to create [calm2-7b-chat-dpo](https://huggingface.co/ddyuudd/calm2-7b-chat-dpo-experimental).
Automatic evaluation with GPT-4 on [ELYZA-tasks-100](https://huggingface.co/datasets/elyza/ELYZA-tasks-100) and [Japanese MT-Bench](https://github.com/Stability-AI/FastChat/tree/jp-stable/fastchat/llm_judge/data/japanese_mt_bench), two benchmarks for instruction tuning, showed that calm2-7b-chat-dpo scores higher than calm2-7b-chat on both.
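As a rough sketch (not the authors' actual training script), such a run could look like the following with TRL's `DPOTrainer`, which consumes exactly the `prompt`/`chosen`/`rejected` columns of this dataset; all hyperparameters below are illustrative assumptions:
```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

model_name = "cyberagent/calm2-7b-chat"
model = AutoModelForCausalLM.from_pretrained(model_name)
ref_model = AutoModelForCausalLM.from_pretrained(model_name)  # frozen reference policy
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Assumes the gating terms were accepted and an HF token is configured.
train_dataset = load_dataset(
    "cyberagent/chatbot-arena-ja-calm2-7b-chat-experimental", split="train"
)

trainer = DPOTrainer(
    model,
    ref_model,
    args=TrainingArguments(output_dir="calm2-7b-chat-dpo", per_device_train_batch_size=1),
    beta=0.1,                     # illustrative KL-penalty strength
    train_dataset=train_dataset,  # expects prompt / chosen / rejected columns
    tokenizer=tokenizer,
    max_length=1024,
    max_prompt_length=512,
)
trainer.train()
```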
### ELYZA-tasks-100 (GPT-4 eval)
| calm2-7b-chat | calm2-7b-chat-dpo |
| ---- | ---- |
| 2.67 | 2.85 |
### Japanese MT-Bench
| | calm2-7b-chat | calm2-7b-chat-dpo |
| ---- | ---- | ---- |
| MEAN | 6.1 | 6.7 |
| extraction | 4.1 | 5.4 |
| humanities | 8.2 | 8.4 |
| reasoning | 3.9 | 4.3 |
| roleplay | 6.4 | 7.0 |
| stem | 6.3 | 6.2 |
| writing | 7.7 | 9.1 |
## Disclaimers and Terms
- This dataset contains conversations that may be considered unsafe,
offensive, or upsetting. It is not intended for training dialogue agents
without applying appropriate filtering measures. We are not responsible for
any outputs of the models trained on this dataset.
- Statements or opinions made in this dataset do not reflect the views of
researchers or institutions involved in the data collection effort.
- Users of this data are responsible for ensuring its appropriate use, which
includes abiding by any applicable laws and regulations.
- Users of this data should adhere to the terms of use for a specific model
when using its direct outputs.
- Users of this data agree to not attempt to determine the identity of
individuals in this dataset.
- This dataset has not been curated; duplicated inputs and outputs are included.
## Releases
1.0: v1 release (Jan 24, 2024)
## Author
Yuu Jinnai ([email protected]), Standing on the shoulders of giants
| cyberagent/chatbot-arena-ja-calm2-7b-chat-experimental | [
"language:ja",
"license:cc-by-4.0",
"arxiv:2004.06063",
"arxiv:2305.18290",
"region:us"
] | 2024-01-20T04:07:21+00:00 | {"language": ["ja"], "license": "cc-by-4.0", "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 54325011, "num_examples": 29204}], "download_size": 24880061, "dataset_size": 54325011}, "extra_gated_prompt": "Disclaimers and Terms\n- This dataset contains conversations that may be considered unsafe, offensive, or upsetting. It is not intended for training dialogue agents without applying appropriate filtering measures. We are not responsible for any outputs of the models trained on this dataset.\n- Statements or opinions made in this dataset do not reflect the views of researchers or institutions involved in the data collection effort.\n- Users of this data are responsible for ensuring its appropriate use, which includes abiding by any applicable laws and regulations.\n- Users of this data should adhere to the terms of use for a specific model when using its direct outputs.\n- Users of this data agree to not attempt to determine the identity of individuals in this dataset."} | 2024-01-24T08:15:42+00:00 | [
"2004.06063",
"2305.18290"
] | [
"ja"
] | TAGS
#language-Japanese #license-cc-by-4.0 #arxiv-2004.06063 #arxiv-2305.18290 #region-us
| Dataset Card for "chatbot-arena-ja-calm2-7b-chat"
=================================================
Chatbot Arena Conversations JA (calm2) Dataset
----------------------------------------------
Chatbot Arena Conversations JA (calm2) is a Japanese instruction dataset for RLHF.
It was created to test whether a dataset published in English can be repurposed for Japanese using only open-source tools and models, and still be useful for training Japanese LLMs.
The prompts are Japanese translations of the user inputs (CC-BY 4.0) from lmsys/chatbot\_arena\_conversations. These are instructions written by humans through Chatbot Arena and released under CC-BY 4.0. For multi-turn conversations, only the first user input is used (so every example in this dataset is a single-turn conversation).
The translations were produced with Facebook's translation model (MIT License).
The responses (chosen, rejected) are outputs of calm2-7b-chat (Apache 2.0) for the prompts above. The model outputs included in lmsys/chatbot\_arena\_conversations are not used, so this dataset contains no outputs from GPT-4 or similar models.
Preferences were assigned with the reward model OASST (MIT License): the response with the higher reward is labeled chosen. Inputs were passed to OASST in Japanese as-is.
Usage
-----
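A minimal loading sketch (assuming you have accepted the gating terms and hold a read-scoped token):
```python
import datasets

dataset = datasets.load_dataset(
    "cyberagent/chatbot-arena-ja-calm2-7b-chat-experimental",
    use_auth_token="hf_...",  # placeholder read token
)
```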
Why did we build this dataset?
------------------
Both now and in the future, English datasets will likely remain superior to Japanese ones in both quantity and quality.
To build and evaluate LLMs as fluent in Japanese as in English, it would be ideal to have Japanese datasets on par with the English ones.
One way to get there is to secure methods for repurposing not only Japanese datasets but also English datasets and models for Japanese training and evaluation.
One way to build a Japanese dataset from an English one is to machine-translate both the prompts and the responses into Japanese.
That approach has the advantage of reusing the responses of strong English LLMs.
Its drawback, however, is that the responses end up as translated Japanese (Translationese). Japanese LLMs rarely produce Translationese-like responses, so the data distribution diverges from the LLM's output distribution.
This dataset was built under the hypothesis that RLHF training remains effective even when the prompts are Translationese, as long as the responses are natural Japanese.
Moreover, if only the prompts are translated, the translation quality does not have to be especially high: even when the translated Japanese prompt deviates from the original English one, translation quality is not a major problem as long as the translated prompt and the response still correspond in meaning.
Looking at the user prompts in Chatbot Arena, they are not always well-formed, and many are not even complete sentences. If so, training a model to respond properly even to low-quality prompts may also matter for users.
Experimental results
----
Using this dataset, we applied Direct Preference Optimization (DPO) to calm2-7b-chat to create calm2-7b-chat-dpo.
Automatic evaluation with GPT-4 on ELYZA-tasks-100 and Japanese MT-Bench, two benchmarks for instruction tuning, showed that calm2-7b-chat-dpo scores higher than calm2-7b-chat on both.
### ELYZA-tasks-100 (GPT-4 eval)
calm2-7b-chat 2.67, calm2-7b-chat-dpo 2.85
### Japanese MT-Bench
MEAN: calm2-7b-chat 6.1, calm2-7b-chat-dpo 6.7
extraction: calm2-7b-chat 4.1, calm2-7b-chat-dpo 5.4
humanities: calm2-7b-chat 8.2, calm2-7b-chat-dpo 8.4
reasoning: calm2-7b-chat 3.9, calm2-7b-chat-dpo 4.3
roleplay: calm2-7b-chat 6.4, calm2-7b-chat-dpo 7.0
stem: calm2-7b-chat 6.3, calm2-7b-chat-dpo 6.2
writing: calm2-7b-chat 7.7, calm2-7b-chat-dpo 9.1
Disclaimers and Terms
---------------------
* This dataset contains conversations that may be considered unsafe,
offensive, or upsetting. It is not intended for training dialogue agents
without applying appropriate filtering measures. We are not responsible for
any outputs of the models trained on this dataset.
* Statements or opinions made in this dataset do not reflect the views of
researchers or institutions involved in the data collection effort.
* Users of this data are responsible for ensuring its appropriate use, which
includes abiding by any applicable laws and regulations.
* Users of this data should adhere to the terms of use for a specific model
when using its direct outputs.
* Users of this data agree to not attempt to determine the identity of
individuals in this dataset.
* This dataset has not been curated; duplicated inputs and outputs are included.
Releases
--------
1.0: v1 release (Jan 24, 2024)
Author
------
Yuu Jinnai (jinnai\_yu@URL), Standing on the shoulders of giants
| [
"### ELYZA-tasks-100 (GPT-4 eval)",
"### Japanese MT-Bench\n\n\ncalm2-7b-chat: MEAN, calm2-7b-chat-dpo: 6.1\ncalm2-7b-chat: extraction, calm2-7b-chat-dpo: 4.1\ncalm2-7b-chat: humanities, calm2-7b-chat-dpo: 8.2\ncalm2-7b-chat: reasoning, calm2-7b-chat-dpo: 3.9\ncalm2-7b-chat: roleplay, calm2-7b-chat-dpo: 6.4\ncalm2-7b-chat: stem, calm2-7b-chat-dpo: 6.3\ncalm2-7b-chat: writing, calm2-7b-chat-dpo: 7.7\n\n\nDisclaimers and Terms\n---------------------\n\n\n* This dataset contains conversations that may be considered unsafe,\noffensive, or upsetting. It is not intended for training dialogue agents\nwithout applying appropriate filtering measures. We are not responsible for\nany outputs of the models trained on this dataset.\n* Statements or opinions made in this dataset do not reflect the views of\nresearchers or institutions involved in the data collection effort.\n* Users of this data are responsible for ensuring its appropriate use, which\nincludes abiding by any applicable laws and regulations.\n* Users of this data should adhere to the terms of use for a specific model\nwhen using its direct outputs.\n* Users of this data agree to not attempt to determine the identity of\nindividuals in this dataset.\n* このデータセットはキュレーションを行っておりません。重複した入力や出力が含まれます。\n\n\nReleases\n--------\n\n\n1.0: v1 release (Jan 24, 2024)\n\n\nAuthor\n------\n\n\nYuu Jinnai (jinnai\\_yu@URL), Standing on the shoulders of giants"
] | [
"TAGS\n#language-Japanese #license-cc-by-4.0 #arxiv-2004.06063 #arxiv-2305.18290 #region-us \n",
"### ELYZA-tasks-100 (GPT-4 eval)",
"### Japanese MT-Bench\n\n\ncalm2-7b-chat: MEAN, calm2-7b-chat-dpo: 6.1\ncalm2-7b-chat: extraction, calm2-7b-chat-dpo: 4.1\ncalm2-7b-chat: humanities, calm2-7b-chat-dpo: 8.2\ncalm2-7b-chat: reasoning, calm2-7b-chat-dpo: 3.9\ncalm2-7b-chat: roleplay, calm2-7b-chat-dpo: 6.4\ncalm2-7b-chat: stem, calm2-7b-chat-dpo: 6.3\ncalm2-7b-chat: writing, calm2-7b-chat-dpo: 7.7\n\n\nDisclaimers and Terms\n---------------------\n\n\n* This dataset contains conversations that may be considered unsafe,\noffensive, or upsetting. It is not intended for training dialogue agents\nwithout applying appropriate filtering measures. We are not responsible for\nany outputs of the models trained on this dataset.\n* Statements or opinions made in this dataset do not reflect the views of\nresearchers or institutions involved in the data collection effort.\n* Users of this data are responsible for ensuring its appropriate use, which\nincludes abiding by any applicable laws and regulations.\n* Users of this data should adhere to the terms of use for a specific model\nwhen using its direct outputs.\n* Users of this data agree to not attempt to determine the identity of\nindividuals in this dataset.\n* このデータセットはキュレーションを行っておりません。重複した入力や出力が含まれます。\n\n\nReleases\n--------\n\n\n1.0: v1 release (Jan 24, 2024)\n\n\nAuthor\n------\n\n\nYuu Jinnai (jinnai\\_yu@URL), Standing on the shoulders of giants"
] |
c1d32a4c7d71e11ece2bbaf7fa28591a8efeda78 | # Summary
- wikipedia page : https://en.wikipedia.org/wiki/Category:Malaysian_politicians
- Number of Politicians : 110
- Null Images of Politicians : 16
- link to dataset : https://huggingface.co/datasets/Englios/Wikipedia-Malaysian-Politicians
- date of creation: 2024-01-20 | Englios/Wikipedia-Malaysian-Politicians | [
"language:en",
"region:us"
] | 2024-01-20T04:11:15+00:00 | {"language": ["en"]} | 2024-01-20T04:42:21+00:00 | [] | [
"en"
] | TAGS
#language-English #region-us
| # Summary
- wikipedia page : URL
- Number of Politicians : 110
- Null Images of Politicians : 16
- link to dataset : URL
- date of creation: 2024-01-20 | [
"# Summary\n- wikipedia page : URL\n- Number of Politicians : 110\n- Null Images of Politicians : 16\n\n- link to dataset : URL\n- date of creation: 2024-20-01"
] | [
"TAGS\n#language-English #region-us \n",
"# Summary\n- wikipedia page : URL\n- Number of Politicians : 110\n- Null Images of Politicians : 16\n\n- link to dataset : URL\n- date of creation: 2024-20-01"
] |
3894dfea66c945f3b3223e9a643c62c7834f12ac |
# Dataset Card for Evaluation run of bartowski/internlm2-chat-20b-llama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bartowski/internlm2-chat-20b-llama](https://huggingface.co/bartowski/internlm2-chat-20b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bartowski__internlm2-chat-20b-llama",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-20T04:32:43.004960](https://huggingface.co/datasets/open-llm-leaderboard/details_bartowski__internlm2-chat-20b-llama/blob/main/results_2024-01-20T04-32-43.004960.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6635549405277965,
"acc_stderr": 0.031718531357649564,
"acc_norm": 0.6709734088042791,
"acc_norm_stderr": 0.032358019231685,
"mc1": 0.3378212974296206,
"mc1_stderr": 0.016557167322516882,
"mc2": 0.4874293978250427,
"mc2_stderr": 0.014540260066183534
},
"harness|arc:challenge|25": {
"acc": 0.5998293515358362,
"acc_stderr": 0.014317197787809174,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068285
},
"harness|hellaswag|10": {
"acc": 0.6167098187612029,
"acc_stderr": 0.0048519441706712605,
"acc_norm": 0.8258315076677952,
"acc_norm_stderr": 0.0037847921724660635
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.034765901043041336,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.034765901043041336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6468085106382979,
"acc_stderr": 0.031245325202761926,
"acc_norm": 0.6468085106382979,
"acc_norm_stderr": 0.031245325202761926
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.04082482904638629,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04082482904638629
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5132275132275133,
"acc_stderr": 0.025742297289575142,
"acc_norm": 0.5132275132275133,
"acc_norm_stderr": 0.025742297289575142
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284325,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284325
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8383838383838383,
"acc_stderr": 0.02622591986362928,
"acc_norm": 0.8383838383838383,
"acc_norm_stderr": 0.02622591986362928
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.02423353229775873,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.02423353229775873
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.023507579020645358,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.023507579020645358
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7310924369747899,
"acc_stderr": 0.028801392193631276,
"acc_norm": 0.7310924369747899,
"acc_norm_stderr": 0.028801392193631276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.44370860927152317,
"acc_stderr": 0.04056527902281732,
"acc_norm": 0.44370860927152317,
"acc_norm_stderr": 0.04056527902281732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.015173141845126255,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.015173141845126255
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8676470588235294,
"acc_stderr": 0.023784297520918853,
"acc_norm": 0.8676470588235294,
"acc_norm_stderr": 0.023784297520918853
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8312236286919831,
"acc_stderr": 0.024381406832586234,
"acc_norm": 0.8312236286919831,
"acc_norm_stderr": 0.024381406832586234
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33743016759776534,
"acc_stderr": 0.015813901283913048,
"acc_norm": 0.33743016759776534,
"acc_norm_stderr": 0.015813901283913048
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7427652733118971,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.7427652733118971,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5032594524119948,
"acc_stderr": 0.012769964760343318,
"acc_norm": 0.5032594524119948,
"acc_norm_stderr": 0.012769964760343318
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528172,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528172
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101696,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101696
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3378212974296206,
"mc1_stderr": 0.016557167322516882,
"mc2": 0.4874293978250427,
"mc2_stderr": 0.014540260066183534
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597204
},
"harness|gsm8k|5": {
"acc": 0.33965125094768767,
"acc_stderr": 0.013045045067665257
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_bartowski__internlm2-chat-20b-llama | [
"region:us"
] | 2024-01-20T04:34:48+00:00 | {"pretty_name": "Evaluation run of bartowski/internlm2-chat-20b-llama", "dataset_summary": "Dataset automatically created during the evaluation run of model [bartowski/internlm2-chat-20b-llama](https://huggingface.co/bartowski/internlm2-chat-20b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bartowski__internlm2-chat-20b-llama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T04:32:43.004960](https://huggingface.co/datasets/open-llm-leaderboard/details_bartowski__internlm2-chat-20b-llama/blob/main/results_2024-01-20T04-32-43.004960.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6635549405277965,\n \"acc_stderr\": 0.031718531357649564,\n \"acc_norm\": 0.6709734088042791,\n \"acc_norm_stderr\": 0.032358019231685,\n \"mc1\": 0.3378212974296206,\n \"mc1_stderr\": 0.016557167322516882,\n \"mc2\": 0.4874293978250427,\n \"mc2_stderr\": 0.014540260066183534\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5998293515358362,\n \"acc_stderr\": 0.014317197787809174,\n \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068285\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6167098187612029,\n \"acc_stderr\": 0.0048519441706712605,\n \"acc_norm\": 0.8258315076677952,\n \"acc_norm_stderr\": 0.0037847921724660635\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.034765901043041336\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6468085106382979,\n \"acc_stderr\": 0.031245325202761926,\n \"acc_norm\": 0.6468085106382979,\n \"acc_norm_stderr\": 0.031245325202761926\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.543859649122807,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04082482904638629,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04082482904638629\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5132275132275133,\n \"acc_stderr\": 0.025742297289575142,\n \"acc_norm\": 0.5132275132275133,\n \"acc_norm_stderr\": 0.025742297289575142\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.03499113137676744,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.03499113137676744\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284325,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284325\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8383838383838383,\n \"acc_stderr\": 0.02622591986362928,\n \"acc_norm\": 0.8383838383838383,\n \"acc_norm_stderr\": 0.02622591986362928\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.02423353229775873,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.023507579020645358,\n \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.023507579020645358\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7310924369747899,\n \"acc_stderr\": 0.028801392193631276,\n \"acc_norm\": 0.7310924369747899,\n \"acc_norm_stderr\": 0.028801392193631276\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.44370860927152317,\n \"acc_stderr\": 0.04056527902281732,\n \"acc_norm\": 0.44370860927152317,\n \"acc_norm_stderr\": 0.04056527902281732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126255,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126255\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8676470588235294,\n \"acc_stderr\": 0.023784297520918853,\n \"acc_norm\": 0.8676470588235294,\n \"acc_norm_stderr\": 0.023784297520918853\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8312236286919831,\n \"acc_stderr\": 0.024381406832586234,\n \"acc_norm\": 0.8312236286919831,\n \"acc_norm_stderr\": 0.024381406832586234\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677,\n \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n \"acc_stderr\": 0.015813901283913048,\n \"acc_norm\": 0.33743016759776534,\n \"acc_norm_stderr\": 0.015813901283913048\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7427652733118971,\n \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.7427652733118971,\n \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5032594524119948,\n \"acc_stderr\": 0.012769964760343318,\n \"acc_norm\": 0.5032594524119948,\n \"acc_norm_stderr\": 0.012769964760343318\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528172,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528172\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101696,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101696\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3378212974296206,\n \"mc1_stderr\": 0.016557167322516882,\n \"mc2\": 0.4874293978250427,\n \"mc2_stderr\": 0.014540260066183534\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597204\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.33965125094768767,\n \"acc_stderr\": 0.013045045067665257\n }\n}\n```", "repo_url": 
"https://huggingface.co/bartowski/internlm2-chat-20b-llama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|arc:challenge|25_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|gsm8k|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hellaswag|10_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-32-43.004960.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-32-43.004960.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-32-43.004960.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T04-32-43.004960.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-32-43.004960.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T04_32_43.004960", "path": ["**/details_harness|winogrande|5_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T04-32-43.004960.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_20T04_32_43.004960", "path": ["results_2024-01-20T04-32-43.004960.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T04-32-43.004960.parquet"]}]}]} | 2024-01-20T04:35:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of bartowski/internlm2-chat-20b-llama
Dataset automatically created during the evaluation run of model bartowski/internlm2-chat-20b-llama on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
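For instance (a minimal sketch: the repo id below is inferred from the `details_<org>__<model>` naming pattern used for the other cards in this document, and `harness_winogrande_5` is one of the 63 task configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Load one task configuration; the "train" split always points at the latest run.
data = load_dataset("open-llm-leaderboard/details_bartowski__internlm2-chat-20b-llama",
	"harness_winogrande_5",
	split="train")
```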
## Latest results
These are the latest results from run 2024-01-20T04:32:43.004960 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
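The aggregated numbers quoted here live in the "results" configuration; a hedged sketch of pulling them directly (same inferred repo id as above):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics for every run;
# its "latest" split points at run 2024-01-20T04:32:43.004960.
results = load_dataset("open-llm-leaderboard/details_bartowski__internlm2-chat-20b-llama",
	"results",
	split="latest")
```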
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
635ba3e303122bed5c7cf5acfeb14906a6d38687 |
# Dataset Card for Evaluation run of Epiculous/Fett-uccine-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Epiculous/Fett-uccine-7B](https://huggingface.co/Epiculous/Fett-uccine-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Epiculous__Fett-uccine-7B",
"harness_winogrande_5",
split="train")
```
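Since this dataset was created from 2 runs, each configuration also keeps one split per run, named after the run timestamp (the exact split names appear in the config list at the bottom of this card); for example, to pin a specific run explicitly:

```python
from datasets import load_dataset

# Split names follow the run timestamps listed in this card's configs,
# e.g. "2024_01_20T04_36_48.555916" and "2024_01_20T05_02_15.433939".
data = load_dataset("open-llm-leaderboard/details_Epiculous__Fett-uccine-7B",
	"harness_winogrande_5",
	split="2024_01_20T05_02_15.433939")
```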
## Latest results
These are the [latest results from run 2024-01-20T05:02:15.433939](https://huggingface.co/datasets/open-llm-leaderboard/details_Epiculous__Fett-uccine-7B/blob/main/results_2024-01-20T05-02-15.433939.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6002294551650265,
"acc_stderr": 0.03330447072045356,
"acc_norm": 0.6052353264823649,
"acc_norm_stderr": 0.033978008348377026,
"mc1": 0.5189718482252142,
"mc1_stderr": 0.017490896405762357,
"mc2": 0.694687038148882,
"mc2_stderr": 0.015214491441624034
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225405,
"acc_norm": 0.6322525597269625,
"acc_norm_stderr": 0.014090995618168482
},
"harness|hellaswag|10": {
"acc": 0.6935869348735312,
"acc_stderr": 0.00460061200042267,
"acc_norm": 0.8608842859988051,
"acc_norm_stderr": 0.003453599726736564
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562413,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562413
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6193548387096774,
"acc_stderr": 0.02762171783290704,
"acc_norm": 0.6193548387096774,
"acc_norm_stderr": 0.02762171783290704
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.03095405547036589,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.03095405547036589
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.025217315184846486,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.025217315184846486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7871559633027523,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.7871559633027523,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7675606641123882,
"acc_stderr": 0.015104550008905718,
"acc_norm": 0.7675606641123882,
"acc_norm_stderr": 0.015104550008905718
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165538,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165538
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3329608938547486,
"acc_stderr": 0.015761716178397552,
"acc_norm": 0.3329608938547486,
"acc_norm_stderr": 0.015761716178397552
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026992544339297243,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026992544339297243
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622864,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622864
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928006,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.019610851474880286,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.019610851474880286
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.0389136449583582,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.0389136449583582
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727682,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727682
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5189718482252142,
"mc1_stderr": 0.017490896405762357,
"mc2": 0.694687038148882,
"mc2_stderr": 0.015214491441624034
},
"harness|winogrande|5": {
"acc": 0.7505919494869772,
"acc_stderr": 0.01216018919693069
},
"harness|gsm8k|5": {
"acc": 0.3661865049279757,
"acc_stderr": 0.013270100238748847
}
}
```
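The `*_stderr` fields are standard errors on the corresponding accuracies; a quick, hedged way to turn them into a confidence interval (normal approximation, not something the harness reports itself):

```python
# 95% CI for the ARC acc_norm above: acc +/- 1.96 * stderr.
acc, stderr = 0.6322525597269625, 0.014090995618168482  # harness|arc:challenge|25
low, high = acc - 1.96 * stderr, acc + 1.96 * stderr
print(f"ARC acc_norm 95% CI: [{low:.4f}, {high:.4f}]")  # ~ [0.6046, 0.6599]
```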
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Epiculous__Fett-uccine-7B | [
"region:us"
] | 2024-01-20T04:39:06+00:00 | {"pretty_name": "Evaluation run of Epiculous/Fett-uccine-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Epiculous/Fett-uccine-7B](https://huggingface.co/Epiculous/Fett-uccine-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Epiculous__Fett-uccine-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T05:02:15.433939](https://huggingface.co/datasets/open-llm-leaderboard/details_Epiculous__Fett-uccine-7B/blob/main/results_2024-01-20T05-02-15.433939.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6002294551650265,\n \"acc_stderr\": 0.03330447072045356,\n \"acc_norm\": 0.6052353264823649,\n \"acc_norm_stderr\": 0.033978008348377026,\n \"mc1\": 0.5189718482252142,\n \"mc1_stderr\": 0.017490896405762357,\n \"mc2\": 0.694687038148882,\n \"mc2_stderr\": 0.015214491441624034\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225405,\n \"acc_norm\": 0.6322525597269625,\n \"acc_norm_stderr\": 0.014090995618168482\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6935869348735312,\n \"acc_stderr\": 0.00460061200042267,\n \"acc_norm\": 0.8608842859988051,\n \"acc_norm_stderr\": 0.003453599726736564\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n 
\"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562413,\n \"acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562413\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6193548387096774,\n \"acc_stderr\": 0.02762171783290704,\n \"acc_norm\": 0.6193548387096774,\n \"acc_norm_stderr\": 0.02762171783290704\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.03095405547036589,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.03095405547036589\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846486,\n \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846486\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7675606641123882,\n \"acc_stderr\": 0.015104550008905718,\n \"acc_norm\": 0.7675606641123882,\n \"acc_norm_stderr\": 0.015104550008905718\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165538,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165538\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3329608938547486,\n \"acc_stderr\": 0.015761716178397552,\n \"acc_norm\": 0.3329608938547486,\n \"acc_norm_stderr\": 0.015761716178397552\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026992544339297243,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026992544339297243\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n \"acc_stderr\": 0.012654565234622864,\n \"acc_norm\": 0.43285528031290743,\n \"acc_norm_stderr\": 0.012654565234622864\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928006,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928006\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6225490196078431,\n \"acc_stderr\": 0.019610851474880286,\n \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.019610851474880286\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727682,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727682\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5189718482252142,\n \"mc1_stderr\": 0.017490896405762357,\n \"mc2\": 0.694687038148882,\n \"mc2_stderr\": 0.015214491441624034\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7505919494869772,\n \"acc_stderr\": 0.01216018919693069\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3661865049279757,\n \"acc_stderr\": 
0.013270100238748847\n }\n}\n```", "repo_url": "https://huggingface.co/Epiculous/Fett-uccine-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|arc:challenge|25_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|arc:challenge|25_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|gsm8k|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|gsm8k|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hellaswag|10_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hellaswag|10_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-36-48.555916.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T04-36-48.555916.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T05-02-15.433939.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-02-15.433939.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-02-15.433939.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T05-02-15.433939.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T05-02-15.433939.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": 
"2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-36-48.555916.parquet"]}, 
{"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["**/details_harness|winogrande|5_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": ["**/details_harness|winogrande|5_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T05-02-15.433939.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T04_36_48.555916", "path": ["results_2024-01-20T04-36-48.555916.parquet"]}, {"split": "2024_01_20T05_02_15.433939", "path": 
["results_2024-01-20T05-02-15.433939.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T05-02-15.433939.parquet"]}]}]} | 2024-01-20T05:04:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Epiculous/Fett-uccine-7B
Dataset automatically created during the evaluation run of model Epiculous/Fett-uccine-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
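A minimal sketch, assuming the details repository follows the standard `open-llm-leaderboard/details_<org>__<model>` naming pattern for this model:

```python
from datasets import load_dataset

# Repository name assumed from the standard Open LLM Leaderboard naming pattern.
data = load_dataset("open-llm-leaderboard/details_Epiculous__Fett-uccine-7B",
	"harness_winogrande_5",
	split="train")
```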
## Latest results
These are the latest results from run 2024-01-20T05:02:15.433939 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
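The aggregated numbers for this run can also be loaded programmatically; a minimal sketch, assuming the same repository naming pattern (the "results" configuration and the timestamped split name are taken from this dataset's configuration list):

```python
from datasets import load_dataset

# Aggregated metrics for the 2024-01-20T05:02:15 run; the "latest" split points to the same data here.
results = load_dataset("open-llm-leaderboard/details_Epiculous__Fett-uccine-7B",
	"results",
	split="2024_01_20T05_02_15.433939")
```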
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Epiculous/Fett-uccine-7B\n\n\n\nDataset automatically created during the evaluation run of model Epiculous/Fett-uccine-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T05:02:15.433939(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Epiculous/Fett-uccine-7B\n\n\n\nDataset automatically created during the evaluation run of model Epiculous/Fett-uccine-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T05:02:15.433939(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
873002ad889c18414fc287d7481f894546a8a682 |
# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3](https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3",
"harness_winogrande_5",
split="train")
```
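Since the aggregated metrics live in the "results" configuration and each configuration also exposes a "latest" split, the most recent aggregated results can be loaded directly; a minimal sketch, assuming those names carry over from the configuration list:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always points to the newest run.
results = load_dataset("open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3",
	"results",
	split="latest")
```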
## Latest results
These are the [latest results from run 2024-01-20T04:40:32.614362](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3/blob/main/results_2024-01-20T04-40-32.614362.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6144035496773548,
"acc_stderr": 0.032858739117399755,
"acc_norm": 0.6200519616024565,
"acc_norm_stderr": 0.03352475225298005,
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5789464689775264,
"mc2_stderr": 0.015807009741465705
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.014104578366491887,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.01383056892797433
},
"harness|hellaswag|10": {
"acc": 0.676458872734515,
"acc_stderr": 0.0046687106891924,
"acc_norm": 0.8584943238398726,
"acc_norm_stderr": 0.0034783009945146973
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709447,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501601,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501601
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489284,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489284
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917212,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917212
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865464,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379778,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379778
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.0196438015579248,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.0196438015579248
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5789464689775264,
"mc2_stderr": 0.015807009741465705
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183525
},
"harness|gsm8k|5": {
"acc": 0.3419257012888552,
"acc_stderr": 0.0130660896251828
}
}
```
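For programmatic access, the aggregated metrics shown above are also stored in the "results" configuration. Below is a minimal sketch using the standard `datasets` API; the exact column schema of the results parquet is not documented here, so inspect it before relying on specific field names.

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run;
# the "latest" split always points to the most recent results.
results = load_dataset(
    "open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3",
    "results",
    split="latest",
)
print(results)  # check the available columns before reading specific metrics
```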
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
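Although the full schema is not documented above, the per-task configurations described earlier can be enumerated directly. This is a minimal sketch assuming the standard `datasets` utilities; the config names follow the `harness_<task>_<n_shot>` pattern used elsewhere in this card.

```python
from datasets import get_dataset_config_names

# Each evaluated task has its own configuration
# (plus the aggregated "results" configuration).
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3"
)
print(len(configs))
print(configs[:5])  # e.g. harness_arc_challenge_25, harness_gsm8k_5, ...
```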
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-20T04:42:49+00:00 | {"pretty_name": "Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3", "dataset_summary": "Dataset automatically created during the evaluation run of model [UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3](https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T04:40:32.614362](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3/blob/main/results_2024-01-20T04-40-32.614362.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6144035496773548,\n \"acc_stderr\": 0.032858739117399755,\n \"acc_norm\": 0.6200519616024565,\n \"acc_norm_stderr\": 0.03352475225298005,\n \"mc1\": 0.4222766217870257,\n \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5789464689775264,\n \"mc2_stderr\": 0.015807009741465705\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.014104578366491887,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.01383056892797433\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.676458872734515,\n \"acc_stderr\": 0.0046687106891924,\n \"acc_norm\": 0.8584943238398726,\n \"acc_norm_stderr\": 0.0034783009945146973\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n \"acc_stderr\": 0.02518900666021238,\n \"acc_norm\": 0.7322580645161291,\n \"acc_norm_stderr\": 0.02518900666021238\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709447,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 
0.024639789097709447\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501601,\n \"acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501601\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489284,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489284\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917212,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917212\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n \"acc_stderr\": 0.016115235504865464,\n \"acc_norm\": 0.3664804469273743,\n \"acc_norm_stderr\": 0.016115235504865464\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303055,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303055\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n \"acc_stderr\": 0.012695244711379778,\n \"acc_norm\": 0.44589308996088656,\n \"acc_norm_stderr\": 0.012695244711379778\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.619281045751634,\n \"acc_stderr\": 0.0196438015579248,\n \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.0196438015579248\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4222766217870257,\n \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5789464689775264,\n \"mc2_stderr\": 0.015807009741465705\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.3419257012888552,\n \"acc_stderr\": 0.0130660896251828\n }\n}\n```", "repo_url": "https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|arc:challenge|25_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|gsm8k|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hellaswag|10_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-40-32.614362.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-40-32.614362.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-40-32.614362.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T04-40-32.614362.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-40-32.614362.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["**/details_harness|winogrande|5_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-20T04-40-32.614362.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T04_40_32.614362", "path": ["results_2024-01-20T04-40-32.614362.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T04-40-32.614362.parquet"]}]}]} | 2024-01-20T04:43:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3
Dataset automatically created during the evaluation run of model UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
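A minimal sketch, assuming the repository follows the leaderboard's `details_<org>__<model>` naming convention (the repo id below is inferred from that convention, not verified):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3",
	"harness_winogrande_5",
	split="train")
```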
## Latest results
These are the latest results from run 2024-01-20T04:40:32.614362 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T04:40:32.614362(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T04:40:32.614362(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1eb12443bbf30324edb2dbd106d73cb8d39813b2 |
## Python Copilot Image Training using Import Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each row contains a png file in the **dbytes** column.
- Rows: 216642
- Size: 211.2 GB
- Data type: png
- Format: Knowledge graph using NetworkX with alpaca text box
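This is not the matlok rendering pipeline itself, but a hedged illustration of the format named above: a toy import knowledge graph drawn with NetworkX (all module names invented):

```python
import networkx as nx
import matplotlib.pyplot as plt

# Toy import graph: one source file pointing at the modules it imports.
g = nx.DiGraph()
g.add_edges_from([("my_module.py", "os"), ("my_module.py", "json"), ("my_module.py", "requests")])

pos = nx.spring_layout(g, seed=0)
nx.draw_networkx(g, pos, node_color="lightblue", font_size=8)
plt.figtext(0.5, 0.02, "alpaca-style text box would go here", ha="center")
plt.savefig("import_graph.png", dpi=150)
```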
### Schema
The png is in the **dbytes** column:
```
{
"dbytes": "binary",
"dbytes_len": "int64",
"dbytes_mb": "float64",
"filename": "string",
"path": "string",
"repo": "string",
"type": "string"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-image-copilot-training-using-import-knowledge-graphs", data_dir="files")
```
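Once loaded, each row can be decoded back into an image; a minimal sketch (the `train` split name is an assumption, inspect `ds` for the actual splits):

```python
import io
from PIL import Image

row = ds["train"][0]                          # split name assumed
img = Image.open(io.BytesIO(row["dbytes"]))   # dbytes holds the raw PNG bytes
print(row["filename"], row["repo"], f"{row['dbytes_mb']:.2f} MB")
img.save("example.png")
```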
| matlok/python-image-copilot-training-using-import-knowledge-graphs | [
"task_categories:text-to-image",
"task_categories:image-to-image",
"task_categories:question-answering",
"task_ids:parsing",
"size_categories:100K<n<1M",
"license:other",
"python-copilot",
"python-coding",
"python-architecture",
"knowledge-graphs",
"multimodal",
"text-image-audio",
"fine-tuning",
"training",
"question-answering",
"image-knowledge-graph",
"alpaca",
"mp3",
"png",
"text",
"instruct",
"import",
"imports",
"region:us"
] | 2024-01-20T04:52:22+00:00 | {"license": ["other"], "size_categories": ["100K<n<1M"], "task_categories": ["text-to-image", "image-to-image", "question-answering"], "task_ids": ["parsing"], "pretty_name": "python copilot image training using import knowledge graphs", "dataset_info": [{"config_name": "view_schema", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "view_schema", "data_files": [{"split": "view_schema", "path": "files/lok-python-copilot-img.import-v1_00000780.parquet"}]}], "tags": ["python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "import", "imports"]} | 2024-01-25T18:52:32+00:00 | [] | [] | TAGS
#task_categories-text-to-image #task_categories-image-to-image #task_categories-question-answering #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #import #imports #region-us
|
## Python Copilot Image Training using Import Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.
### Details
Each row contains a png file in the dbytes column.
- Rows: 216642
- Size: 211.2 GB
- Data type: png
- Format: Knowledge graph using NetworkX with alpaca text box
### Schema
The png is in the dbytes column:
### How to use the dataset
| [
"## Python Copilot Image Training using Import Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains a png file in the dbytes column.\n\n- Rows: 216642\n- Size: 211.2 GB\n- Data type: png\n- Format: Knowledge graph using NetworkX with alpaca text box",
"### Schema\n\nThe png is in the dbytes column:",
"### How to use the dataset"
] | [
"TAGS\n#task_categories-text-to-image #task_categories-image-to-image #task_categories-question-answering #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #import #imports #region-us \n",
"## Python Copilot Image Training using Import Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains a png file in the dbytes column.\n\n- Rows: 216642\n- Size: 211.2 GB\n- Data type: png\n- Format: Knowledge graph using NetworkX with alpaca text box",
"### Schema\n\nThe png is in the dbytes column:",
"### How to use the dataset"
] |
ca97569afec05f0da02e39db2492cd0f773ec892 |
This dataset contains the embeddings and cross cosine similarities for ~76k English nouns, verbs, and adjectives from [Princeton's WordNet database](https://wordnet.princeton.edu/) (a brief cosine-similarity sketch follows this record). | SpellcraftAI/wordnet | [
"license:mit",
"region:us"
] | 2024-01-20T05:15:22+00:00 | {"license": "mit"} | 2024-01-22T11:49:31+00:00 | [] | [] | TAGS
#license-mit #region-us
|
This dataset contains the embeddings and cross cosine similarity for ~76k English nouns, verbs, and adjectives from Princeton's WordNet database. | [] | [
"TAGS\n#license-mit #region-us \n"
] |
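For reference, a minimal sketch of the cosine-similarity quantity the SpellcraftAI/wordnet card above refers to (toy vectors only; the parquet column layout is not documented here):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-ins for two word embeddings; real vectors come from the dataset.
print(cosine_similarity(np.array([1.0, 0.0]), np.array([1.0, 1.0])))  # ~0.7071
```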
5de828e8f1872d32238590620dc4c32cf5b5da35 | <div style="display: flex; justify-content: flex-start;">
<div style="flex: 1;">
<figure>
<img src="images/000135.jpg" alt="base image" style="width:100%; height:auto;">
<figcaption>Base</figcaption>
</figure>
</div>
<div style="flex: 1;">
<figure>
<img src="images/000135-nightshade-intensity-LOW-V1.jpg" alt="high" style="width:100%; height:auto;">
<figcaption>Intensity Low</figcaption>
</figure>
</div>
</div>
<div style="display: flex; justify-content: flex-start;">
<div style="flex: 1;">
<figure>
<img src="images/000135-nightshade-intensity-DEFAULT-V1.jpg" alt="high" style="width:100%; height:auto;">
<figcaption>Intensity Default</figcaption>
</figure>
</div>
<div style="flex: 1;">
<figure>
<img src="images/000135-nightshade-intensity-HIGH-V1.jpg" alt="high" style="width:100%; height:auto;">
<figcaption>Intensity High</figcaption>
</figure>
</div>
</div>
---
license: mit
language:
- ja
- en
tags:
- art
size_categories:
- 1K<n<10K
--- | YYXMM/NightShadeAnimeImages | [
"region:us"
] | 2024-01-20T05:16:08+00:00 | {} | 2024-01-24T08:59:42+00:00 | [] | [] | TAGS
#region-us
| <div style="display: flex; justify-content: flex-start;">
<div style="flex: 1;">
<figure>
<img src="images/URL" alt="base image" style="width:100%; height:auto;">
<figcaption>Base</figcaption>
</figure>
</div>
<div style="flex: 1;">
<figure>
<img src="images/URL" alt="high" style="width:100%; height:auto;">
<figcaption>Intensity Low</figcaption>
</figure>
</div>
</div>
<div style="display: flex; justify-content: flex-start;">
<div style="flex: 1;">
<figure>
<img src="images/URL" alt="high" style="width:100%; height:auto;">
<figcaption>Intensity Default</figcaption>
</figure>
</div>
<div style="flex: 1;">
<figure>
<img src="images/URL" alt="high" style="width:100%; height:auto;">
<figcaption>Intensity High</figcaption>
</figure>
</div>
</div>
---
license: mit
language:
- ja
- en
tags:
- art
size_categories:
- 1K<n<10K
--- | [] | [
"TAGS\n#region-us \n"
] |
dd82665146ef553891e1ce1a2f42cea7aa1ec163 |
# Dataset Card for Evaluation run of KnutJaegersberg/Qwen-1_8b-EverythingLM
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Qwen-1_8b-EverythingLM](https://huggingface.co/KnutJaegersberg/Qwen-1_8b-EverythingLM) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8b-EverythingLM",
"harness_winogrande_5",
split="train")
```
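A quick way to inspect the loaded split, assuming the snippet above has run:

```python
df = data.to_pandas()          # datasets.Dataset -> pandas.DataFrame
print(len(df), "rows")
print(df.columns.tolist())
```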
## Latest results
These are the [latest results from run 2024-01-20T05:24:42.561432](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8b-EverythingLM/blob/main/results_2024-01-20T05-24-42.561432.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4452420564139058,
"acc_stderr": 0.034654847367197227,
"acc_norm": 0.45130700872767276,
"acc_norm_stderr": 0.03544153323888102,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041843,
"mc2": 0.3870197968922305,
"mc2_stderr": 0.015286933466885854
},
"harness|arc:challenge|25": {
"acc": 0.35238907849829354,
"acc_stderr": 0.013960142600598678,
"acc_norm": 0.386518771331058,
"acc_norm_stderr": 0.01423008476191048
},
"harness|hellaswag|10": {
"acc": 0.4763991236805417,
"acc_stderr": 0.00498421968173266,
"acc_norm": 0.6265684126667994,
"acc_norm_stderr": 0.004827266662144033
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480863,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480863
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.040089737857792046,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.040089737857792046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5056603773584906,
"acc_stderr": 0.030770900763851302,
"acc_norm": 0.5056603773584906,
"acc_norm_stderr": 0.030770900763851302
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.041014055198424264,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.041014055198424264
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.023135287974325628,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.023135287974325628
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.038095238095238106,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.038095238095238106
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.03445487686264715,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.03445487686264715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.51010101010101,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.51010101010101,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5647668393782384,
"acc_stderr": 0.035780381650085846,
"acc_norm": 0.5647668393782384,
"acc_norm_stderr": 0.035780381650085846
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3769230769230769,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.3769230769230769,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5504587155963303,
"acc_stderr": 0.02132788141782336,
"acc_norm": 0.5504587155963303,
"acc_norm_stderr": 0.02132788141782336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03141554629402544,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03141554629402544
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.03498501649369527,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.03498501649369527
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5485232067510548,
"acc_stderr": 0.0323936001739747,
"acc_norm": 0.5485232067510548,
"acc_norm_stderr": 0.0323936001739747
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5022421524663677,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.5022421524663677,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.04384140024078016,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.04384140024078016
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4110429447852761,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.4110429447852761,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.029996951858349483,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.029996951858349483
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5951468710089399,
"acc_stderr": 0.01755324646772026,
"acc_norm": 0.5951468710089399,
"acc_norm_stderr": 0.01755324646772026
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.02691189868637792,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.02691189868637792
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225627,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225627
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.02843109544417665,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.02843109544417665
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4887459807073955,
"acc_stderr": 0.028390897396863533,
"acc_norm": 0.4887459807073955,
"acc_norm_stderr": 0.028390897396863533
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.47530864197530864,
"acc_stderr": 0.02778680093142745,
"acc_norm": 0.47530864197530864,
"acc_norm_stderr": 0.02778680093142745
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.02826765748265014,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.02826765748265014
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3389830508474576,
"acc_stderr": 0.012089941857584479,
"acc_norm": 0.3389830508474576,
"acc_norm_stderr": 0.012089941857584479
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.029520095697687765,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.029520095697687765
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4133986928104575,
"acc_stderr": 0.01992211568278667,
"acc_norm": 0.4133986928104575,
"acc_norm_stderr": 0.01992211568278667
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46938775510204084,
"acc_stderr": 0.031949171367580624,
"acc_norm": 0.46938775510204084,
"acc_norm_stderr": 0.031949171367580624
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6268656716417911,
"acc_stderr": 0.034198326081760065,
"acc_norm": 0.6268656716417911,
"acc_norm_stderr": 0.034198326081760065
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.038200425866029654,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.038200425866029654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041843,
"mc2": 0.3870197968922305,
"mc2_stderr": 0.015286933466885854
},
"harness|winogrande|5": {
"acc": 0.5895816890292028,
"acc_stderr": 0.013825107120035865
},
"harness|gsm8k|5": {
"acc": 0.12736921910538287,
"acc_stderr": 0.009183110326737833
}
}
```
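To pull a single task's numbers out of the results above, a minimal sketch (assuming the JSON block has been saved locally as `results.json`, a hypothetical filename):

```python
import json

with open("results.json") as f:   # hypothetical local copy of the block above
    results = json.load(f)

task = results["harness|hendrycksTest-world_religions|5"]
print(f"acc = {task['acc']:.4f} ± {task['acc_stderr']:.4f}")
```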
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8b-EverythingLM | [
"region:us"
] | 2024-01-20T05:26:53+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/Qwen-1_8b-EverythingLM", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/Qwen-1_8b-EverythingLM](https://huggingface.co/KnutJaegersberg/Qwen-1_8b-EverythingLM) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8b-EverythingLM\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T05:24:42.561432](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8b-EverythingLM/blob/main/results_2024-01-20T05-24-42.561432.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4452420564139058,\n \"acc_stderr\": 0.034654847367197227,\n \"acc_norm\": 0.45130700872767276,\n \"acc_norm_stderr\": 0.03544153323888102,\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.014896277441041843,\n \"mc2\": 0.3870197968922305,\n \"mc2_stderr\": 0.015286933466885854\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.35238907849829354,\n \"acc_stderr\": 0.013960142600598678,\n \"acc_norm\": 0.386518771331058,\n \"acc_norm_stderr\": 0.01423008476191048\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4763991236805417,\n \"acc_stderr\": 0.00498421968173266,\n \"acc_norm\": 0.6265684126667994,\n \"acc_norm_stderr\": 0.004827266662144033\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.04299268905480863,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.04299268905480863\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.040089737857792046,\n \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.040089737857792046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.030770900763851302,\n \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.030770900763851302\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.041014055198424264,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.041014055198424264\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.3872832369942196,\n \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2804232804232804,\n \"acc_stderr\": 0.023135287974325628,\n \"acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.023135287974325628\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.038095238095238106,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.038095238095238106\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.532258064516129,\n \"acc_stderr\": 0.028384747788813332,\n \"acc_norm\": 0.532258064516129,\n \"acc_norm_stderr\": 0.028384747788813332\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264715,\n \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264715\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.51010101010101,\n \"acc_stderr\": 0.035616254886737454,\n \"acc_norm\": 0.51010101010101,\n \"acc_norm_stderr\": 0.035616254886737454\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5647668393782384,\n \"acc_stderr\": 0.035780381650085846,\n \"acc_norm\": 0.5647668393782384,\n 
\"acc_norm_stderr\": 0.035780381650085846\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3769230769230769,\n \"acc_stderr\": 0.024570975364225995,\n \"acc_norm\": 0.3769230769230769,\n \"acc_norm_stderr\": 0.024570975364225995\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236152,\n \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236152\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5504587155963303,\n \"acc_stderr\": 0.02132788141782336,\n \"acc_norm\": 0.5504587155963303,\n \"acc_norm_stderr\": 0.02132788141782336\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.03141554629402544,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03141554629402544\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.03498501649369527,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.03498501649369527\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5485232067510548,\n \"acc_stderr\": 0.0323936001739747,\n \"acc_norm\": 0.5485232067510548,\n \"acc_norm_stderr\": 0.0323936001739747\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5022421524663677,\n \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.5022421524663677,\n \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.04384140024078016,\n \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.04384140024078016\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4110429447852761,\n \"acc_stderr\": 0.038656978537853624,\n \"acc_norm\": 0.4110429447852761,\n \"acc_norm_stderr\": 0.038656978537853624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n \"acc_stderr\": 0.029996951858349483,\n \"acc_norm\": 0.7008547008547008,\n \"acc_norm_stderr\": 0.029996951858349483\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5951468710089399,\n \"acc_stderr\": 0.01755324646772026,\n \"acc_norm\": 0.5951468710089399,\n \"acc_norm_stderr\": 0.01755324646772026\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.02691189868637792,\n \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.02691189868637792\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.014400296429225627,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.014400296429225627\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.02843109544417665,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.02843109544417665\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4887459807073955,\n \"acc_stderr\": 0.028390897396863533,\n \"acc_norm\": 0.4887459807073955,\n \"acc_norm_stderr\": 0.028390897396863533\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.47530864197530864,\n \"acc_stderr\": 0.02778680093142745,\n \"acc_norm\": 0.47530864197530864,\n \"acc_norm_stderr\": 0.02778680093142745\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.02826765748265014,\n \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.02826765748265014\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3389830508474576,\n \"acc_stderr\": 0.012089941857584479,\n \"acc_norm\": 0.3389830508474576,\n \"acc_norm_stderr\": 0.012089941857584479\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.029520095697687765,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.029520095697687765\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4133986928104575,\n \"acc_stderr\": 0.01992211568278667,\n \"acc_norm\": 0.4133986928104575,\n \"acc_norm_stderr\": 0.01992211568278667\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.46938775510204084,\n \"acc_stderr\": 0.031949171367580624,\n \"acc_norm\": 0.46938775510204084,\n \"acc_norm_stderr\": 0.031949171367580624\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6268656716417911,\n \"acc_stderr\": 0.034198326081760065,\n \"acc_norm\": 0.6268656716417911,\n \"acc_norm_stderr\": 0.034198326081760065\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.543859649122807,\n \"acc_stderr\": 0.038200425866029654,\n \"acc_norm\": 0.543859649122807,\n \"acc_norm_stderr\": 0.038200425866029654\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.014896277441041843,\n \"mc2\": 0.3870197968922305,\n \"mc2_stderr\": 0.015286933466885854\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5895816890292028,\n \"acc_stderr\": 0.013825107120035865\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.12736921910538287,\n \"acc_stderr\": 0.009183110326737833\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/Qwen-1_8b-EverythingLM", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|arc:challenge|25_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|gsm8k|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hellaswag|10_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-24-42.561432.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-24-42.561432.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-24-42.561432.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T05-24-42.561432.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-24-42.561432.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["**/details_harness|winogrande|5_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-20T05-24-42.561432.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T05_24_42.561432", "path": ["results_2024-01-20T05-24-42.561432.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T05-24-42.561432.parquet"]}]}]} | 2024-01-20T05:27:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KnutJaegersberg/Qwen-1_8b-EverythingLM
Dataset automatically created during the evaluation run of model KnutJaegersberg/Qwen-1_8b-EverythingLM on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
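A minimal sketch of that call, assuming the repository follows the `details_<org>__<model>` naming convention visible elsewhere in this dump (the exact repo id and config name should be verified on the Hub before relying on them):

```python
from datasets import load_dataset

# Repo id inferred from the model name "KnutJaegersberg/Qwen-1_8b-EverythingLM"
# by replacing "/" with "__", per the leaderboard's naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8b-EverythingLM",
    "harness_winogrande_5",  # any per-task config from this card works here
    split="train",           # "train" tracks the latest results
)
```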
## Latest results
These are the latest results from run 2024-01-20T05:24:42.561432 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of KnutJaegersberg/Qwen-1_8b-EverythingLM\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Qwen-1_8b-EverythingLM on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T05:24:42.561432(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KnutJaegersberg/Qwen-1_8b-EverythingLM\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Qwen-1_8b-EverythingLM on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T05:24:42.561432(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
bb9fd0e3a41583800157dd31c2b2e06ea0873775 |
# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yunconglong/7Bx4_DPO](https://huggingface.co/yunconglong/7Bx4_DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yunconglong__7Bx4_DPO",
"harness_winogrande_5",
split="train")
```
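As a follow-up, the aggregated metrics mentioned above live in the "results" config; a small sketch of pulling its "latest" split (both names appear in this card's metadata, though the row layout is an assumption worth checking):

```python
from datasets import load_dataset

# "results" holds the aggregated run metrics; the "latest" split always
# points at the newest results parquet for this evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_yunconglong__7Bx4_DPO",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated scores
```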
## Latest results
These are the [latest results from run 2024-01-20T05:35:44.285989](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__7Bx4_DPO/blob/main/results_2024-01-20T05-35-44.285989.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6520617363609051,
"acc_stderr": 0.03204309463044923,
"acc_norm": 0.6518681970654147,
"acc_norm_stderr": 0.03270530271674209,
"mc1": 0.5055079559363526,
"mc1_stderr": 0.01750243899045107,
"mc2": 0.6566429740244444,
"mc2_stderr": 0.014902439772800848
},
"harness|arc:challenge|25": {
"acc": 0.6732081911262798,
"acc_stderr": 0.013706665975587333,
"acc_norm": 0.6936860068259386,
"acc_norm_stderr": 0.013470584417276511
},
"harness|hellaswag|10": {
"acc": 0.6795459071898028,
"acc_stderr": 0.004656974162147995,
"acc_norm": 0.8688508265285799,
"acc_norm_stderr": 0.0033687354341613816
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473075,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887034,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887034
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.01509421569970048,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.01509421569970048
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474086,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474086
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066307,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.423463687150838,
"acc_stderr": 0.016525425898773503,
"acc_norm": 0.423463687150838,
"acc_norm_stderr": 0.016525425898773503
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826528,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826528
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532069,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532069
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.01916241858862356,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.01916241858862356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.02721283588407315,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.02721283588407315
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5055079559363526,
"mc1_stderr": 0.01750243899045107,
"mc2": 0.6566429740244444,
"mc2_stderr": 0.014902439772800848
},
"harness|winogrande|5": {
"acc": 0.8058405682715075,
"acc_stderr": 0.011116983392392664
},
"harness|gsm8k|5": {
"acc": 0.7194844579226687,
"acc_stderr": 0.012374608490929553
}
}
```
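Each task above also has its own per-sample details config; for instance, a sketch of loading the GSM8K details (the config name "harness_gsm8k_5" is declared in this card's metadata; the per-example field names are not spelled out in this card, so inspect the loaded dataset rather than assuming them):

```python
from datasets import load_dataset

# Per-task details config; expected to contain one row per evaluated
# example rather than the aggregated scores shown above.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_yunconglong__7Bx4_DPO",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details)  # shows the actual column names and row count
```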
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_yunconglong__7Bx4_DPO | [
"region:us"
] | 2024-01-20T05:38:00+00:00 | {"pretty_name": "Evaluation run of yunconglong/7Bx4_DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [yunconglong/7Bx4_DPO](https://huggingface.co/yunconglong/7Bx4_DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__7Bx4_DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T05:35:44.285989](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__7Bx4_DPO/blob/main/results_2024-01-20T05-35-44.285989.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6520617363609051,\n \"acc_stderr\": 0.03204309463044923,\n \"acc_norm\": 0.6518681970654147,\n \"acc_norm_stderr\": 0.03270530271674209,\n \"mc1\": 0.5055079559363526,\n \"mc1_stderr\": 0.01750243899045107,\n \"mc2\": 0.6566429740244444,\n \"mc2_stderr\": 0.014902439772800848\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6732081911262798,\n \"acc_stderr\": 0.013706665975587333,\n \"acc_norm\": 0.6936860068259386,\n \"acc_norm_stderr\": 0.013470584417276511\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6795459071898028,\n \"acc_stderr\": 0.004656974162147995,\n \"acc_norm\": 0.8688508265285799,\n \"acc_norm_stderr\": 0.0033687354341613816\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 
0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473075,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887034,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887034\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474086,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474086\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n 
\"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n \"acc_stderr\": 0.016525425898773503,\n \"acc_norm\": 0.423463687150838,\n \"acc_norm_stderr\": 0.016525425898773503\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826528,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826528\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n \"acc_stderr\": 0.012734923579532069,\n \"acc_norm\": 0.46284224250325945,\n \"acc_norm_stderr\": 0.012734923579532069\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.01916241858862356,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.01916241858862356\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.02721283588407315,\n \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.02721283588407315\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5055079559363526,\n \"mc1_stderr\": 0.01750243899045107,\n \"mc2\": 0.6566429740244444,\n \"mc2_stderr\": 0.014902439772800848\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8058405682715075,\n \"acc_stderr\": 0.011116983392392664\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7194844579226687,\n \"acc_stderr\": 0.012374608490929553\n }\n}\n```", "repo_url": "https://huggingface.co/yunconglong/7Bx4_DPO", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|arc:challenge|25_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|gsm8k|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hellaswag|10_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-35-44.285989.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-35-44.285989.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-35-44.285989.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T05-35-44.285989.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-35-44.285989.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-35-44.285989.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["**/details_harness|winogrande|5_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T05-35-44.285989.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T05_35_44.285989", "path": ["results_2024-01-20T05-35-44.285989.parquet"]}, {"split": "latest", "path": 
["results_2024-01-20T05-35-44.285989.parquet"]}]}]} | 2024-01-20T05:38:20+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO
Dataset automatically created during the evaluation run of model yunconglong/7Bx4_DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
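A minimal sketch of that call, matching the loading pattern used elsewhere in this document; the repository name `open-llm-leaderboard/details_yunconglong__7Bx4_DPO` and the `harness_winogrande_5` config are assumed from the leaderboard's naming conventions:
```python
from datasets import load_dataset

# one config per evaluated task; the "train" split always points
# at the latest run's results
data = load_dataset("open-llm-leaderboard/details_yunconglong__7Bx4_DPO",
	"harness_winogrande_5",
	split="train")
```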
## Latest results
These are the latest results from run 2024-01-20T05:35:44.285989 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/7Bx4_DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T05:35:44.285989(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/7Bx4_DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T05:35:44.285989(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
885f2f5efb6ecaa34ec2375ae5f70a063139737b |
# Dataset Card for Evaluation run of qnguyen3/quan-1.8b-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [qnguyen3/quan-1.8b-base](https://huggingface.co/qnguyen3/quan-1.8b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_qnguyen3__quan-1.8b-base",
"harness_winogrande_5",
split="train")
```
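The aggregated metrics shown under "Latest results" below live in the "results" configuration mentioned above. A minimal sketch of loading it; the config name "results" and the "latest" split are assumed from this card's conventions:
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run;
# the "latest" split points at the most recent timestamped run
results = load_dataset("open-llm-leaderboard/details_qnguyen3__quan-1.8b-base",
	"results",
	split="latest")
```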
## Latest results
These are the [latest results from run 2024-01-20T06:21:30.624094](https://huggingface.co/datasets/open-llm-leaderboard/details_qnguyen3__quan-1.8b-base/blob/main/results_2024-01-20T06-21-30.624094.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.45019557300373664,
"acc_stderr": 0.03466423700665641,
"acc_norm": 0.45513931402439484,
"acc_norm_stderr": 0.035421179165139774,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253597,
"mc2": 0.41601399964422436,
"mc2_stderr": 0.014507923663746555
},
"harness|arc:challenge|25": {
"acc": 0.3464163822525597,
"acc_stderr": 0.01390501118006325,
"acc_norm": 0.36945392491467577,
"acc_norm_stderr": 0.014104578366491894
},
"harness|hellaswag|10": {
"acc": 0.43995220075682134,
"acc_stderr": 0.004953667028654385,
"acc_norm": 0.5846444931288588,
"acc_norm_stderr": 0.004917761181740155
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5320754716981132,
"acc_stderr": 0.030709486992556552,
"acc_norm": 0.5320754716981132,
"acc_norm_stderr": 0.030709486992556552
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.043036840335373146,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.043036840335373146
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.041641887201693775,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.041641887201693775
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992062,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992062
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790605,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.033959703819985726,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.033959703819985726
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5858585858585859,
"acc_stderr": 0.03509438348879629,
"acc_norm": 0.5858585858585859,
"acc_norm_stderr": 0.03509438348879629
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5492227979274611,
"acc_stderr": 0.03590910952235524,
"acc_norm": 0.5492227979274611,
"acc_norm_stderr": 0.03590910952235524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4282051282051282,
"acc_stderr": 0.02508830145469484,
"acc_norm": 0.4282051282051282,
"acc_norm_stderr": 0.02508830145469484
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5614678899082569,
"acc_stderr": 0.021274713073954572,
"acc_norm": 0.5614678899082569,
"acc_norm_stderr": 0.021274713073954572
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.03495624522015477,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.03495624522015477
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5822784810126582,
"acc_stderr": 0.032103530322412685,
"acc_norm": 0.5822784810126582,
"acc_norm_stderr": 0.032103530322412685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.452914798206278,
"acc_stderr": 0.03340867501923324,
"acc_norm": 0.452914798206278,
"acc_norm_stderr": 0.03340867501923324
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5190839694656488,
"acc_stderr": 0.043820947055509867,
"acc_norm": 0.5190839694656488,
"acc_norm_stderr": 0.043820947055509867
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437056,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437056
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44171779141104295,
"acc_stderr": 0.039015918258361836,
"acc_norm": 0.44171779141104295,
"acc_norm_stderr": 0.039015918258361836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.02987257770889118,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.02987257770889118
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.565772669220945,
"acc_stderr": 0.017724589389677785,
"acc_norm": 0.565772669220945,
"acc_norm_stderr": 0.017724589389677785
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.476878612716763,
"acc_stderr": 0.026890297881303125,
"acc_norm": 0.476878612716763,
"acc_norm_stderr": 0.026890297881303125
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961438,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961438
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.028472938478033526,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.028472938478033526
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4758842443729904,
"acc_stderr": 0.028365041542564577,
"acc_norm": 0.4758842443729904,
"acc_norm_stderr": 0.028365041542564577
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4691358024691358,
"acc_stderr": 0.02776768960683392,
"acc_norm": 0.4691358024691358,
"acc_norm_stderr": 0.02776768960683392
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611317,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611317
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3513689700130378,
"acc_stderr": 0.01219296945748402,
"acc_norm": 0.3513689700130378,
"acc_norm_stderr": 0.01219296945748402
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41911764705882354,
"acc_stderr": 0.029972807170464622,
"acc_norm": 0.41911764705882354,
"acc_norm_stderr": 0.029972807170464622
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.020017629214213104,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.020017629214213104
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46530612244897956,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.46530612244897956,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5771144278606966,
"acc_stderr": 0.034932317774212816,
"acc_norm": 0.5771144278606966,
"acc_norm_stderr": 0.034932317774212816
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.0378913442461155,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.0378913442461155
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5672514619883041,
"acc_stderr": 0.03799978644370606,
"acc_norm": 0.5672514619883041,
"acc_norm_stderr": 0.03799978644370606
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253597,
"mc2": 0.41601399964422436,
"mc2_stderr": 0.014507923663746555
},
"harness|winogrande|5": {
"acc": 0.579321231254933,
"acc_stderr": 0.013874526372008315
},
"harness|gsm8k|5": {
"acc": 0.19711902956785443,
"acc_stderr": 0.010958021630300654
}
}
```
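Since the per-subject MMLU scores above all share the `harness|hendrycksTest-*` key prefix, here is a small, self-contained sketch of how one might aggregate them (the two illustrative entries are copied from the JSON above; the variable names are hypothetical):
```python
# average per-subject MMLU accuracies from a results dict shaped like
# the JSON above (only two illustrative entries included here)
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.3851851851851852},
}
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(sum(mmlu_accs) / len(mmlu_accs))  # mean accuracy across subjects
```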
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_qnguyen3__quan-1.8b-base | [
"region:us"
# Dataset Card for Evaluation run of qnguyen3/quan-1.8b-base

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [qnguyen3/quan-1.8b-base](https://huggingface.co/qnguyen3/quan-1.8b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_qnguyen3__quan-1.8b-base",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-01-20T06:21:30.624094](https://huggingface.co/datasets/open-llm-leaderboard/details_qnguyen3__quan-1.8b-base/blob/main/results_2024-01-20T06-21-30.624094.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.45019557300373664,
        "acc_stderr": 0.03466423700665641,
        "acc_norm": 0.45513931402439484,
        "acc_norm_stderr": 0.035421179165139774,
        "mc1": 0.26560587515299877,
        "mc1_stderr": 0.015461027627253597,
        "mc2": 0.41601399964422436,
        "mc2_stderr": 0.014507923663746555
    },
    "harness|arc:challenge|25": {
        "acc": 0.3464163822525597,
        "acc_stderr": 0.01390501118006325,
        "acc_norm": 0.36945392491467577,
        "acc_norm_stderr": 0.014104578366491894
    },
    "harness|hellaswag|10": {
        "acc": 0.43995220075682134,
        "acc_stderr": 0.004953667028654385,
        "acc_norm": 0.5846444931288588,
        "acc_norm_stderr": 0.004917761181740155
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.34,
        "acc_stderr": 0.04760952285695236,
        "acc_norm": 0.34,
        "acc_norm_stderr": 0.04760952285695236
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.3851851851851852,
        "acc_stderr": 0.042039210401562783,
        "acc_norm": 0.3851851851851852,
        "acc_norm_stderr": 0.042039210401562783
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.5197368421052632,
        "acc_stderr": 0.040657710025626036,
        "acc_norm": 0.5197368421052632,
        "acc_norm_stderr": 0.040657710025626036
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.5,
        "acc_stderr": 0.050251890762960605,
        "acc_norm": 0.5,
        "acc_norm_stderr": 0.050251890762960605
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.5320754716981132,
        "acc_stderr": 0.030709486992556552,
        "acc_norm": 0.5320754716981132,
        "acc_norm_stderr": 0.030709486992556552
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.4375,
        "acc_stderr": 0.04148415739394154,
        "acc_norm": 0.4375,
        "acc_norm_stderr": 0.04148415739394154
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.39,
        "acc_stderr": 0.04902071300001975,
        "acc_norm": 0.39,
        "acc_norm_stderr": 0.04902071300001975
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.45,
        "acc_stderr": 0.049999999999999996,
        "acc_norm": 0.45,
        "acc_norm_stderr": 0.049999999999999996
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.29,
        "acc_stderr": 0.045604802157206845,
        "acc_norm": 0.29,
        "acc_norm_stderr": 0.045604802157206845
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.4277456647398844,
        "acc_stderr": 0.037724468575180255,
        "acc_norm": 0.4277456647398844,
        "acc_norm_stderr": 0.037724468575180255
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.29411764705882354,
        "acc_stderr": 0.04533838195929775,
        "acc_norm": 0.29411764705882354,
        "acc_norm_stderr": 0.04533838195929775
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.57,
        "acc_stderr": 0.04975698519562428,
        "acc_norm": 0.57,
        "acc_norm_stderr": 0.04975698519562428
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.4085106382978723,
        "acc_stderr": 0.03213418026701576,
        "acc_norm": 0.4085106382978723,
        "acc_norm_stderr": 0.03213418026701576
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.2982456140350877,
        "acc_stderr": 0.043036840335373146,
        "acc_norm": 0.2982456140350877,
        "acc_norm_stderr": 0.043036840335373146
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.4827586206896552,
        "acc_stderr": 0.041641887201693775,
        "acc_norm": 0.4827586206896552,
        "acc_norm_stderr": 0.041641887201693775
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.31746031746031744,
        "acc_stderr": 0.023973861998992062,
        "acc_norm": 0.31746031746031744,
        "acc_norm_stderr": 0.023973861998992062
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.23015873015873015,
        "acc_stderr": 0.03764950879790605,
        "acc_norm": 0.23015873015873015,
        "acc_norm_stderr": 0.03764950879790605
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.31,
        "acc_stderr": 0.04648231987117316,
        "acc_norm": 0.31,
        "acc_norm_stderr": 0.04648231987117316
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.532258064516129,
        "acc_stderr": 0.028384747788813332,
        "acc_norm": 0.532258064516129,
        "acc_norm_stderr": 0.028384747788813332
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.3694581280788177,
        "acc_stderr": 0.033959703819985726,
        "acc_norm": 0.3694581280788177,
        "acc_norm_stderr": 0.033959703819985726
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.46,
        "acc_stderr": 0.05009082659620332,
        "acc_norm": 0.46,
        "acc_norm_stderr": 0.05009082659620332
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.5515151515151515,
        "acc_stderr": 0.038835659779569286,
        "acc_norm": 0.5515151515151515,
        "acc_norm_stderr": 0.038835659779569286
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.5858585858585859,
        "acc_stderr": 0.03509438348879629,
        "acc_norm": 0.5858585858585859,
        "acc_norm_stderr": 0.03509438348879629
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.5492227979274611,
        "acc_stderr": 0.03590910952235524,
        "acc_norm": 0.5492227979274611,
        "acc_norm_stderr": 0.03590910952235524
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.4282051282051282,
        "acc_stderr": 0.02508830145469484,
        "acc_norm": 0.4282051282051282,
        "acc_norm_stderr": 0.02508830145469484
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.3,
        "acc_stderr": 0.027940457136228405,
        "acc_norm": 0.3,
        "acc_norm_stderr": 0.027940457136228405
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.41596638655462187,
        "acc_stderr": 0.03201650100739615,
        "acc_norm": 0.41596638655462187,
        "acc_norm_stderr": 0.03201650100739615
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.2781456953642384,
        "acc_stderr": 0.03658603262763743,
        "acc_norm": 0.2781456953642384,
        "acc_norm_stderr": 0.03658603262763743
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.5614678899082569,
        "acc_stderr": 0.021274713073954572,
        "acc_norm": 0.5614678899082569,
        "acc_norm_stderr": 0.021274713073954572
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.3472222222222222,
        "acc_stderr": 0.032468872436376486,
        "acc_norm": 0.3472222222222222,
        "acc_norm_stderr": 0.032468872436376486
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.5441176470588235,
        "acc_stderr": 0.03495624522015477,
        "acc_norm": 0.5441176470588235,
        "acc_norm_stderr": 0.03495624522015477
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.5822784810126582,
        "acc_stderr": 0.032103530322412685,
        "acc_norm": 0.5822784810126582,
        "acc_norm_stderr": 0.032103530322412685
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.452914798206278,
        "acc_stderr": 0.03340867501923324,
        "acc_norm": 0.452914798206278,
        "acc_norm_stderr": 0.03340867501923324
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.5190839694656488,
        "acc_stderr": 0.043820947055509867,
        "acc_norm": 0.5190839694656488,
        "acc_norm_stderr": 0.043820947055509867
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.6363636363636364,
        "acc_stderr": 0.043913262867240704,
        "acc_norm": 0.6363636363636364,
        "acc_norm_stderr": 0.043913262867240704
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.5092592592592593,
        "acc_stderr": 0.04832853553437056,
        "acc_norm": 0.5092592592592593,
        "acc_norm_stderr": 0.04832853553437056
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.44171779141104295,
        "acc_stderr": 0.039015918258361836,
        "acc_norm": 0.44171779141104295,
        "acc_norm_stderr": 0.039015918258361836
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.3125,
        "acc_stderr": 0.043994650575715215,
        "acc_norm": 0.3125,
        "acc_norm_stderr": 0.043994650575715215
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.6893203883495146,
        "acc_stderr": 0.0458212416016155,
        "acc_norm": 0.6893203883495146,
        "acc_norm_stderr": 0.0458212416016155
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.7051282051282052,
        "acc_stderr": 0.02987257770889118,
        "acc_norm": 0.7051282051282052,
        "acc_norm_stderr": 0.02987257770889118
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.44,
        "acc_stderr": 0.049888765156985884,
        "acc_norm": 0.44,
        "acc_norm_stderr": 0.049888765156985884
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.565772669220945,
        "acc_stderr": 0.017724589389677785,
        "acc_norm": 0.565772669220945,
        "acc_norm_stderr": 0.017724589389677785
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.476878612716763,
        "acc_stderr": 0.026890297881303125,
        "acc_norm": 0.476878612716763,
        "acc_norm_stderr": 0.026890297881303125
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.24134078212290502,
        "acc_stderr": 0.014310999547961438,
        "acc_norm": 0.24134078212290502,
        "acc_norm_stderr": 0.014310999547961438
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.5522875816993464,
        "acc_stderr": 0.028472938478033526,
        "acc_norm": 0.5522875816993464,
        "acc_norm_stderr": 0.028472938478033526
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.4758842443729904,
        "acc_stderr": 0.028365041542564577,
        "acc_norm": 0.4758842443729904,
        "acc_norm_stderr": 0.028365041542564577
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.4691358024691358,
        "acc_stderr": 0.02776768960683392,
        "acc_norm": 0.4691358024691358,
        "acc_norm_stderr": 0.02776768960683392
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.34397163120567376,
        "acc_stderr": 0.028338017428611317,
        "acc_norm": 0.34397163120567376,
        "acc_norm_stderr": 0.028338017428611317
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.3513689700130378,
        "acc_stderr": 0.01219296945748402,
        "acc_norm": 0.3513689700130378,
        "acc_norm_stderr": 0.01219296945748402
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.41911764705882354,
        "acc_stderr": 0.029972807170464622,
        "acc_norm": 0.41911764705882354,
        "acc_norm_stderr": 0.029972807170464622
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.42810457516339867,
        "acc_stderr": 0.020017629214213104,
        "acc_norm": 0.42810457516339867,
        "acc_norm_stderr": 0.020017629214213104
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.5545454545454546,
        "acc_stderr": 0.047605488214603246,
        "acc_norm": 0.5545454545454546,
        "acc_norm_stderr": 0.047605488214603246
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.46530612244897956,
        "acc_stderr": 0.03193207024425314,
        "acc_norm": 0.46530612244897956,
        "acc_norm_stderr": 0.03193207024425314
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.5771144278606966,
        "acc_stderr": 0.034932317774212816,
        "acc_norm": 0.5771144278606966,
        "acc_norm_stderr": 0.034932317774212816
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.63,
        "acc_stderr": 0.04852365870939099,
        "acc_norm": 0.63,
        "acc_norm_stderr": 0.04852365870939099
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.3855421686746988,
        "acc_stderr": 0.0378913442461155,
        "acc_norm": 0.3855421686746988,
        "acc_norm_stderr": 0.0378913442461155
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.5672514619883041,
        "acc_stderr": 0.03799978644370606,
        "acc_norm": 0.5672514619883041,
        "acc_norm_stderr": 0.03799978644370606
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.26560587515299877,
        "mc1_stderr": 0.015461027627253597,
        "mc2": 0.41601399964422436,
        "mc2_stderr": 0.014507923663746555
    },
    "harness|winogrande|5": {
        "acc": 0.579321231254933,
        "acc_stderr": 0.013874526372008315
    },
    "harness|gsm8k|5": {
        "acc": 0.19711902956785443,
        "acc_stderr": 0.010958021630300654
    }
}
```
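Since the aggregated numbers above are also stored in the dedicated "results" configuration, they can be pulled programmatically instead of copied from this card. A minimal sketch, assuming the "latest" split naming used by this repo's configurations; inspect the returned schema before relying on specific column names:

```python
from datasets import load_dataset

# Load the aggregated results table for the most recent run.
agg = load_dataset("open-llm-leaderboard/details_qnguyen3__quan-1.8b-base",
                   "results",
                   split="latest")
print(agg)
```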
## Dataset Details
### Dataset Description
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
# Dataset Card for Evaluation run of luqmanxyz/Maya_Hermes-2.5-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [luqmanxyz/Maya_Hermes-2.5-Mistral-7B](https://huggingface.co/luqmanxyz/Maya_Hermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_luqmanxyz__Maya_Hermes-2.5-Mistral-7B",
"harness_winogrande_5",
split="train")
```
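If you would rather discover the configuration names than hard-code one, the `datasets` client can enumerate them. A small sketch; the per-task config names follow the `harness_<task>_<n-shot>` pattern used elsewhere in these leaderboard repos, and `"latest"` is the split that always points at the newest run:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_luqmanxyz__Maya_Hermes-2.5-Mistral-7B"

# Lists the per-task configs (e.g. "harness_winogrande_5") plus "results".
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Per-sample details for one task, taken from the most recent run.
details = load_dataset(repo, "harness_winogrande_5", split="latest")
print(details.to_pandas().head())  # requires pandas for tabular inspection
```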
## Latest results
These are the [latest results from run 2024-01-20T06:23:44.665870](https://huggingface.co/datasets/open-llm-leaderboard/details_luqmanxyz__Maya_Hermes-2.5-Mistral-7B/blob/main/results_2024-01-20T06-23-44.665870.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6350225466068612,
"acc_stderr": 0.032278056538100994,
"acc_norm": 0.6365281938421351,
"acc_norm_stderr": 0.03292564993365088,
"mc1": 0.38310893512851896,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.5588800140985622,
"mc2_stderr": 0.01534554020333979
},
"harness|arc:challenge|25": {
"acc": 0.6228668941979523,
"acc_stderr": 0.014163366896192603,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902276
},
"harness|hellaswag|10": {
"acc": 0.6611232822146983,
"acc_stderr": 0.004723605376936913,
"acc_norm": 0.8507269468233419,
"acc_norm_stderr": 0.0035562912320503525
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.0245375915728305,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.0245375915728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977927,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163227,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163227
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02759917430064077,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02759917430064077
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32625698324022345,
"acc_stderr": 0.01568044151888918,
"acc_norm": 0.32625698324022345,
"acc_norm_stderr": 0.01568044151888918
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653342,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653342
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38310893512851896,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.5588800140985622,
"mc2_stderr": 0.01534554020333979
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223188
},
"harness|gsm8k|5": {
"acc": 0.6224412433661866,
"acc_stderr": 0.013353150666358532
}
}
```
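If you only need the aggregated numbers shown above rather than the per-sample details, they are also exposed through the "results" configuration described earlier. The sketch below is a minimal example, assuming the `datasets` library is installed; the exact column layout of the results payload may vary between harness versions, so inspect it before relying on a schema.

```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split always points to the
# most recent evaluation run stored in this repository.
results = load_dataset(
    "open-llm-leaderboard/details_luqmanxyz__Maya_Hermes-2.5-Mistral-7B",
    "results",
    split="latest",
)

# Check what the aggregated payload contains before parsing it further.
print(results.column_names)
```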
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_luqmanxyz__Maya_Hermes-2.5-Mistral-7B | [
"region:us"
] | 2024-01-20T06:26:03+00:00 | {"pretty_name": "Evaluation run of luqmanxyz/Maya_Hermes-2.5-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [luqmanxyz/Maya_Hermes-2.5-Mistral-7B](https://huggingface.co/luqmanxyz/Maya_Hermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luqmanxyz__Maya_Hermes-2.5-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T06:23:44.665870](https://huggingface.co/datasets/open-llm-leaderboard/details_luqmanxyz__Maya_Hermes-2.5-Mistral-7B/blob/main/results_2024-01-20T06-23-44.665870.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6350225466068612,\n \"acc_stderr\": 0.032278056538100994,\n \"acc_norm\": 0.6365281938421351,\n \"acc_norm_stderr\": 0.03292564993365088,\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5588800140985622,\n \"mc2_stderr\": 0.01534554020333979\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6228668941979523,\n \"acc_stderr\": 0.014163366896192603,\n \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902276\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6611232822146983,\n \"acc_stderr\": 0.004723605376936913,\n \"acc_norm\": 0.8507269468233419,\n \"acc_norm_stderr\": 0.0035562912320503525\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n 
\"acc_norm_stderr\": 0.023381935348121437\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.0245375915728305,\n \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.0245375915728305\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163227,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163227\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02759917430064077,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02759917430064077\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286774,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286774\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32625698324022345,\n \"acc_stderr\": 0.01568044151888918,\n \"acc_norm\": 0.32625698324022345,\n \"acc_norm_stderr\": 0.01568044151888918\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653342,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653342\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5588800140985622,\n \"mc2_stderr\": 0.01534554020333979\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223188\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6224412433661866,\n \"acc_stderr\": 0.013353150666358532\n }\n}\n```", "repo_url": "https://huggingface.co/luqmanxyz/Maya_Hermes-2.5-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|arc:challenge|25_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|gsm8k|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hellaswag|10_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T06-23-44.665870.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T06-23-44.665870.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T06-23-44.665870.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T06-23-44.665870.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T06-23-44.665870.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["**/details_harness|winogrande|5_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-20T06-23-44.665870.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T06_23_44.665870", "path": ["results_2024-01-20T06-23-44.665870.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T06-23-44.665870.parquet"]}]}]} | 2024-01-20T06:26:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of luqmanxyz/Maya_Hermes-2.5-Mistral-7B
Dataset automatically created during the evaluation run of model luqmanxyz/Maya_Hermes-2.5-Mistral-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
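For reference, here is a minimal loading sketch (this rendered copy of the card omits the code block; the call below mirrors the loader recorded in this card's metadata, using `harness_winogrande_5` as an example configuration):

```python
from datasets import load_dataset

# Per-sample details for one evaluated task configuration of this run.
data = load_dataset(
    "open-llm-leaderboard/details_luqmanxyz__Maya_Hermes-2.5-Mistral-7B",
    "harness_winogrande_5",
    split="train",
)
```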
## Latest results
These are the latest results from run 2024-01-20T06:23:44.665870 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of luqmanxyz/Maya_Hermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model luqmanxyz/Maya_Hermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T06:23:44.665870(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of luqmanxyz/Maya_Hermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model luqmanxyz/Maya_Hermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T06:23:44.665870(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
bb7ab974627f63507c5d7c4aeecea0b03cfd29df |
# Dataset Card for Evaluation run of ConvexAI/BurningBruce-003
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ConvexAI/BurningBruce-003](https://huggingface.co/ConvexAI/BurningBruce-003) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ConvexAI__BurningBruce-003",
"harness_winogrande_5",
split="train")
```
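The same pattern works for any other configuration. As a minimal sketch using the standard `datasets` API (the configuration and split names below are the ones listed in this card), you can enumerate the available configurations and pull the aggregated `results` table:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_ConvexAI__BurningBruce-003"

# Enumerate the per-task configurations plus the aggregated "results" one.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:3])

# The "latest" split of the "results" configuration holds the most recent run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```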
## Latest results
These are the [latest results from run 2024-01-20T06:25:43.434577](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__BurningBruce-003/blob/main/results_2024-01-20T06-25-43.434577.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6508619834171414,
"acc_stderr": 0.03201792510133264,
"acc_norm": 0.6499287379859224,
"acc_norm_stderr": 0.03268846567220191,
"mc1": 0.5238678090575275,
"mc1_stderr": 0.017483547156961564,
"mc2": 0.6638891866418904,
"mc2_stderr": 0.015278150666534426
},
"harness|arc:challenge|25": {
"acc": 0.6877133105802048,
"acc_stderr": 0.013542598541688065,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.01322671905626613
},
"harness|hellaswag|10": {
"acc": 0.7093208524198367,
"acc_stderr": 0.00453147740758965,
"acc_norm": 0.882194781915953,
"acc_norm_stderr": 0.0032171849068479436
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903336,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903336
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4446927374301676,
"acc_stderr": 0.01661988198817702,
"acc_norm": 0.4446927374301676,
"acc_norm_stderr": 0.01661988198817702
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653345,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653345
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169146,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5238678090575275,
"mc1_stderr": 0.017483547156961564,
"mc2": 0.6638891866418904,
"mc2_stderr": 0.015278150666534426
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.010510336954166729
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.012333447581047534
}
}
```
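The block above is plain JSON, so it can be sliced with a few lines of Python. A minimal sketch that ranks the MMLU subtasks by normalized accuracy (`results` stands in for the dict printed above; only three entries are reproduced here for brevity):

```python
# `results` stands in for the dict printed above (three entries shown).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.34},
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.8846153846153846},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.8362573099415205},
}

# Keep only the MMLU ("hendrycksTest") tasks and strip the harness prefix/suffix.
task_scores = {
    name.split("|")[1].removeprefix("hendrycksTest-"): v["acc_norm"]
    for name, v in results.items()
    if name.startswith("harness|hendrycksTest-")
}

for task, score in sorted(task_scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task:20s} {score:.3f}")
```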
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ConvexAI__BurningBruce-003 | [
"region:us"
] | 2024-01-20T06:28:03+00:00 | {"pretty_name": "Evaluation run of ConvexAI/BurningBruce-003", "dataset_summary": "Dataset automatically created during the evaluation run of model [ConvexAI/BurningBruce-003](https://huggingface.co/ConvexAI/BurningBruce-003) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__BurningBruce-003\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T06:25:43.434577](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__BurningBruce-003/blob/main/results_2024-01-20T06-25-43.434577.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6508619834171414,\n \"acc_stderr\": 0.03201792510133264,\n \"acc_norm\": 0.6499287379859224,\n \"acc_norm_stderr\": 0.03268846567220191,\n \"mc1\": 0.5238678090575275,\n \"mc1_stderr\": 0.017483547156961564,\n \"mc2\": 0.6638891866418904,\n \"mc2_stderr\": 0.015278150666534426\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6877133105802048,\n \"acc_stderr\": 0.013542598541688065,\n \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.01322671905626613\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7093208524198367,\n \"acc_stderr\": 0.00453147740758965,\n \"acc_norm\": 0.882194781915953,\n \"acc_norm_stderr\": 0.0032171849068479436\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n 
\"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903336,\n \"acc_norm\": 
0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903336\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 0.4446927374301676,\n \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653345,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653345\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5238678090575275,\n \"mc1_stderr\": 0.017483547156961564,\n \"mc2\": 0.6638891866418904,\n \"mc2_stderr\": 0.015278150666534426\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.010510336954166729\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \"acc_stderr\": 0.012333447581047534\n }\n}\n```", "repo_url": "https://huggingface.co/ConvexAI/BurningBruce-003", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|arc:challenge|25_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|gsm8k|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hellaswag|10_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T06-25-43.434577.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T06-25-43.434577.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T06-25-43.434577.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T06-25-43.434577.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T06-25-43.434577.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T06-25-43.434577.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["**/details_harness|winogrande|5_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T06-25-43.434577.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T06_25_43.434577", "path": ["results_2024-01-20T06-25-43.434577.parquet"]}, {"split": "latest", "path": 
["results_2024-01-20T06-25-43.434577.parquet"]}]}]} | 2024-01-20T06:28:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ConvexAI/BurningBruce-003
Dataset automatically created during the evaluation run of model ConvexAI/BurningBruce-003 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
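A minimal sketch, assuming this repo follows the leaderboard's usual `details_<org>__<model>` naming convention:
```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the 63 task configurations;
# the "train" split always aliases the latest run.
data = load_dataset("open-llm-leaderboard/details_ConvexAI__BurningBruce-003",
	"harness_winogrande_5",
	split="train")
```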
## Latest results
These are the latest results from run 2024-01-20T06:25:43.434577 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ConvexAI/BurningBruce-003\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/BurningBruce-003 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T06:25:43.434577(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ConvexAI/BurningBruce-003\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/BurningBruce-003 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T06:25:43.434577(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b490216ad5c9bc645860454395319a39d24aff61 |
# Dataset Card for Evaluation run of haoranxu/ALMA-13B-R
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [haoranxu/ALMA-13B-R](https://huggingface.co/haoranxu/ALMA-13B-R) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_haoranxu__ALMA-13B-R",
"harness_winogrande_5",
split="train")
```
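Each run is stored as a timestamp-named split, with `latest` aliasing the most recent one, so you can also list the available splits or pull the aggregated `results` configuration directly. A minimal sketch (the split names below match those declared in this repo's configs):
```python
from datasets import get_dataset_split_names, load_dataset

repo = "open-llm-leaderboard/details_haoranxu__ALMA-13B-R"

# List the timestamped splits available for one task configuration.
print(get_dataset_split_names(repo, "harness_winogrande_5"))
# e.g. ['2024_01_20T07_24_09.655926', 'latest']

# Load the aggregated per-run metrics from the "results" configuration.
results = load_dataset(repo, "results", split="latest")
```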
## Latest results
These are the [latest results from run 2024-01-20T07:24:09.655926](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-13B-R/blob/main/results_2024-01-20T07-24-09.655926.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4935056783282025,
"acc_stderr": 0.03402030256976682,
"acc_norm": 0.5012713038885745,
"acc_norm_stderr": 0.03495235233822466,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299962,
"mc2": 0.36085639370497274,
"mc2_stderr": 0.013958476205822561
},
"harness|arc:challenge|25": {
"acc": 0.5290102389078498,
"acc_stderr": 0.014586776355294328,
"acc_norm": 0.5554607508532423,
"acc_norm_stderr": 0.01452122640562708
},
"harness|hellaswag|10": {
"acc": 0.5967934674367655,
"acc_stderr": 0.004895390341445622,
"acc_norm": 0.7944632543318064,
"acc_norm_stderr": 0.00403267443344754
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5207547169811321,
"acc_stderr": 0.030746349975723463,
"acc_norm": 0.5207547169811321,
"acc_norm_stderr": 0.030746349975723463
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364397,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364397
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400352,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400352
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376907,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.038932596106046734,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.038932596106046734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5516129032258065,
"acc_stderr": 0.028292056830112735,
"acc_norm": 0.5516129032258065,
"acc_norm_stderr": 0.028292056830112735
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.03793713171165635,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.03793713171165635
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5808080808080808,
"acc_stderr": 0.03515520728670417,
"acc_norm": 0.5808080808080808,
"acc_norm_stderr": 0.03515520728670417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.689119170984456,
"acc_stderr": 0.033403619062765864,
"acc_norm": 0.689119170984456,
"acc_norm_stderr": 0.033403619062765864
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.03479185572599661,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.03479185572599661
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6385321100917432,
"acc_stderr": 0.020598082009937378,
"acc_norm": 0.6385321100917432,
"acc_norm_stderr": 0.020598082009937378
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.03465868196380762,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.03465868196380762
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.03145068600744859,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.03145068600744859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041019,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041019
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112722,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.04777615181156739,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.04777615181156739
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.02948036054954119,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.02948036054954119
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6717752234993615,
"acc_stderr": 0.016791685640192892,
"acc_norm": 0.6717752234993615,
"acc_norm_stderr": 0.016791685640192892
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5635838150289018,
"acc_stderr": 0.026700545424943673,
"acc_norm": 0.5635838150289018,
"acc_norm_stderr": 0.026700545424943673
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.015366860386397108,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.015366860386397108
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.02852638345214264,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.02852638345214264
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5819935691318328,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.5819935691318328,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5709876543209876,
"acc_stderr": 0.027538925613470863,
"acc_norm": 0.5709876543209876,
"acc_norm_stderr": 0.027538925613470863
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281285,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281285
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3670143415906128,
"acc_stderr": 0.012310264244842124,
"acc_norm": 0.3670143415906128,
"acc_norm_stderr": 0.012310264244842124
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4522058823529412,
"acc_stderr": 0.030233758551596455,
"acc_norm": 0.4522058823529412,
"acc_norm_stderr": 0.030233758551596455
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.511437908496732,
"acc_stderr": 0.020222541515610863,
"acc_norm": 0.511437908496732,
"acc_norm_stderr": 0.020222541515610863
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5224489795918368,
"acc_stderr": 0.031976941187136725,
"acc_norm": 0.5224489795918368,
"acc_norm_stderr": 0.031976941187136725
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.03851597683718534,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.03851597683718534
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299962,
"mc2": 0.36085639370497274,
"mc2_stderr": 0.013958476205822561
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.01212140294285557
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
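If you would rather work with these per-task numbers as a table than as nested JSON, a small sketch like the following flattens them (here `results` is assumed to already hold the dictionary above, e.g. parsed with `json.loads`):
```python
import pandas as pd

# `results` is assumed to be the dictionary shown above.
rows = [
    {"task": task, **metrics}
    for task, metrics in results.items()
    if task != "all"  # "all" holds the aggregate, not a single task
]
df = pd.DataFrame(rows).set_index("task")
print(df[["acc", "acc_norm"]].sort_values("acc", ascending=False).head())
```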
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_haoranxu__ALMA-13B-R | [
"region:us"
] | 2024-01-20T07:26:28+00:00 | {"pretty_name": "Evaluation run of haoranxu/ALMA-13B-R", "dataset_summary": "Dataset automatically created during the evaluation run of model [haoranxu/ALMA-13B-R](https://huggingface.co/haoranxu/ALMA-13B-R) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_haoranxu__ALMA-13B-R\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T07:24:09.655926](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-13B-R/blob/main/results_2024-01-20T07-24-09.655926.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4935056783282025,\n \"acc_stderr\": 0.03402030256976682,\n \"acc_norm\": 0.5012713038885745,\n \"acc_norm_stderr\": 0.03495235233822466,\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299962,\n \"mc2\": 0.36085639370497274,\n \"mc2_stderr\": 0.013958476205822561\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5290102389078498,\n \"acc_stderr\": 0.014586776355294328,\n \"acc_norm\": 0.5554607508532423,\n \"acc_norm_stderr\": 0.01452122640562708\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5967934674367655,\n \"acc_stderr\": 0.004895390341445622,\n \"acc_norm\": 0.7944632543318064,\n \"acc_norm_stderr\": 0.00403267443344754\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5207547169811321,\n \"acc_stderr\": 0.030746349975723463,\n \"acc_norm\": 0.5207547169811321,\n \"acc_norm_stderr\": 0.030746349975723463\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 
0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364397,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364397\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400352,\n \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400352\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.328042328042328,\n \"acc_stderr\": 0.024180497164376907,\n \"acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.038932596106046734,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.038932596106046734\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5516129032258065,\n \"acc_stderr\": 0.028292056830112735,\n \"acc_norm\": 0.5516129032258065,\n \"acc_norm_stderr\": 0.028292056830112735\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165635,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165635\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5808080808080808,\n \"acc_stderr\": 0.03515520728670417,\n \"acc_norm\": 0.5808080808080808,\n \"acc_norm_stderr\": 0.03515520728670417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.689119170984456,\n \"acc_stderr\": 0.033403619062765864,\n \"acc_norm\": 0.689119170984456,\n \"acc_norm_stderr\": 0.033403619062765864\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.44871794871794873,\n 
\"acc_stderr\": 0.025217315184846482,\n \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846482\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115006,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115006\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23841059602649006,\n \"acc_stderr\": 0.03479185572599661,\n \"acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.03479185572599661\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6385321100917432,\n \"acc_stderr\": 0.020598082009937378,\n \"acc_norm\": 0.6385321100917432,\n \"acc_norm_stderr\": 0.020598082009937378\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.03465868196380762,\n \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.03465868196380762\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6286919831223629,\n \"acc_stderr\": 0.03145068600744859,\n \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.03145068600744859\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041019,\n \"acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041019\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112722,\n \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112722\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.04777615181156739,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.04777615181156739\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.717948717948718,\n \"acc_stderr\": 0.02948036054954119,\n \"acc_norm\": 0.717948717948718,\n \"acc_norm_stderr\": 0.02948036054954119\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6717752234993615,\n \"acc_stderr\": 0.016791685640192892,\n \"acc_norm\": 0.6717752234993615,\n 
\"acc_norm_stderr\": 0.016791685640192892\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5635838150289018,\n \"acc_stderr\": 0.026700545424943673,\n \"acc_norm\": 0.5635838150289018,\n \"acc_norm_stderr\": 0.026700545424943673\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n \"acc_stderr\": 0.015366860386397108,\n \"acc_norm\": 0.3027932960893855,\n \"acc_norm_stderr\": 0.015366860386397108\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.02852638345214264,\n \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.02852638345214264\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.5819935691318328,\n \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281285,\n \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281285\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3670143415906128,\n \"acc_stderr\": 0.012310264244842124,\n \"acc_norm\": 0.3670143415906128,\n \"acc_norm_stderr\": 0.012310264244842124\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.030233758551596455,\n \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.030233758551596455\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.511437908496732,\n \"acc_stderr\": 0.020222541515610863,\n \"acc_norm\": 0.511437908496732,\n \"acc_norm_stderr\": 0.020222541515610863\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5224489795918368,\n \"acc_stderr\": 0.031976941187136725,\n \"acc_norm\": 0.5224489795918368,\n \"acc_norm_stderr\": 0.031976941187136725\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.03851597683718534,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.03851597683718534\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299962,\n \"mc2\": 0.36085639370497274,\n \"mc2_stderr\": 0.013958476205822561\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.01212140294285557\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/haoranxu/ALMA-13B-R", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|arc:challenge|25_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|gsm8k|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hellaswag|10_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-24-09.655926.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-24-09.655926.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-24-09.655926.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T07-24-09.655926.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-24-09.655926.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-24-09.655926.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["**/details_harness|winogrande|5_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T07-24-09.655926.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T07_24_09.655926", "path": ["results_2024-01-20T07-24-09.655926.parquet"]}, {"split": "latest", "path": 
["results_2024-01-20T07-24-09.655926.parquet"]}]}]} | 2024-01-20T07:26:48+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of haoranxu/ALMA-13B-R
Dataset automatically created during the evaluation run of model haoranxu/ALMA-13B-R on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
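A minimal sketch, following the loading convention used by the other cards in this collection (the repository id below is inferred from the card title and is worth verifying on the Hub):

```python
from datasets import load_dataset

# Load one per-task configuration; "train" points at the latest run.
data = load_dataset("open-llm-leaderboard/details_haoranxu__ALMA-13B-R",
	"harness_winogrande_5",
	split="train")
```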
## Latest results
These are the latest results from run 2024-01-20T07:24:09.655926 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
7901df610ae63faf9e1cd73ce3f7b61674bc00a9 |
# Dataset Card for Evaluation run of BlueNipples/SnowLotus-v2-10.7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BlueNipples/SnowLotus-v2-10.7B](https://huggingface.co/BlueNipples/SnowLotus-v2-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BlueNipples__SnowLotus-v2-10.7B",
"harness_winogrande_5",
split="train")
```
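Each per-task configuration exposes two splits: one named after the run timestamp (for this card, `2024_01_20T07_51_08.326530`) and a `latest` alias that always points at the most recent run.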
## Latest results
These are the [latest results from run 2024-01-20T07:51:08.326530](https://huggingface.co/datasets/open-llm-leaderboard/details_BlueNipples__SnowLotus-v2-10.7B/blob/main/results_2024-01-20T07-51-08.326530.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6414098845053054,
"acc_stderr": 0.03211274251704057,
"acc_norm": 0.6446775451538039,
"acc_norm_stderr": 0.03276105557389541,
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626608,
"mc2": 0.455391269193204,
"mc2_stderr": 0.015276610420265695
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.014224250973257184,
"acc_norm": 0.6476109215017065,
"acc_norm_stderr": 0.013960142600598684
},
"harness|hellaswag|10": {
"acc": 0.6677952599083847,
"acc_stderr": 0.004700413824942559,
"acc_norm": 0.8528181637124079,
"acc_norm_stderr": 0.0035356302890914675
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382182,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382182
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026552207828215282,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026552207828215282
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131133,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131133
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266847,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266847
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503228,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503228
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291947,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291947
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990905,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990905
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899133,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32625698324022345,
"acc_stderr": 0.01568044151888918,
"acc_norm": 0.32625698324022345,
"acc_norm_stderr": 0.01568044151888918
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.02573885479781873,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.02573885479781873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4810951760104302,
"acc_stderr": 0.012761104871472657,
"acc_norm": 0.4810951760104302,
"acc_norm_stderr": 0.012761104871472657
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.01874501120127766,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.01874501120127766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.03878626771002361,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.03878626771002361
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626608,
"mc2": 0.455391269193204,
"mc2_stderr": 0.015276610420265695
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047986
},
"harness|gsm8k|5": {
"acc": 0.4874905231235785,
"acc_stderr": 0.013768173615087857
}
}
```
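The same aggregated numbers can also be pulled programmatically from the "results" configuration. A minimal sketch (the column layout of the results parquet is not documented on this card, so the snippet only inspects which fields are present):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics for each run;
# the "latest" split is an alias for the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_BlueNipples__SnowLotus-v2-10.7B",
    "results",
    split="latest",
)
print(sorted(results[0].keys()))  # inspect the fields before relying on a layout
```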
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
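One way to get a feel for the per-task layout without reading the raw YAML is to list the configurations directly; a small sketch, assuming network access to the Hugging Face Hub:

```python
from datasets import get_dataset_config_names

# The card above describes one configuration per evaluated task,
# plus the aggregated "results" configuration.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_BlueNipples__SnowLotus-v2-10.7B"
)
print(len(configs))
print(configs[:5])
```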
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BlueNipples__SnowLotus-v2-10.7B | [
"region:us"
] | 2024-01-20T07:53:21+00:00 | {"pretty_name": "Evaluation run of BlueNipples/SnowLotus-v2-10.7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [BlueNipples/SnowLotus-v2-10.7B](https://huggingface.co/BlueNipples/SnowLotus-v2-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BlueNipples__SnowLotus-v2-10.7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T07:51:08.326530](https://huggingface.co/datasets/open-llm-leaderboard/details_BlueNipples__SnowLotus-v2-10.7B/blob/main/results_2024-01-20T07-51-08.326530.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6414098845053054,\n \"acc_stderr\": 0.03211274251704057,\n \"acc_norm\": 0.6446775451538039,\n \"acc_norm_stderr\": 0.03276105557389541,\n \"mc1\": 0.3390452876376989,\n \"mc1_stderr\": 0.016571797910626608,\n \"mc2\": 0.455391269193204,\n \"mc2_stderr\": 0.015276610420265695\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6143344709897611,\n \"acc_stderr\": 0.014224250973257184,\n \"acc_norm\": 0.6476109215017065,\n \"acc_norm_stderr\": 0.013960142600598684\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6677952599083847,\n \"acc_stderr\": 0.004700413824942559,\n \"acc_norm\": 0.8528181637124079,\n \"acc_norm_stderr\": 0.0035356302890914675\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382182,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382182\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215282,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215282\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n 
\"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131133,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131133\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.031357095996135904,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.031357095996135904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266847,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266847\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503228,\n \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503228\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.020237149008990905,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.020237149008990905\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n \"acc_stderr\": 0.013964393769899133,\n \"acc_norm\": 0.8122605363984674,\n \"acc_norm_stderr\": 0.013964393769899133\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545546,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545546\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32625698324022345,\n \"acc_stderr\": 0.01568044151888918,\n \"acc_norm\": 0.32625698324022345,\n \"acc_norm_stderr\": 0.01568044151888918\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781873,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4810951760104302,\n \"acc_stderr\": 0.012761104871472657,\n \"acc_norm\": 0.4810951760104302,\n \"acc_norm_stderr\": 0.012761104871472657\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6879084967320261,\n \"acc_stderr\": 0.01874501120127766,\n \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.01874501120127766\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3390452876376989,\n \"mc1_stderr\": 0.016571797910626608,\n \"mc2\": 0.455391269193204,\n \"mc2_stderr\": 0.015276610420265695\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047986\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.4874905231235785,\n \"acc_stderr\": 0.013768173615087857\n }\n}\n```", "repo_url": "https://huggingface.co/BlueNipples/SnowLotus-v2-10.7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|arc:challenge|25_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|gsm8k|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hellaswag|10_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-51-08.326530.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-51-08.326530.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-51-08.326530.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T07-51-08.326530.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-51-08.326530.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["**/details_harness|winogrande|5_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-20T07-51-08.326530.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T07_51_08.326530", "path": ["results_2024-01-20T07-51-08.326530.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T07-51-08.326530.parquet"]}]}]} | 2024-01-20T07:54:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BlueNipples/SnowLotus-v2-10.7B
Dataset automatically created during the evaluation run of model BlueNipples/SnowLotus-v2-10.7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-20T07:51:08.326530 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BlueNipples/SnowLotus-v2-10.7B\n\n\n\nDataset automatically created during the evaluation run of model BlueNipples/SnowLotus-v2-10.7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T07:51:08.326530(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BlueNipples/SnowLotus-v2-10.7B\n\n\n\nDataset automatically created during the evaluation run of model BlueNipples/SnowLotus-v2-10.7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T07:51:08.326530(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3eb208ccb340582c942c648fa6dbba4d915406ac |
# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO_2e
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yunconglong/7Bx4_DPO_2e](https://huggingface.co/yunconglong/7Bx4_DPO_2e) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yunconglong__7Bx4_DPO_2e",
"harness_winogrande_5",
split="train")
```
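As a usage note, the aggregated metrics mentioned above can be loaded the same way by pointing at the "results" configuration. The snippet below is a minimal sketch, assuming the standard `datasets` API and the config/split names listed in this card:

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated results of the run.
# "results" is the extra configuration described above; the "latest"
# split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_yunconglong__7Bx4_DPO_2e",
                       "results",
                       split="latest")
print(results)
```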
## Latest results
These are the [latest results from run 2024-01-20T07:53:49.728301](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__7Bx4_DPO_2e/blob/main/results_2024-01-20T07-53-49.728301.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6497382445744926,
"acc_stderr": 0.032094653254152825,
"acc_norm": 0.6495590967657305,
"acc_norm_stderr": 0.03275779052761359,
"mc1": 0.5030599755201959,
"mc1_stderr": 0.01750317326096063,
"mc2": 0.6560145251092138,
"mc2_stderr": 0.014910139553633708
},
"harness|arc:challenge|25": {
"acc": 0.6672354948805461,
"acc_stderr": 0.013769863046192309,
"acc_norm": 0.689419795221843,
"acc_norm_stderr": 0.013522292098053067
},
"harness|hellaswag|10": {
"acc": 0.6793467436765585,
"acc_stderr": 0.004657738398900936,
"acc_norm": 0.8679545907189803,
"acc_norm_stderr": 0.003378482488748873
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924003,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121434,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121434
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335075,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335075
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977938,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977938
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.01509421569970048,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.01509421569970048
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42905027932960893,
"acc_stderr": 0.01655328786311604,
"acc_norm": 0.42905027932960893,
"acc_norm_stderr": 0.01655328786311604
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826528,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826528
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042117,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.01272570165695364,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.01272570165695364
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.01916241858862356,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.01916241858862356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.02752963744017493,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.02752963744017493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482708,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482708
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5030599755201959,
"mc1_stderr": 0.01750317326096063,
"mc2": 0.6560145251092138,
"mc2_stderr": 0.014910139553633708
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491894
},
"harness|gsm8k|5": {
"acc": 0.7134192570128886,
"acc_stderr": 0.012454841668337695
}
}
```
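For quick inspection, the per-task numbers above can be post-processed with plain Python. The sketch below assumes the JSON block above has been saved locally as `results.json` (a hypothetical filename) and averages the accuracy over the `hendrycksTest` (MMLU) subtasks:

```python
import json

# Assumption: the results JSON shown above was saved as results.json.
with open("results.json") as f:
    metrics = json.load(f)

# Collect per-subject accuracy for every hendrycksTest (MMLU) subtask.
mmlu_acc = [v["acc"] for k, v in metrics.items()
            if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_acc)} MMLU subtasks, mean acc = {sum(mmlu_acc)/len(mmlu_acc):.4f}")
```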
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_yunconglong__7Bx4_DPO_2e | [
"region:us"
] | 2024-01-20T07:56:04+00:00 | {"pretty_name": "Evaluation run of yunconglong/7Bx4_DPO_2e", "dataset_summary": "Dataset automatically created during the evaluation run of model [yunconglong/7Bx4_DPO_2e](https://huggingface.co/yunconglong/7Bx4_DPO_2e) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__7Bx4_DPO_2e\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T07:53:49.728301](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__7Bx4_DPO_2e/blob/main/results_2024-01-20T07-53-49.728301.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6497382445744926,\n \"acc_stderr\": 0.032094653254152825,\n \"acc_norm\": 0.6495590967657305,\n \"acc_norm_stderr\": 0.03275779052761359,\n \"mc1\": 0.5030599755201959,\n \"mc1_stderr\": 0.01750317326096063,\n \"mc2\": 0.6560145251092138,\n \"mc2_stderr\": 0.014910139553633708\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6672354948805461,\n \"acc_stderr\": 0.013769863046192309,\n \"acc_norm\": 0.689419795221843,\n \"acc_norm_stderr\": 0.013522292098053067\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6793467436765585,\n \"acc_stderr\": 0.004657738398900936,\n \"acc_norm\": 0.8679545907189803,\n \"acc_norm_stderr\": 0.003378482488748873\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n 
\"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924003,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924003\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121434,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121434\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335075,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335075\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977938,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977938\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n 
\"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42905027932960893,\n \"acc_stderr\": 0.01655328786311604,\n \"acc_norm\": 0.42905027932960893,\n \"acc_norm_stderr\": 0.01655328786311604\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826528,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826528\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042117,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042117\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n \"acc_stderr\": 0.01272570165695364,\n \"acc_norm\": 0.45827900912646674,\n \"acc_norm_stderr\": 0.01272570165695364\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.01916241858862356,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.01916241858862356\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017493,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482708,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482708\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5030599755201959,\n \"mc1_stderr\": 0.01750317326096063,\n \"mc2\": 0.6560145251092138,\n \"mc2_stderr\": 0.014910139553633708\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491894\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \"acc_stderr\": 0.012454841668337695\n }\n}\n```", "repo_url": 
"https://huggingface.co/yunconglong/7Bx4_DPO_2e", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|arc:challenge|25_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|gsm8k|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hellaswag|10_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-53-49.728301.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-53-49.728301.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-53-49.728301.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T07-53-49.728301.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-53-49.728301.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T07_53_49.728301", "path": ["**/details_harness|winogrande|5_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T07-53-49.728301.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_20T07_53_49.728301", "path": ["results_2024-01-20T07-53-49.728301.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T07-53-49.728301.parquet"]}]}]} | 2024-01-20T07:56:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO_2e
Dataset automatically created during the evaluation run of model yunconglong/7Bx4_DPO_2e on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
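A minimal sketch; the repository id below assumes the leaderboard's usual `details_{org}__{model}` naming for this run's dataset:

```python
from datasets import load_dataset

# Per-task details for one eval; the "train" split always points at the latest results.
data = load_dataset("open-llm-leaderboard/details_yunconglong__7Bx4_DPO_2e",
	"harness_winogrande_5",
	split="train")
```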
## Latest results
These are the latest results from run 2024-01-20T07:53:49.728301 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO_2e\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/7Bx4_DPO_2e on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T07:53:49.728301(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO_2e\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/7Bx4_DPO_2e on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T07:53:49.728301(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
4cc2aa09c7d7a722098baa4b42471c8e8cfe142b |
This repository contains validation datasets for use with the `perplexity` tool from the `llama.cpp` project.
**Note:** [PR #5047](https://github.com/ggerganov/llama.cpp/pull/5047) is required to use these datasets.
The simple program in `demo.cpp` shows how to read these files and can be used to combine two files into one.
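A hypothetical invocation, mirroring the `convert.cpp` example below; the argument order (two inputs, then the combined output) is an assumption, so check the source for the exact interface:

```
g++ -o demo demo.cpp
# assumed usage: combine two validation files into one
./demo first-validation.bin second-validation.bin combined-validation.bin
```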
The simple program in `convert.cpp` shows how to convert the data to JSON. For instance:
```
g++ -o convert convert.cpp
./convert arc-easy-validation.bin arc-easy-validation.json
``` | ikawrakow/validation-datasets-for-llama.cpp | [
"license:apache-2.0",
"region:us"
] | 2024-01-20T08:18:44+00:00 | {"license": "apache-2.0"} | 2024-01-22T07:16:18+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
This repository contains validation datasets for use with the 'perplexity' tool from the 'llama.cpp' project.
Note: PR #5047 is required to use these datasets.
The simple program in 'demo.cpp' shows how to read these files and can be used to combine two files into one.
The simple program in 'convert.cpp' shows how to convert the data to JSON. For instance:
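```
g++ -o convert convert.cpp
./convert arc-easy-validation.bin arc-easy-validation.json
```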
| [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] |
18b700e6db0467592e88b6ae1bc7807e4b3ef379 |
# Dataset Card for Evaluation run of adamo1139/Yi-34B-200K-AEZAKMI-RAW-1701
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [adamo1139/Yi-34B-200K-AEZAKMI-RAW-1701](https://huggingface.co/adamo1139/Yi-34B-200K-AEZAKMI-RAW-1701) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-1701",
"harness_winogrande_5",
split="train")
```
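To pull the aggregated numbers rather than the per-task details, the same call can target the "results" configuration; the `latest` split name below follows the convention these run datasets publish, so treat it as an assumption:

```python
from datasets import load_dataset

# Aggregated metrics for the whole run; "latest" tracks the most recent eval.
results = load_dataset("open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-1701",
	"results",
	split="latest")
```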
## Latest results
These are the [latest results from run 2024-01-20T08:16:53.132202](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-1701/blob/main/results_2024-01-20T08-16-53.132202.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7493773199796548,
"acc_stderr": 0.02868566782690787,
"acc_norm": 0.7547145421066727,
"acc_norm_stderr": 0.029214766427902314,
"mc1": 0.423500611995104,
"mc1_stderr": 0.017297421448534727,
"mc2": 0.5790757070039864,
"mc2_stderr": 0.015548026674401659
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175449,
"acc_norm": 0.6680887372013652,
"acc_norm_stderr": 0.013760988200880531
},
"harness|hellaswag|10": {
"acc": 0.6675960963951404,
"acc_stderr": 0.00470112142180544,
"acc_norm": 0.8578968333001394,
"acc_norm_stderr": 0.0034844234420926675
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.028631951845930387,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.028631951845930387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8113207547169812,
"acc_stderr": 0.02407999513006226,
"acc_norm": 0.8113207547169812,
"acc_norm_stderr": 0.02407999513006226
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.034355680560478746,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.034355680560478746
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.04971358884367405,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.04971358884367405
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7574468085106383,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.7574468085106383,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7103448275862069,
"acc_stderr": 0.037800192304380135,
"acc_norm": 0.7103448275862069,
"acc_norm_stderr": 0.037800192304380135
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6693121693121693,
"acc_stderr": 0.024229965298425086,
"acc_norm": 0.6693121693121693,
"acc_norm_stderr": 0.024229965298425086
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.017308381281034523,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.017308381281034523
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6699507389162561,
"acc_stderr": 0.03308530426228257,
"acc_norm": 0.6699507389162561,
"acc_norm_stderr": 0.03308530426228257
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781657,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781657
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.01996022556317289,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.01996022556317289
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.013492659751295127,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.013492659751295127
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7923076923076923,
"acc_stderr": 0.020567539567246797,
"acc_norm": 0.7923076923076923,
"acc_norm_stderr": 0.020567539567246797
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02995824925008212,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02995824925008212
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8403361344537815,
"acc_stderr": 0.0237933539975288,
"acc_norm": 0.8403361344537815,
"acc_norm_stderr": 0.0237933539975288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769584,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769584
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.03256850570293647,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.03256850570293647
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.020343400734868847,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.020343400734868847
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.031545216720054725,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.031545216720054725
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.0309227883204458,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.0309227883204458
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553848,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553848
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5803571428571429,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.5803571428571429,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808629,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808629
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253878,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253878
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9016602809706258,
"acc_stderr": 0.010648356301876341,
"acc_norm": 0.9016602809706258,
"acc_norm_stderr": 0.010648356301876341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.838150289017341,
"acc_stderr": 0.019829299214925416,
"acc_norm": 0.838150289017341,
"acc_norm_stderr": 0.019829299214925416
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7072625698324022,
"acc_stderr": 0.015218109544410184,
"acc_norm": 0.7072625698324022,
"acc_norm_stderr": 0.015218109544410184
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.020823758837580905,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.020823758837580905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8395061728395061,
"acc_stderr": 0.020423955354778034,
"acc_norm": 0.8395061728395061,
"acc_norm_stderr": 0.020423955354778034
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6418439716312057,
"acc_stderr": 0.028602085862759422,
"acc_norm": 0.6418439716312057,
"acc_norm_stderr": 0.028602085862759422
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5736636245110821,
"acc_stderr": 0.012630884771599689,
"acc_norm": 0.5736636245110821,
"acc_norm_stderr": 0.012630884771599689
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8161764705882353,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.8161764705882353,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.01575052628436335,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.01575052628436335
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8448979591836735,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.8448979591836735,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.023537557657892554,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.023537557657892554
},
"harness|truthfulqa:mc|0": {
"mc1": 0.423500611995104,
"mc1_stderr": 0.017297421448534727,
"mc2": 0.5790757070039864,
"mc2_stderr": 0.015548026674401659
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569567
},
"harness|gsm8k|5": {
"acc": 0.599696739954511,
"acc_stderr": 0.013495926436566438
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-1701 | [
"region:us"
] | 2024-01-20T08:19:04+00:00 | {"pretty_name": "Evaluation run of adamo1139/Yi-34B-200K-AEZAKMI-RAW-1701", "dataset_summary": "Dataset automatically created during the evaluation run of model [adamo1139/Yi-34B-200K-AEZAKMI-RAW-1701](https://huggingface.co/adamo1139/Yi-34B-200K-AEZAKMI-RAW-1701) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-1701\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T08:16:53.132202](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-1701/blob/main/results_2024-01-20T08-16-53.132202.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7493773199796548,\n \"acc_stderr\": 0.02868566782690787,\n \"acc_norm\": 0.7547145421066727,\n \"acc_norm_stderr\": 0.029214766427902314,\n \"mc1\": 0.423500611995104,\n \"mc1_stderr\": 0.017297421448534727,\n \"mc2\": 0.5790757070039864,\n \"mc2_stderr\": 0.015548026674401659\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175449,\n \"acc_norm\": 0.6680887372013652,\n \"acc_norm_stderr\": 0.013760988200880531\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6675960963951404,\n \"acc_stderr\": 0.00470112142180544,\n \"acc_norm\": 0.8578968333001394,\n \"acc_norm_stderr\": 0.0034844234420926675\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.028631951845930387,\n \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.028631951845930387\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8113207547169812,\n \"acc_stderr\": 0.02407999513006226,\n \"acc_norm\": 0.8113207547169812,\n \"acc_norm_stderr\": 0.02407999513006226\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.034355680560478746,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.034355680560478746\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367405,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367405\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7574468085106383,\n \"acc_stderr\": 0.028020226271200217,\n \"acc_norm\": 0.7574468085106383,\n \"acc_norm_stderr\": 0.028020226271200217\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5614035087719298,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.5614035087719298,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7103448275862069,\n \"acc_stderr\": 0.037800192304380135,\n \"acc_norm\": 0.7103448275862069,\n \"acc_norm_stderr\": 0.037800192304380135\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6693121693121693,\n \"acc_stderr\": 0.024229965298425086,\n \"acc_norm\": 0.6693121693121693,\n \"acc_norm_stderr\": 0.024229965298425086\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n \"acc_stderr\": 0.017308381281034523,\n \"acc_norm\": 0.896774193548387,\n \"acc_norm_stderr\": 0.017308381281034523\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6699507389162561,\n \"acc_stderr\": 0.03308530426228257,\n \"acc_norm\": 0.6699507389162561,\n \"acc_norm_stderr\": 0.03308530426228257\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781657,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781657\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9141414141414141,\n \"acc_stderr\": 0.01996022556317289,\n \"acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.01996022556317289\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295127,\n \"acc_norm\": 0.9637305699481865,\n 
\"acc_norm_stderr\": 0.013492659751295127\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.020567539567246797,\n \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.020567539567246797\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02995824925008212,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02995824925008212\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8403361344537815,\n \"acc_stderr\": 0.0237933539975288,\n \"acc_norm\": 0.8403361344537815,\n \"acc_norm_stderr\": 0.0237933539975288\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769584,\n \"acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769584\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.03256850570293647,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.03256850570293647\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.890295358649789,\n \"acc_stderr\": 0.020343400734868847,\n \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868847\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.031545216720054725,\n \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.031545216720054725\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.0309227883204458,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.0309227883204458\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553848,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553848\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5803571428571429,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.5803571428571429,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808629,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808629\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253878,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253878\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9016602809706258,\n \"acc_stderr\": 0.010648356301876341,\n \"acc_norm\": 0.9016602809706258,\n \"acc_norm_stderr\": 0.010648356301876341\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.838150289017341,\n \"acc_stderr\": 0.019829299214925416,\n \"acc_norm\": 0.838150289017341,\n \"acc_norm_stderr\": 0.019829299214925416\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7072625698324022,\n \"acc_stderr\": 0.015218109544410184,\n \"acc_norm\": 0.7072625698324022,\n \"acc_norm_stderr\": 0.015218109544410184\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.020823758837580905,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.020823758837580905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8395061728395061,\n \"acc_stderr\": 0.020423955354778034,\n \"acc_norm\": 0.8395061728395061,\n \"acc_norm_stderr\": 0.020423955354778034\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6418439716312057,\n \"acc_stderr\": 0.028602085862759422,\n \"acc_norm\": 0.6418439716312057,\n \"acc_norm_stderr\": 0.028602085862759422\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5736636245110821,\n \"acc_stderr\": 0.012630884771599689,\n \"acc_norm\": 0.5736636245110821,\n \"acc_norm_stderr\": 0.012630884771599689\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8161764705882353,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.8161764705882353,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.01575052628436335,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.01575052628436335\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8448979591836735,\n \"acc_stderr\": 0.0231747988612186,\n \"acc_norm\": 0.8448979591836735,\n \"acc_norm_stderr\": 0.0231747988612186\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.023537557657892554,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.023537557657892554\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.423500611995104,\n \"mc1_stderr\": 0.017297421448534727,\n \"mc2\": 0.5790757070039864,\n \"mc2_stderr\": 0.015548026674401659\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569567\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.599696739954511,\n \"acc_stderr\": 0.013495926436566438\n }\n}\n```", "repo_url": "https://huggingface.co/adamo1139/Yi-34B-200K-AEZAKMI-RAW-1701", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|arc:challenge|25_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|gsm8k|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hellaswag|10_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T08-16-53.132202.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T08-16-53.132202.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T08-16-53.132202.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T08-16-53.132202.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T08-16-53.132202.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["**/details_harness|winogrande|5_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-20T08-16-53.132202.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T08_16_53.132202", "path": ["results_2024-01-20T08-16-53.132202.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T08-16-53.132202.parquet"]}]}]} | 2024-01-20T08:19:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of adamo1139/Yi-34B-200K-AEZAKMI-RAW-1701
Dataset automatically created during the evaluation run of model adamo1139/Yi-34B-200K-AEZAKMI-RAW-1701 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
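For instance, the snippet below reads the Winogrande details; the repository id and the `harness_winogrande_5` config name are taken from this card's metadata:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-1701",
	"harness_winogrande_5",
	split="train")
```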
## Latest results
These are the latest results from run 2024-01-20T08:16:53.132202 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of adamo1139/Yi-34B-200K-AEZAKMI-RAW-1701\n\n\n\nDataset automatically created during the evaluation run of model adamo1139/Yi-34B-200K-AEZAKMI-RAW-1701 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T08:16:53.132202(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of adamo1139/Yi-34B-200K-AEZAKMI-RAW-1701\n\n\n\nDataset automatically created during the evaluation run of model adamo1139/Yi-34B-200K-AEZAKMI-RAW-1701 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T08:16:53.132202(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
824dd69c9a8b71c6f259fa9f88c2e5ab7f97afaa | ### GSM8K-Zero
This dataset is constructed from GSM8K to test for over-reasoning and redundant calculation in the responses of LLMs.
For more details, please refer to [our github repo](https://github.com/d223302/Over-Reasoning-of-LLMs) or our [EACL'24 paper](https://openreview.net/forum?id=Glx1e3bYU5).
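A minimal loading sketch using the `datasets` library; note that the `"train"` split name below is an assumption, so check the repository for the actual split names:

```python
from datasets import load_dataset

# Load GSM8K-Zero from the Hugging Face Hub.
# NOTE: the "train" split is assumed here; adjust it to the split names
# actually published in the dcml0714/GSM8K-Zero repository.
ds = load_dataset("dcml0714/GSM8K-Zero", split="train")
print(ds[0])  # inspect a single record
```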
| dcml0714/GSM8K-Zero | [
"region:us"
] | 2024-01-20T08:43:27+00:00 | {} | 2024-01-20T08:48:34+00:00 | [] | [] | TAGS
#region-us
| ### GSM8K-Zero
This dataset is constructed from GSM8K to test for over-reasoning and redundant calculation in the responses of LLMs.
For more details, please refer to our github repo or our EACL'24 paper.
| [
"### GSM8K-Zero\nThis dataset is constructed from GSM8K to test the over-reasoning and redundant calculation in the responses of LLMs.\nFor more details, please refer to our github repo or our EACL'24 paper."
] | [
"TAGS\n#region-us \n",
"### GSM8K-Zero\nThis dataset is constructed from GSM8K to test the over-reasoning and redundant calculation in the responses of LLMs.\nFor more details, please refer to our github repo or our EACL'24 paper."
] |
fc333e5ff44d35052ddf3226cf00716dc1711516 |
# Dataset Card for Evaluation run of AA051610/Q
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051610/Q](https://huggingface.co/AA051610/Q) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__Q",
"harness_winogrande_5",
split="train")
```
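The snippet above loads a single task configuration. To enumerate the other configurations programmatically (one per evaluated task, plus the aggregated "results" one), a minimal sketch using the standard `datasets` helper is:
```python
from datasets import get_dataset_config_names

# List every available configuration of this details repo.
configs = get_dataset_config_names("open-llm-leaderboard/details_AA051610__Q")
print(len(configs), configs[:5])
```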
## Latest results
These are the [latest results from run 2024-01-20T09:11:03.066548](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__Q/blob/main/results_2024-01-20T09-11-03.066548.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7399080967661088,
"acc_stderr": 0.02875112016204294,
"acc_norm": 0.7517342136662964,
"acc_norm_stderr": 0.02932406494014928,
"mc1": 0.412484700122399,
"mc1_stderr": 0.01723329939957122,
"mc2": 0.5935958667241532,
"mc2_stderr": 0.015329701989808613
},
"harness|arc:challenge|25": {
"acc": 0.6450511945392492,
"acc_stderr": 0.013983036904094089,
"acc_norm": 0.6697952218430034,
"acc_norm_stderr": 0.013743085603760426
},
"harness|hellaswag|10": {
"acc": 0.6638119896434973,
"acc_stderr": 0.004714386376337134,
"acc_norm": 0.8567018522206732,
"acc_norm_stderr": 0.0034966056729606905
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.762962962962963,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.762962962962963,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8486842105263158,
"acc_stderr": 0.02916263159684399,
"acc_norm": 0.8486842105263158,
"acc_norm_stderr": 0.02916263159684399
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8037735849056604,
"acc_stderr": 0.024442388131100824,
"acc_norm": 0.8037735849056604,
"acc_norm_stderr": 0.024442388131100824
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.03008574324856567,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.03008574324856567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7310344827586207,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.7310344827586207,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7513227513227513,
"acc_stderr": 0.02226181769240016,
"acc_norm": 0.7513227513227513,
"acc_norm_stderr": 0.02226181769240016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8870967741935484,
"acc_stderr": 0.01800360332586363,
"acc_norm": 0.8870967741935484,
"acc_norm_stderr": 0.01800360332586363
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5960591133004927,
"acc_stderr": 0.03452453903822033,
"acc_norm": 0.5960591133004927,
"acc_norm_stderr": 0.03452453903822033
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865414,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865414
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9040404040404041,
"acc_stderr": 0.020984808610047933,
"acc_norm": 0.9040404040404041,
"acc_norm_stderr": 0.020984808610047933
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.014385432857476442,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.014385432857476442
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8128205128205128,
"acc_stderr": 0.01977660108655004,
"acc_norm": 0.8128205128205128,
"acc_norm_stderr": 0.01977660108655004
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.03044452852881074,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.03044452852881074
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398897,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398897
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.040802441856289715,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.040802441856289715
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9119266055045872,
"acc_stderr": 0.012150743719481655,
"acc_norm": 0.9119266055045872,
"acc_norm_stderr": 0.012150743719481655
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.03236585252602158,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.03236585252602158
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884565,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884565
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.027584066602208274,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.027584066602208274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622793,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622793
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.030381596756651672,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.030381596756651672
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8588957055214724,
"acc_stderr": 0.027351605518389752,
"acc_norm": 0.8588957055214724,
"acc_norm_stderr": 0.027351605518389752
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331366,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331366
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9487179487179487,
"acc_stderr": 0.014450181176872733,
"acc_norm": 0.9487179487179487,
"acc_norm_stderr": 0.014450181176872733
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9042145593869731,
"acc_stderr": 0.010524031079055834,
"acc_norm": 0.9042145593869731,
"acc_norm_stderr": 0.010524031079055834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6782122905027933,
"acc_stderr": 0.015624236160792584,
"acc_norm": 0.6782122905027933,
"acc_norm_stderr": 0.015624236160792584
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8398692810457516,
"acc_stderr": 0.020998740930362306,
"acc_norm": 0.8398692810457516,
"acc_norm_stderr": 0.020998740930362306
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7942122186495176,
"acc_stderr": 0.022961339906764244,
"acc_norm": 0.7942122186495176,
"acc_norm_stderr": 0.022961339906764244
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8395061728395061,
"acc_stderr": 0.02042395535477803,
"acc_norm": 0.8395061728395061,
"acc_norm_stderr": 0.02042395535477803
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6028368794326241,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.6028368794326241,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5925684485006519,
"acc_stderr": 0.01254947371421222,
"acc_norm": 0.5925684485006519,
"acc_norm_stderr": 0.01254947371421222
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.023157468308559352,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.023157468308559352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7924836601307189,
"acc_stderr": 0.01640592427010324,
"acc_norm": 0.7924836601307189,
"acc_norm_stderr": 0.01640592427010324
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.025000256039546198,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.025000256039546198
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9005847953216374,
"acc_stderr": 0.02294902557935504,
"acc_norm": 0.9005847953216374,
"acc_norm_stderr": 0.02294902557935504
},
"harness|truthfulqa:mc|0": {
"mc1": 0.412484700122399,
"mc1_stderr": 0.01723329939957122,
"mc2": 0.5935958667241532,
"mc2_stderr": 0.015329701989808613
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625845
},
"harness|gsm8k|5": {
"acc": 0.19939347990902198,
"acc_stderr": 0.011005438029475656
}
}
```
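To work with these aggregated numbers programmatically rather than reading the JSON above, you can load the "results" configuration; the "latest" split always points to the most recent run. A minimal sketch (the field layout of the loaded rows is an assumption, so inspect it first):
```python
from datasets import load_dataset

# The "results" config aggregates all task metrics for a run.
results = load_dataset("open-llm-leaderboard/details_AA051610__Q",
                       "results",
                       split="latest")
print(results.column_names)  # inspect the available fields first
print(results[0])
```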
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AA051610__Q | [
"region:us"
] | 2024-01-20T09:13:16+00:00 | {"pretty_name": "Evaluation run of AA051610/Q", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051610/Q](https://huggingface.co/AA051610/Q) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__Q\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T09:11:03.066548](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__Q/blob/main/results_2024-01-20T09-11-03.066548.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7399080967661088,\n \"acc_stderr\": 0.02875112016204294,\n \"acc_norm\": 0.7517342136662964,\n \"acc_norm_stderr\": 0.02932406494014928,\n \"mc1\": 0.412484700122399,\n \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5935958667241532,\n \"mc2_stderr\": 0.015329701989808613\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6450511945392492,\n \"acc_stderr\": 0.013983036904094089,\n \"acc_norm\": 0.6697952218430034,\n \"acc_norm_stderr\": 0.013743085603760426\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6638119896434973,\n \"acc_stderr\": 0.004714386376337134,\n \"acc_norm\": 0.8567018522206732,\n \"acc_norm_stderr\": 0.0034966056729606905\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.762962962962963,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.762962962962963,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8486842105263158,\n \"acc_stderr\": 0.02916263159684399,\n \"acc_norm\": 0.8486842105263158,\n \"acc_norm_stderr\": 0.02916263159684399\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8037735849056604,\n \"acc_stderr\": 0.024442388131100824,\n \"acc_norm\": 0.8037735849056604,\n \"acc_norm_stderr\": 0.024442388131100824\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n \"acc_stderr\": 0.03008574324856567,\n \"acc_norm\": 0.8472222222222222,\n \"acc_norm_stderr\": 0.03008574324856567\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n 
\"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387536,\n \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387536\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.5701754385964912,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7310344827586207,\n \"acc_stderr\": 0.036951833116502325,\n \"acc_norm\": 0.7310344827586207,\n \"acc_norm_stderr\": 0.036951833116502325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7513227513227513,\n \"acc_stderr\": 0.02226181769240016,\n \"acc_norm\": 0.7513227513227513,\n \"acc_norm_stderr\": 0.02226181769240016\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8870967741935484,\n \"acc_stderr\": 0.01800360332586363,\n \"acc_norm\": 0.8870967741935484,\n \"acc_norm_stderr\": 0.01800360332586363\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5960591133004927,\n \"acc_stderr\": 0.03452453903822033,\n \"acc_norm\": 0.5960591133004927,\n \"acc_norm_stderr\": 0.03452453903822033\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865414,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865414\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9040404040404041,\n \"acc_stderr\": 0.020984808610047933,\n \"acc_norm\": 0.9040404040404041,\n \"acc_norm_stderr\": 0.020984808610047933\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.014385432857476442,\n \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.014385432857476442\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8128205128205128,\n 
\"acc_stderr\": 0.01977660108655004,\n \"acc_norm\": 0.8128205128205128,\n \"acc_norm_stderr\": 0.01977660108655004\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.03044452852881074,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.03044452852881074\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398897,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398897\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.48344370860927155,\n \"acc_stderr\": 0.040802441856289715,\n \"acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.040802441856289715\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9119266055045872,\n \"acc_stderr\": 0.012150743719481655,\n \"acc_norm\": 0.9119266055045872,\n \"acc_norm_stderr\": 0.012150743719481655\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.03236585252602158,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.03236585252602158\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884565,\n \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884565\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n \"acc_stderr\": 0.027584066602208274,\n \"acc_norm\": 0.7847533632286996,\n \"acc_norm_stderr\": 0.027584066602208274\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622793,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622793\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.030381596756651672,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.030381596756651672\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331366,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331366\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9487179487179487,\n \"acc_stderr\": 0.014450181176872733,\n \"acc_norm\": 0.9487179487179487,\n \"acc_norm_stderr\": 0.014450181176872733\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9042145593869731,\n \"acc_stderr\": 0.010524031079055834,\n \"acc_norm\": 
0.9042145593869731,\n \"acc_norm_stderr\": 0.010524031079055834\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.021855255263421795,\n \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.021855255263421795\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6782122905027933,\n \"acc_stderr\": 0.015624236160792584,\n \"acc_norm\": 0.6782122905027933,\n \"acc_norm_stderr\": 0.015624236160792584\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8398692810457516,\n \"acc_stderr\": 0.020998740930362306,\n \"acc_norm\": 0.8398692810457516,\n \"acc_norm_stderr\": 0.020998740930362306\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7942122186495176,\n \"acc_stderr\": 0.022961339906764244,\n \"acc_norm\": 0.7942122186495176,\n \"acc_norm_stderr\": 0.022961339906764244\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8395061728395061,\n \"acc_stderr\": 0.02042395535477803,\n \"acc_norm\": 0.8395061728395061,\n \"acc_norm_stderr\": 0.02042395535477803\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6028368794326241,\n \"acc_stderr\": 0.0291898056735871,\n \"acc_norm\": 0.6028368794326241,\n \"acc_norm_stderr\": 0.0291898056735871\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5925684485006519,\n \"acc_stderr\": 0.01254947371421222,\n \"acc_norm\": 0.5925684485006519,\n \"acc_norm_stderr\": 0.01254947371421222\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559352,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559352\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7924836601307189,\n \"acc_stderr\": 0.01640592427010324,\n \"acc_norm\": 0.7924836601307189,\n \"acc_norm_stderr\": 0.01640592427010324\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.025000256039546198,\n \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.025000256039546198\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.02294902557935504,\n \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.02294902557935504\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.412484700122399,\n \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5935958667241532,\n \"mc2_stderr\": 0.015329701989808613\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625845\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19939347990902198,\n \"acc_stderr\": 0.011005438029475656\n }\n}\n```", "repo_url": "https://huggingface.co/AA051610/Q", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|arc:challenge|25_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|gsm8k|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hellaswag|10_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-11-03.066548.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-11-03.066548.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-11-03.066548.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T09-11-03.066548.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-11-03.066548.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-11-03.066548.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["**/details_harness|winogrande|5_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T09-11-03.066548.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T09_11_03.066548", "path": ["results_2024-01-20T09-11-03.066548.parquet"]}, {"split": "latest", "path": 
["results_2024-01-20T09-11-03.066548.parquet"]}]}]} | 2024-01-20T09:13:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AA051610/Q
Dataset automatically created during the evaluation run of model AA051610/Q on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
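A minimal sketch of that call, assuming the details repository for this run follows the leaderboard's usual open-llm-leaderboard/details_<org>__<model> naming (the exact repo id is not shown in this processed card):
```python
# Sketch only: the repo id below is inferred from the leaderboard naming
# convention for AA051610/Q and may need adjusting.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_AA051610__Q",  # assumed repo id
    "harness_winogrande_5",  # one of the 63 task configurations
    split="latest",          # or a timestamped split such as "2024_01_20T09_11_03.066548"
)
```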
## Latest results
These are the latest results from run 2024-01-20T09:11:03.066548 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
78464d6291ca00a26d50a2deb58b8bfbd9b57e66 |
# DL3DV-Dataset
This repo hosts all the raw 4K videos of DL3DV-Dataset. We are working hard to review the entire dataset to screen out sensitive information. Thank you for your patience.
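If you want to fetch the videos locally in the meantime, here is a minimal sketch using huggingface_hub (the allow_patterns filter is an assumption about the file layout; adjust it to what you actually see in the repo):
```python
# Sketch: download (part of) the raw-video repo with huggingface_hub.
# The "*.mp4" pattern is a guess at the file layout -- adjust as needed.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="DL3DV/DL3DV-ALL",  # repo id listed for this dataset
    repo_type="dataset",
    local_dir="./DL3DV-ALL",
    allow_patterns=["*.mp4"],
)
```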
# News
- [x] DL3DV-1K
- [ ] DL3DV-2K
- [ ] DL3DV-3K
- [ ] DL3DV-4K
- [ ] DL3DV-5K
- [ ] DL3DV-6K
- [ ] DL3DV-7K
- [ ] DL3DV-8K
- [ ] DL3DV-9K
- [ ] DL3DV-10K | DL3DV/DL3DV-ALL | [
"size_categories:n>1T",
"3D Vision",
"NeRF",
"3D Gaussian",
"Dataset",
"Novel View Synthesis",
"Text to 3D",
"Image to 3D",
"region:us"
] | 2024-01-20T09:15:34+00:00 | {"size_categories": ["n>1T"], "pretty_name": "Dl3DV-Dataset", "tags": ["3D Vision", "NeRF", "3D Gaussian", "Dataset", "Novel View Synthesis", "Text to 3D", "Image to 3D"]} | 2024-01-21T06:38:33+00:00 | [] | [] | TAGS
cfdbba4e84701ca7da5c3ee34bb700a703fb5aea |
# Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_7B_merged_test2_4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alnrg2arg/blockchainlabs_7B_merged_test2_4](https://huggingface.co/alnrg2arg/blockchainlabs_7B_merged_test2_4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-20T09:52:41.122319](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4/blob/main/results_2024-01-20T09-52-41.122319.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.652927958678689,
"acc_stderr": 0.0321169960910649,
"acc_norm": 0.6519652759500019,
"acc_norm_stderr": 0.03279242565970157,
"mc1": 0.576499388004896,
"mc1_stderr": 0.01729742144853475,
"mc2": 0.6976711663625277,
"mc2_stderr": 0.015093001598591628
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.735494880546075,
"acc_norm_stderr": 0.012889272949313368
},
"harness|hellaswag|10": {
"acc": 0.7229635530770763,
"acc_stderr": 0.004466200055292544,
"acc_norm": 0.8886675960963951,
"acc_norm_stderr": 0.0031390048159258633
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944423,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944423
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869649,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869649
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.576499388004896,
"mc1_stderr": 0.01729742144853475,
"mc2": 0.6976711663625277,
"mc2_stderr": 0.015093001598591628
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775777
},
"harness|gsm8k|5": {
"acc": 0.7043214556482184,
"acc_stderr": 0.012570068947898772
}
}
```
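To work with these aggregated numbers programmatically, a minimal sketch that loads the "results" configuration described above (the exact column layout of the results parquet is an assumption; inspect the loaded split before relying on specific field names):
```python
# Sketch: load the aggregated "results" configuration for this run.
# Inspect the schema before pulling out specific metrics -- the parquet
# layout may differ from the JSON shown above.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4",
    "results",
    split="latest",
)
print(results)
```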
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4 | [
"region:us"
] | 2024-01-20T09:55:01+00:00 | {"pretty_name": "Evaluation run of alnrg2arg/blockchainlabs_7B_merged_test2_4", "dataset_summary": "Dataset automatically created during the evaluation run of model [alnrg2arg/blockchainlabs_7B_merged_test2_4](https://huggingface.co/alnrg2arg/blockchainlabs_7B_merged_test2_4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T09:52:41.122319](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4/blob/main/results_2024-01-20T09-52-41.122319.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.652927958678689,\n \"acc_stderr\": 0.0321169960910649,\n \"acc_norm\": 0.6519652759500019,\n \"acc_norm_stderr\": 0.03279242565970157,\n \"mc1\": 0.576499388004896,\n \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.6976711663625277,\n \"mc2_stderr\": 0.015093001598591628\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838793,\n \"acc_norm\": 0.735494880546075,\n \"acc_norm_stderr\": 0.012889272949313368\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7229635530770763,\n \"acc_stderr\": 0.004466200055292544,\n \"acc_norm\": 0.8886675960963951,\n \"acc_norm_stderr\": 0.0031390048159258633\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 
0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944423,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944423\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 
0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 
0.04725815626252609\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.576499388004896,\n \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.6976711663625277,\n \"mc2_stderr\": 0.015093001598591628\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775777\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.7043214556482184,\n \"acc_stderr\": 0.012570068947898772\n }\n}\n```", "repo_url": "https://huggingface.co/alnrg2arg/blockchainlabs_7B_merged_test2_4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|arc:challenge|25_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|gsm8k|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hellaswag|10_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-52-41.122319.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-52-41.122319.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-52-41.122319.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T09-52-41.122319.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-52-41.122319.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["**/details_harness|winogrande|5_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-20T09-52-41.122319.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T09_52_41.122319", "path": ["results_2024-01-20T09-52-41.122319.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T09-52-41.122319.parquet"]}]}]} | 2024-01-20T09:55:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_7B_merged_test2_4
Dataset automatically created during the evaluation run of model alnrg2arg/blockchainlabs_7B_merged_test2_4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
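A minimal sketch (the repository id below is inferred from the naming convention used elsewhere in this document, open-llm-leaderboard/details_<org>__<model>):

```python
from datasets import load_dataset
# repository id inferred from the card's naming convention
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4",
	"harness_winogrande_5",
	split="train")
```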
## Latest results
These are the latest results from run 2024-01-20T09:52:41.122319 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_7B_merged_test2_4\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/blockchainlabs_7B_merged_test2_4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T09:52:41.122319(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_7B_merged_test2_4\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/blockchainlabs_7B_merged_test2_4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T09:52:41.122319(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7e54489d4dd3fa1531f3cc3f6e127274b3c22baf |
# Dataset Card for Evaluation run of PetroGPT/Voldemort-10B-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [PetroGPT/Voldemort-10B-DPO](https://huggingface.co/PetroGPT/Voldemort-10B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PetroGPT__Voldemort-10B-DPO",
"harness_winogrande_5",
split="train")
```
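The aggregated results described above can be loaded the same way from the "results" configuration (a minimal sketch, using the config and split names listed in this card's metadata):

```python
from datasets import load_dataset
# "results" stores the aggregated metrics of the run;
# the "latest" split always points to the newest evaluation
results = load_dataset("open-llm-leaderboard/details_PetroGPT__Voldemort-10B-DPO",
	"results",
	split="latest")
```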
## Latest results
These are the [latest results from run 2024-01-20T12:02:57.927448](https://huggingface.co/datasets/open-llm-leaderboard/details_PetroGPT__Voldemort-10B-DPO/blob/main/results_2024-01-20T12-02-57.927448.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6305876260706662,
"acc_stderr": 0.03255938653931723,
"acc_norm": 0.6330868385686215,
"acc_norm_stderr": 0.033208227030172364,
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903046,
"mc2": 0.6144474102286928,
"mc2_stderr": 0.015672191454631425
},
"harness|arc:challenge|25": {
"acc": 0.6407849829351536,
"acc_stderr": 0.014020224155839159,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.013839039762820169
},
"harness|hellaswag|10": {
"acc": 0.6731726747659829,
"acc_stderr": 0.004680949283855316,
"acc_norm": 0.8484365664210317,
"acc_norm_stderr": 0.0035786433875478452
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.037150621549989056,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.037150621549989056
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110936,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110936
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.036848815213890225,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.036848815213890225
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.016847676400091098,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.016847676400091098
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069436,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677006,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677006
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38324022346368714,
"acc_stderr": 0.016260159604429128,
"acc_norm": 0.38324022346368714,
"acc_norm_stderr": 0.016260159604429128
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4367666232073012,
"acc_stderr": 0.012667701919603662,
"acc_norm": 0.4367666232073012,
"acc_norm_stderr": 0.012667701919603662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.019291961895066382,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.019291961895066382
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786855,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786855
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903046,
"mc2": 0.6144474102286928,
"mc2_stderr": 0.015672191454631425
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838229
},
"harness|gsm8k|5": {
"acc": 0.5382865807429871,
"acc_stderr": 0.01373204822701668
}
}
```
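Individual metrics can also be read from the raw results file linked above (a minimal sketch, assuming the JSON layout shown in this block; some results files nest the metrics under a top-level "results" key, which the sketch handles defensively):

```python
import json
from huggingface_hub import hf_hub_download

# download the raw results file referenced under "Latest results"
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_PetroGPT__Voldemort-10B-DPO",
    filename="results_2024-01-20T12-02-57.927448.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

metrics = raw.get("results", raw)  # handle both flat and nested layouts
print(metrics["all"]["acc"])               # average accuracy across tasks
print(metrics["harness|gsm8k|5"]["acc"])   # GSM8K 5-shot accuracy
```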
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_PetroGPT__Voldemort-10B-DPO | [
"region:us"
] | 2024-01-20T10:02:04+00:00 | {"pretty_name": "Evaluation run of PetroGPT/Voldemort-10B-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [PetroGPT/Voldemort-10B-DPO](https://huggingface.co/PetroGPT/Voldemort-10B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PetroGPT__Voldemort-10B-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T12:02:57.927448](https://huggingface.co/datasets/open-llm-leaderboard/details_PetroGPT__Voldemort-10B-DPO/blob/main/results_2024-01-20T12-02-57.927448.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6305876260706662,\n \"acc_stderr\": 0.03255938653931723,\n \"acc_norm\": 0.6330868385686215,\n \"acc_norm_stderr\": 0.033208227030172364,\n \"mc1\": 0.41615667074663404,\n \"mc1_stderr\": 0.017255657502903046,\n \"mc2\": 0.6144474102286928,\n \"mc2_stderr\": 0.015672191454631425\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6407849829351536,\n \"acc_stderr\": 0.014020224155839159,\n \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.013839039762820169\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6731726747659829,\n \"acc_stderr\": 0.004680949283855316,\n \"acc_norm\": 0.8484365664210317,\n \"acc_norm_stderr\": 0.0035786433875478452\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.037150621549989056,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.037150621549989056\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110936,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110936\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.036848815213890225,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.036848815213890225\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8091743119266055,\n \"acc_stderr\": 0.016847676400091098,\n \"acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.016847676400091098\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069436,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n \"acc_stderr\": 
0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677006,\n \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677006\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n \"acc_stderr\": 0.016260159604429128,\n \"acc_norm\": 0.38324022346368714,\n \"acc_norm_stderr\": 0.016260159604429128\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4367666232073012,\n \"acc_stderr\": 0.012667701919603662,\n \"acc_norm\": 0.4367666232073012,\n \"acc_norm_stderr\": 0.012667701919603662\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.019291961895066382,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.019291961895066382\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786855,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786855\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41615667074663404,\n \"mc1_stderr\": 0.017255657502903046,\n \"mc2\": 0.6144474102286928,\n \"mc2_stderr\": 0.015672191454631425\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838229\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5382865807429871,\n \"acc_stderr\": 0.01373204822701668\n }\n}\n```", "repo_url": 
"https://huggingface.co/PetroGPT/Voldemort-10B-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|arc:challenge|25_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|arc:challenge|25_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|gsm8k|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|gsm8k|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hellaswag|10_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hellaswag|10_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-59-49.442476.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T09-59-49.442476.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-02-57.927448.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-02-57.927448.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-02-57.927448.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T12-02-57.927448.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-59-49.442476.parquet"]}, 
{"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["**/details_harness|winogrande|5_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": ["**/details_harness|winogrande|5_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T12-02-57.927448.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T09_59_49.442476", "path": ["results_2024-01-20T09-59-49.442476.parquet"]}, {"split": "2024_01_20T12_02_57.927448", "path": 
["results_2024-01-20T12-02-57.927448.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T12-02-57.927448.parquet"]}]}]} | 2024-01-20T12:05:35+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of PetroGPT/Voldemort-10B-DPO
Dataset automatically created during the evaluation run of model PetroGPT/Voldemort-10B-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
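```python
from datasets import load_dataset

# The details repository name is assumed from the leaderboard's standard
# "details_<org>__<model>" pattern; "harness_winogrande_5" is one of the
# configs listed in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_PetroGPT__Voldemort-10B-DPO",
    "harness_winogrande_5",
    split="train")
```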
## Latest results
These are the latest results from run 2024-01-20T12:02:57.927448 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [] | [] |
5553beeb32c12fa0fbef2229955a1fa7c42f0f0c | # Dataset Card for "protogen-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | killameep/protogen-data | [
"region:us"
] | 2024-01-20T10:14:30+00:00 | {"dataset_info": {"features": [{"name": "source_id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "image", "dtype": "image"}, {"name": "tags", "sequence": "string"}, {"name": "url", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "selector", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 214815973.0, "num_examples": 512}], "download_size": 212424717, "dataset_size": 214815973.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-20T10:21:36+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "protogen-data"
More Information needed | [] | [] |
854aa5701b04a79c8591e7e33bc35741881d5d82 | # Dataset Card for "dr_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | SagarKeshave/dr_data | [
"region:us"
] | 2024-01-20T10:22:24+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "Profile", "dtype": "string"}, {"name": "Name", "dtype": "string"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 1658710.9056839475, "num_examples": 1440}, {"name": "test", "num_bytes": 185453.09431605248, "num_examples": 161}], "download_size": 591301, "dataset_size": 1844164.0}} | 2024-01-22T05:38:51+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dr_data"
More Information needed | [] | [] |
d263a0c2ca894e3b1b1d6b5558ead5066fbd8d2f | # discography_v2_cdn
Archive of rebuilt music database
Using UUID version 7 ([uuidv7](https://github.com/LiosK/uuidv7))
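A UUIDv7 value is a 48-bit Unix-millisecond timestamp followed by version, variant, and random bits, so IDs sort roughly by creation time. Below is a minimal Python sketch of that layout, for illustration only; it is not necessarily the generator used to build this archive:

```python
import os
import time
import uuid

def uuid7() -> uuid.UUID:
    """Minimal UUIDv7 sketch: 48-bit Unix-ms timestamp + version/variant + random bits."""
    ts_ms = time.time_ns() // 1_000_000              # milliseconds since the Unix epoch
    raw = bytearray(ts_ms.to_bytes(6, "big") + os.urandom(10))
    raw[6] = (raw[6] & 0x0F) | 0x70                  # high nibble of byte 6 = version 7
    raw[8] = (raw[8] & 0x3F) | 0x80                  # top two bits of byte 8 = RFC 4122 variant
    return uuid.UUID(bytes=bytes(raw))

print(uuid7())  # IDs generated later compare greater, at millisecond granularity
```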
| DeliberatorArchiver/discography_v2_cdn | [
"license:cc-by-nc-nd-4.0",
"region:us"
] | 2024-01-20T10:49:43+00:00 | {"license": "cc-by-nc-nd-4.0", "viewer": false} | 2024-02-16T13:59:15+00:00 | [] | [] | TAGS
#license-cc-by-nc-nd-4.0 #region-us
| # discography_v2_cdn
Archive of rebuilt music database
Using UUID version 7 (uuidv7)
| [] | [] |
1050a0afa8a495e10e92980c2d5e98262cb7a9ef | # Welcome to Mercury 🪐!
## It is a code efficiency benchmark.
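A minimal sketch of loading the benchmark with the `datasets` library; the `train` and `eval` split names and the field names below come from this dataset's configuration:

```python
from datasets import load_dataset

mercury = load_dataset("Elfsong/Mercury")   # returns the "train" and "eval" splits
sample = mercury["eval"][0]
print(sample["slug_name"], sample["difficulty"])  # fields listed in the dataset info
```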
## Please consider citing our paper: https://arxiv.org/abs/2402.07844 | Elfsong/Mercury | [
"size_categories:1K<n<10K",
"language:en",
"arxiv:2402.07844",
"region:us"
] | 2024-01-20T11:21:20+00:00 | {"language": ["en"], "size_categories": ["1K<n<10K"], "dataset_info": {"features": [{"name": "slug_name", "dtype": "string"}, {"name": "meta_info", "struct": [{"name": "data", "struct": [{"name": "question", "struct": [{"name": "categoryTitle", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "difficulty", "dtype": "string"}, {"name": "questionFrontendId", "dtype": "string"}, {"name": "questionId", "dtype": "string"}, {"name": "questionTitle", "dtype": "string"}, {"name": "questionTitleSlug", "dtype": "string"}, {"name": "similarQuestions", "dtype": "string"}, {"name": "stats", "dtype": "string"}, {"name": "topicTags", "list": [{"name": "name", "dtype": "string"}, {"name": "slug", "dtype": "string"}]}]}]}]}, {"name": "id", "dtype": "string"}, {"name": "difficulty", "dtype": "string"}, {"name": "pretty_content", "sequence": "string"}, {"name": "solutions", "list": [{"name": "hash", "dtype": "int64"}, {"name": "runtime", "dtype": "string"}, {"name": "solution", "dtype": "string"}]}, {"name": "prompt", "dtype": "string"}, {"name": "generator_code", "dtype": "string"}, {"name": "convert_online", "dtype": "string"}, {"name": "convert_offline", "dtype": "string"}, {"name": "evaluate_offline", "dtype": "string"}, {"name": "entry_point", "dtype": "string"}, {"name": "test_cases", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24879611, "num_examples": 1633}, {"name": "eval", "num_bytes": 7028101, "num_examples": 256}], "download_size": 10526574, "dataset_size": 31907712}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "eval", "path": "data/eval-*"}]}]} | 2024-02-13T04:55:24+00:00 | [
"2402.07844"
] | [
"en"
] | TAGS
#size_categories-1K<n<10K #language-English #arxiv-2402.07844 #region-us
| # Welcome to Mercury !
## It is a code efficiency benchmark.
## Please consider citing our paper: URL | [] | [] |
214fa7a1fe73b855a9d3c597c80da2d80e2a60ea | # Dataset Card for "CIDAR"
# 🌴CIDAR: *Culturally Relevant Instruction Dataset For Arabic*
<p align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/655e10b1c38270696b290f20/lKec96otC8VdM09SnPKL8.png" width = "150px"/>
<p align="center"> [ <a href="https://arxiv.org/abs/2402.03177">Paper</a> - <a href="https://github.com/ARBML/CIDAR">GitHub</a> ]</p>
</p>
CIDAR contains **10,000** `instructions` and their `outputs`. The dataset was created by selecting around **9,109** samples from the [Alpagasus](https://huggingface.co/datasets/mlabonne/alpagasus) dataset and translating them to `Arabic` using ChatGPT. In addition, we appended around **891** Arabic grammar instructions from the website [Ask the teacher](https://learning.aljazeera.net/ar/asktheteacher). All 10,000 samples were reviewed by around 12 reviewers.
<p align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/655e10b1c38270696b290f20/0m-RewO_OAjPQRI0zE175.png" width = "1200px"/>
</p>
## 📚 Datasets Summary
<table>
<tr>
<th>Name</th>
<th>Explanation</th>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/arbml/cidar>CIDAR</a></td>
<td>10,000 instructions and responses in Arabic</td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/arbml/cidar-eval-100>CIDAR-EVAL-100</a></td>
<td>100 instructions to evaluate LLMs on cultural relevance</td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/arbml/cidar-mcq-100>CIDAR-MCQ-100</a></td>
<td>100 Multiple choice questions and answers to evaluate LLMs on cultural relevance </td>
</tr>
</table>
## 📋 Dataset Structure
- `index(int)`: Unique number of each instruction-output pair.
- `instruction(str)`: Required instruction.
- `output(str)`: Response for the given instruction.
## 📁 Loading The Dataset
You can download the dataset directly from HuggingFace or use the following code:
```python
from datasets import load_dataset
cidar = load_dataset('arbml/CIDAR')
```
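Each record then exposes the fields described above, for example:

```python
sample = cidar["train"][0]
print(sample["index"])        # unique identifier
print(sample["instruction"])  # the Arabic instruction
print(sample["output"])       # the paired response
```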
## 📄 Sample From The Dataset:
This sample shows an Arabic-specific instruction related to `البلاغة(rhetoric)` and `الإعراب(grammatical status)`. The sentence is originally in Arabic.
**index**: 10030
**instruction**: لو سمحت اين صيغة المبالغة في " الجاحظ اديب سيارة اراؤه" و ما اعراب الجملة؟
**output**:
صيغة المبالغة في هذه الجملة هي "سيّارة"، فاسم الفاعل من "السير" هو "سائر/ سائرة" وإذا أردت الدلالة على المبالغة في سير الآراء (أي شهرتها وكثرة تداولها) قلت "آراء سيّارة".

أما إعراب الجملة فهو: "الجاحظُ": مبتدأ مرفوع بالضمة. "أديبٌ": خبر مرفوع بالضمة. "سيارةٌ": خبر متقدم، و"آراؤُه": مبتدأ متأخر، وهو مضاف والهاء ضمير متصل مضاف إليه في محل جر. ويمكن اعتبار "سيارة" مبتدأ وهو وصف يعمل عمل فعله، و"آراؤُه" فاعل سدّ مسدّ الخبر.

وفي الحالتين فجملة "سيارة آراؤه" جملة اسمية في محل رفع نعت لـ"أديب".
## 🧩 Contributors
There were at least 12 contributors to the annotation of CIDAR. You can check the list [here](https://alpacaarabic-production.up.railway.app/explore).
## ⛔️ Limitations and Future Work
CIDAR is intended for **research** purposes only. The authors disclaim any responsibility for misuse and condemn any use contrary to **Arabic culture** or **Islamic values**. Even though the dataset was subjected to human verification, there is no guarantee that responses are entirely aligned with Arabic culture and Islamic values. Users of the dataset are urged to exercise caution, employ critical thinking, and seek guidance from representative figures when necessary.
## 🔑 License
CIDAR is intended and licensed for **research** use only. The dataset and weight diffs are licensed under **CC BY NC 4.0** (LIMITED TO NON-COMMERCIAL USE). Models trained using the dataset should not be used outside of research purposes.
[Creative Commons NonCommercial (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/deed.en).
## Citation
```
@misc{alyafeai2024cidar,
title={{CIDAR: Culturally Relevant Instruction Dataset For Arabic}},
author={Zaid Alyafeai and Khalid Almubarak and Ahmed Ashraf and Deema Alnuhait and Saied Alshahrani and Gubran A. Q. Abdulrahman and Gamil Ahmed and Qais Gawah and Zead Saleh and Mustafa Ghaleb and Yousef Ali and Maged S. Al-Shaibani},
year={2024},
eprint={2402.03177},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | arbml/CIDAR | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:ar",
"license:cc-by-nc-4.0",
"Instruction",
"arxiv:2402.03177",
"region:us"
] | 2024-01-20T11:34:18+00:00 | {"language": ["ar"], "license": "cc-by-nc-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"], "pretty_name": "CIDAR", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "output", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "index", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 6712623, "num_examples": 10000}], "download_size": 3553672, "dataset_size": 6712623}, "tags": ["Instruction"]} | 2024-02-12T19:56:43+00:00 | [
"2402.03177"
] | [
"ar"
] | TAGS
#task_categories-text-generation #size_categories-1K<n<10K #language-Arabic #license-cc-by-nc-4.0 #Instruction #arxiv-2402.03177 #region-us
| Dataset Card for "CIDAR"
========================
CIDAR: *Culturally Relevant Instruction Dataset For Arabic*
===========================================================

CIDAR contains 10,000 'instructions' and their 'outputs'. The dataset was created by selecting around 9,109 samples from the Alpagasus dataset and translating them to 'Arabic' using ChatGPT. In addition, we appended around 891 Arabic grammar instructions from the website Ask the teacher. All 10,000 samples were reviewed by around 12 reviewers.
Datasets Summary
----------------
Dataset Structure
-----------------
* 'index(int)': Unique number of each instruction-output pair.
* 'instruction(str)': Required instruction.
* 'output(str)': Response for the given instruction.
Loading The Dataset
-------------------
You can download the dataset directly from HuggingFace or use the following code:
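```
from datasets import load_dataset

cidar = load_dataset('arbml/CIDAR')
```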
Sample From The Dataset:
------------------------
This sample shows an Arabic-specific instruction related to 'البلاغة(rhetoric)' and 'الإعراب(grammatical status)'. The sentence is originally in Arabic.
index: 10030
instruction: لو سمحت اين صيغة المبالغة في " الجاحظ اديب سيارة اراؤه" و ما اعراب الجملة؟
output:
```
'صيغة المبالغة في هذه الجملة هي "سيّارة"، فاسم الفاعل من "السير" هو '
'"سائر/ سائرة" وإذا أردت الدلالة على المبالغة في سير الآراء (أي '
'شهرتها وكثرة تداولها) قلت "آراء سيّارة".\r\n'
'أما إعراب الجملة فهو: "الجاحظُ": مبتدأ مرفوع بالضمة. "أديبٌ": خبر '
'مرفوع بالضمة. "سيارةٌ": خبر متقدم، و"آراؤُه": مبتدأ متأخر، وهو '
'مضاف والهاء ضمير متصل مضاف إليه في محل جر. ويمكن اعتبار "سيارة" '
'مبتدأ وهو وصف يعمل عمل فعله، و"آراؤُه" فاعل سدّ مسدّ الخبر.\r\n'
'وفي الحالتين فجملة "سيارة آراؤه" جملة اسمية في محل رفع نعت '
'لـ"أديب".'
```
Contributors
------------
There were at least 12 contributors to the annotation of CIDAR. You can check the list here.
Limitations and Future Work
-----------------------------
CIDAR is intended for research purposes only. The authors disclaim any responsibility for misuse and condemn any use contrary to Arabic culture or Islamic values. Even though the dataset was subjected to human verification, there is no guarantee that responses are entirely aligned with Arabic culture and Islamic values. Users of the dataset are urged to exercise caution, employ critical thinking, and seek guidance from representative figures when necessary.
License
-------
CIDAR is intended and licensed for research use only. The dataset and weight diffs are licensed under CC BY-NC 4.0 (LIMITED TO NON-COMMERCIAL USE). Models trained using the dataset should not be used outside of research purposes.
Creative Commons NonCommercial (CC BY-NC 4.0).
| [] | [
"TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Arabic #license-cc-by-nc-4.0 #Instruction #arxiv-2402.03177 #region-us \n"
] |
29d4c3c19cc0a941648dda946ad9ef3a9705cee8 | A library of pose images for use with Controlnet.
ComfyUI workflow is embedded in the images.
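Because the workflow travels inside the PNG metadata, it can be recovered programmatically. A minimal sketch, assuming the workflow is stored under ComfyUI's conventional "workflow" PNG text chunk and using Pillow (the file name below is illustrative):

```python
import json
from PIL import Image

# ComfyUI conventionally embeds the workflow graph as JSON in a PNG
# text chunk named "workflow"; the file name here is illustrative.
img = Image.open("pose_001.png")
raw_workflow = img.info.get("workflow")

if raw_workflow is None:
    print("No embedded workflow found in this image.")
else:
    workflow = json.loads(raw_workflow)
    print(f"Recovered a workflow with {len(workflow.get('nodes', []))} nodes.")
```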
The only limit is your imagination!
Enjoy :::D | Birdfingers/XenodimensionalPoseotron | [
"license:cc0-1.0",
"region:us"
] | 2024-01-20T11:45:34+00:00 | {"license": "cc0-1.0"} | 2024-01-20T12:00:24+00:00 | [] | [] | TAGS
#license-cc0-1.0 #region-us
| A library of pose images for use with Controlnet.
ComfyUI workflow is embedded in the images.
The only limit is your imagination!
Enjoy :::D | [] | [
"TAGS\n#license-cc0-1.0 #region-us \n"
] |
15a4b34fd8f13f253a0d9b349ad628c9e12be80e |
# Dataset Card for Evaluation run of 222gate/bleagle-7b-v0.1-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [222gate/bleagle-7b-v0.1-test](https://huggingface.co/222gate/bleagle-7b-v0.1-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_222gate__bleagle-7b-v0.1-test",
"harness_winogrande_5",
split="train")
```
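The aggregated metrics live in the additional "results" configuration mentioned above; a sketch assuming it follows the same timestamp/"latest" split convention as the per-task configurations:

```python
from datasets import load_dataset

# "latest" is assumed to point at the most recent aggregated results,
# mirroring the split naming used by the per-task configurations.
results = load_dataset("open-llm-leaderboard/details_222gate__bleagle-7b-v0.1-test",
                       "results",
                       split="latest")
```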
## Latest results
These are the [latest results from run 2024-01-20T12:06:12.141667](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__bleagle-7b-v0.1-test/blob/main/results_2024-01-20T12-06-12.141667.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6495153065236577,
"acc_stderr": 0.03225742893008422,
"acc_norm": 0.6491278190911731,
"acc_norm_stderr": 0.032934365884389465,
"mc1": 0.5483476132190942,
"mc1_stderr": 0.01742148030027764,
"mc2": 0.6782993454076689,
"mc2_stderr": 0.015293466947336146
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725228,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.7178848834893448,
"acc_stderr": 0.004491093528113409,
"acc_norm": 0.8823939454291974,
"acc_norm_stderr": 0.0032148270694168255
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.037827289808654706,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.037827289808654706
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328974,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328974
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465066,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465066
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.01606005626853033,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.01606005626853033
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662257,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662257
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45251396648044695,
"acc_stderr": 0.016646914804438775,
"acc_norm": 0.45251396648044695,
"acc_norm_stderr": 0.016646914804438775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5483476132190942,
"mc1_stderr": 0.01742148030027764,
"mc2": 0.6782993454076689,
"mc2_stderr": 0.015293466947336146
},
"harness|winogrande|5": {
"acc": 0.8547750591949487,
"acc_stderr": 0.009902153904760829
},
"harness|gsm8k|5": {
"acc": 0.6512509476876421,
"acc_stderr": 0.01312722705503586
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_222gate__bleagle-7b-v0.1-test | [
"region:us"
] | 2024-01-20T12:08:32+00:00 | {"pretty_name": "Evaluation run of 222gate/bleagle-7b-v0.1-test", "dataset_summary": "Dataset automatically created during the evaluation run of model [222gate/bleagle-7b-v0.1-test](https://huggingface.co/222gate/bleagle-7b-v0.1-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_222gate__bleagle-7b-v0.1-test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T12:06:12.141667](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__bleagle-7b-v0.1-test/blob/main/results_2024-01-20T12-06-12.141667.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6495153065236577,\n \"acc_stderr\": 0.03225742893008422,\n \"acc_norm\": 0.6491278190911731,\n \"acc_norm_stderr\": 0.032934365884389465,\n \"mc1\": 0.5483476132190942,\n \"mc1_stderr\": 0.01742148030027764,\n \"mc2\": 0.6782993454076689,\n \"mc2_stderr\": 0.015293466947336146\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725228,\n \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059374\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7178848834893448,\n \"acc_stderr\": 0.004491093528113409,\n \"acc_norm\": 0.8823939454291974,\n \"acc_norm_stderr\": 0.0032148270694168255\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.037827289808654706,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.037827289808654706\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328974\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.01606005626853033,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.01606005626853033\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.013547415658662257,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.013547415658662257\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45251396648044695,\n \"acc_stderr\": 0.016646914804438775,\n \"acc_norm\": 0.45251396648044695,\n \"acc_norm_stderr\": 0.016646914804438775\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5483476132190942,\n \"mc1_stderr\": 0.01742148030027764,\n \"mc2\": 0.6782993454076689,\n \"mc2_stderr\": 0.015293466947336146\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8547750591949487,\n \"acc_stderr\": 0.009902153904760829\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6512509476876421,\n \"acc_stderr\": 
0.01312722705503586\n }\n}\n```", "repo_url": "https://huggingface.co/222gate/bleagle-7b-v0.1-test", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|arc:challenge|25_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|gsm8k|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hellaswag|10_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-06-12.141667.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-06-12.141667.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-06-12.141667.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T12-06-12.141667.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-06-12.141667.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T12_06_12.141667", "path": ["**/details_harness|winogrande|5_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T12-06-12.141667.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_20T12_06_12.141667", "path": ["results_2024-01-20T12-06-12.141667.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T12-06-12.141667.parquet"]}]}]} | 2024-01-20T12:08:53+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of 222gate/bleagle-7b-v0.1-test
Dataset automatically created during the evaluation run of model 222gate/bleagle-7b-v0.1-test on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
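A minimal sketch, assuming the repository follows the standard `open-llm-leaderboard/details_<org>__<model>` naming used throughout this collection (the exact repo id below is inferred from the card title rather than confirmed):
```python
from datasets import load_dataset

# Repo id inferred from the card title; any other config name listed in the
# metadata (one per evaluated task) can be substituted for the winogrande one.
data = load_dataset("open-llm-leaderboard/details_222gate__bleagle-7b-v0.1-test",
	"harness_winogrande_5",
	split="train")
```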
## Latest results
These are the latest results from run 2024-01-20T12:06:12.141667 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of 222gate/bleagle-7b-v0.1-test\n\n\n\nDataset automatically created during the evaluation run of model 222gate/bleagle-7b-v0.1-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T12:06:12.141667(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of 222gate/bleagle-7b-v0.1-test\n\n\n\nDataset automatically created during the evaluation run of model 222gate/bleagle-7b-v0.1-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T12:06:12.141667(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d04607d8868c22f28ef653bd525c2a9e7b0c614f |
# Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_7B_merged_test2_4_prune
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alnrg2arg/blockchainlabs_7B_merged_test2_4_prune](https://huggingface.co/alnrg2arg/blockchainlabs_7B_merged_test2_4_prune) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4_prune",
"harness_winogrande_5",
split="train")
```
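Beyond a single per-task split, the aggregated metrics are also loadable. A minimal follow-up sketch, assuming the `results` config and its `latest` split listed in this card's metadata are available on the Hub (the standard layout for these evaluation repos):
```python
from datasets import load_dataset

# Load the aggregated "results" config at its latest run; per-task configs such
# as "harness_winogrande_5" follow the same config/split pattern.
results = load_dataset(
    "open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4_prune",
    "results",
    split="latest",
)
print(results.column_names)  # inspect which aggregated fields are exposed
```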
## Latest results
These are the [latest results from run 2024-01-20T12:08:51.547790](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4_prune/blob/main/results_2024-01-20T12-08-51.547790.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5235864683456122,
"acc_stderr": 0.0342174975692429,
"acc_norm": 0.5284479425508523,
"acc_norm_stderr": 0.03496859005639417,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.5902640868436692,
"mc2_stderr": 0.015985277759229078
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522084,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.014280522667467325
},
"harness|hellaswag|10": {
"acc": 0.5762796255725952,
"acc_stderr": 0.004931372657129799,
"acc_norm": 0.7774347739494125,
"acc_norm_stderr": 0.004151185615952065
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5320754716981132,
"acc_stderr": 0.03070948699255655,
"acc_norm": 0.5320754716981132,
"acc_norm_stderr": 0.03070948699255655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.02455229220934266,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.02455229220934266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.02757596072327823,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.02757596072327823
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187896,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03427308652999934,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03427308652999934
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49743589743589745,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.49743589743589745,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46638655462184875,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.46638655462184875,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.019416445892636032,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.019416445892636032
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.033644872860882996,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.033644872860882996
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.029936696387138608,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.029936696387138608
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884123,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884123
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6073619631901841,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.6073619631901841,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209807,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209807
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.719029374201788,
"acc_stderr": 0.016073127851221232,
"acc_norm": 0.719029374201788,
"acc_norm_stderr": 0.016073127851221232
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5867052023121387,
"acc_stderr": 0.02651126136940925,
"acc_norm": 0.5867052023121387,
"acc_norm_stderr": 0.02651126136940925
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3474860335195531,
"acc_stderr": 0.01592556406020815,
"acc_norm": 0.3474860335195531,
"acc_norm_stderr": 0.01592556406020815
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.02835895631342355,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.02835895631342355
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5530546623794212,
"acc_stderr": 0.028237769422085335,
"acc_norm": 0.5530546623794212,
"acc_norm_stderr": 0.028237769422085335
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5308641975308642,
"acc_stderr": 0.027767689606833932,
"acc_norm": 0.5308641975308642,
"acc_norm_stderr": 0.027767689606833932
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3709256844850065,
"acc_stderr": 0.012337391684530312,
"acc_norm": 0.3709256844850065,
"acc_norm_stderr": 0.012337391684530312
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.020192808271433795,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.020192808271433795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731571,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731571
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.03251006816458618,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.03251006816458618
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.5902640868436692,
"mc2_stderr": 0.015985277759229078
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275626
},
"harness|gsm8k|5": {
"acc": 0.21455648218347234,
"acc_stderr": 0.011307604104052885
}
}
```
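To work with these numbers programmatically rather than by copy-paste, here is a minimal sketch, assuming you have downloaded the `results_2024-01-20T12-08-51.547790.json` file linked above and that its top level matches the excerpt shown:
```python
import json

# Print the headline aggregates from the "all" entry of the results file.
with open("results_2024-01-20T12-08-51.547790.json") as f:
    results = json.load(f)

overall = results["all"]
print(f"acc      = {overall['acc']:.4f} +/- {overall['acc_stderr']:.4f}")
print(f"acc_norm = {overall['acc_norm']:.4f} +/- {overall['acc_norm_stderr']:.4f}")
```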
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4_prune | [
"region:us"
] | 2024-01-20T12:11:06+00:00 | {"pretty_name": "Evaluation run of alnrg2arg/blockchainlabs_7B_merged_test2_4_prune", "dataset_summary": "Dataset automatically created during the evaluation run of model [alnrg2arg/blockchainlabs_7B_merged_test2_4_prune](https://huggingface.co/alnrg2arg/blockchainlabs_7B_merged_test2_4_prune) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4_prune\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T12:08:51.547790](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4_prune/blob/main/results_2024-01-20T12-08-51.547790.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5235864683456122,\n \"acc_stderr\": 0.0342174975692429,\n \"acc_norm\": 0.5284479425508523,\n \"acc_norm_stderr\": 0.03496859005639417,\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.5902640868436692,\n \"mc2_stderr\": 0.015985277759229078\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522084,\n \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.014280522667467325\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5762796255725952,\n \"acc_stderr\": 0.004931372657129799,\n \"acc_norm\": 0.7774347739494125,\n \"acc_norm_stderr\": 0.004151185615952065\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.03070948699255655,\n \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.03070948699255655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n 
\"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.4508670520231214,\n \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.02455229220934266,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.02455229220934266\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n \"acc_stderr\": 0.02757596072327823,\n \"acc_norm\": 0.6225806451612903,\n \"acc_norm_stderr\": 0.02757596072327823\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.034819048444388045,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.034819048444388045\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187896,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03427308652999934,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03427308652999934\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n 
\"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.49743589743589745,\n \"acc_stderr\": 0.025350672979412195,\n \"acc_norm\": 0.49743589743589745,\n \"acc_norm_stderr\": 0.025350672979412195\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.46638655462184875,\n \"acc_stderr\": 0.03240501447690071,\n \"acc_norm\": 0.46638655462184875,\n \"acc_norm_stderr\": 0.03240501447690071\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7119266055045872,\n \"acc_stderr\": 0.019416445892636032,\n \"acc_norm\": 0.7119266055045872,\n \"acc_norm_stderr\": 0.019416445892636032\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.033644872860882996,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.033644872860882996\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6962025316455697,\n \"acc_stderr\": 0.029936696387138608,\n \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.029936696387138608\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831029,\n \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831029\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n \"acc_stderr\": 0.024662496845209807,\n \"acc_norm\": 0.8290598290598291,\n \"acc_norm_stderr\": 0.024662496845209807\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 
0.049431107042371025\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.719029374201788,\n \"acc_stderr\": 0.016073127851221232,\n \"acc_norm\": 0.719029374201788,\n \"acc_norm_stderr\": 0.016073127851221232\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.02651126136940925,\n \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.02651126136940925\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3474860335195531,\n \"acc_stderr\": 0.01592556406020815,\n \"acc_norm\": 0.3474860335195531,\n \"acc_norm_stderr\": 0.01592556406020815\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.02835895631342355,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.02835895631342355\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5530546623794212,\n \"acc_stderr\": 0.028237769422085335,\n \"acc_norm\": 0.5530546623794212,\n \"acc_norm_stderr\": 0.028237769422085335\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5308641975308642,\n \"acc_stderr\": 0.027767689606833932,\n \"acc_norm\": 0.5308641975308642,\n \"acc_norm_stderr\": 0.027767689606833932\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4148936170212766,\n \"acc_stderr\": 0.0293922365846125,\n \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.0293922365846125\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3709256844850065,\n \"acc_stderr\": 0.012337391684530312,\n \"acc_norm\": 0.3709256844850065,\n \"acc_norm_stderr\": 0.012337391684530312\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.020192808271433795,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.020192808271433795\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.04724577405731571,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.04724577405731571\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.031680911612338825,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.031680911612338825\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n \"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.6965174129353234,\n \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.5902640868436692,\n \"mc2_stderr\": 0.015985277759229078\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275626\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.21455648218347234,\n \"acc_stderr\": 0.011307604104052885\n }\n}\n```", "repo_url": "https://huggingface.co/alnrg2arg/blockchainlabs_7B_merged_test2_4_prune", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|arc:challenge|25_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|gsm8k|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hellaswag|10_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-08-51.547790.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-08-51.547790.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-08-51.547790.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T12-08-51.547790.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T12-08-51.547790.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["**/details_harness|winogrande|5_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-20T12-08-51.547790.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T12_08_51.547790", "path": ["results_2024-01-20T12-08-51.547790.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T12-08-51.547790.parquet"]}]}]} | 2024-01-20T12:11:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_7B_merged_test2_4_prune
Dataset automatically created during the evaluation run of model alnrg2arg/blockchainlabs_7B_merged_test2_4_prune on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
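A minimal sketch, assuming the leaderboard's standard `details_<org>__<model>` repository naming and the `harness_winogrande_5` configuration listed for this run:

```python
from datasets import load_dataset

# Repository path assumed from the leaderboard's usual naming convention.
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4_prune",
	"harness_winogrande_5",
	split="train")
```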
## Latest results
These are the latest results from run 2024-01-20T12:08:51.547790 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_7B_merged_test2_4_prune\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/blockchainlabs_7B_merged_test2_4_prune on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T12:08:51.547790(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_7B_merged_test2_4_prune\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/blockchainlabs_7B_merged_test2_4_prune on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T12:08:51.547790(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
fa305749d0ab8819151b883d5e6af45f4543b2d7 | 1 GB Russian-English dataset containing articles from [Habr](https://habr.com/ru/articles/) and [Wikipedia](https://ru.wikipedia.org/wiki/%D0%97%D0%B0%D0%B3%D0%BB%D0%B0%D0%B2%D0%BD%D0%B0%D1%8F_%D1%81%D1%82%D1%80%D0%B0%D0%BD%D0%B8%D1%86%D0%B0). | gozh/habr_and_wikipedia | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:ru",
"language:en",
"license:mit",
"region:us"
] | 2024-01-20T12:21:52+00:00 | {"language": ["ru", "en"], "license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation"], "pretty_name": "wikihabr"} | 2024-01-20T17:33:30+00:00 | [] | [
"ru",
"en"
] | TAGS
#task_categories-text-generation #size_categories-100K<n<1M #language-Russian #language-English #license-mit #region-us
| 1 GB Russian-English dataset containing articles from Habr and Wikipedia. | [] | [
"TAGS\n#task_categories-text-generation #size_categories-100K<n<1M #language-Russian #language-English #license-mit #region-us \n"
] |
7680a46382d2a7a9dc7be33e0bdd461e61ec451f | # lilac/glaive-code-assistant
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/glaiveai/glaive-code-assistant](https://huggingface.co/datasets/glaiveai/glaive-code-assistant)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-glaive-code-assistant
```
or from python with:
```py
import lilac as ll

# Download the Lilac dataset from the HuggingFace hub to a local directory.
ll.download("lilacai/lilac-glaive-code-assistant")
```
| lilacai/lilac-glaive-code-assistant | [
"Lilac",
"region:us"
] | 2024-01-20T12:58:55+00:00 | {"tags": ["Lilac"]} | 2024-01-20T13:37:38+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/glaive-code-assistant
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/glaive-code-assistant\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/glaive-code-assistant\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
6fca3c3e3f64447b0f94ae31a3a88332c7e56244 | # Dataset Card for "instruction_data_train_hf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Coooori/instruction_data_train_hf | [
"region:us"
] | 2024-01-20T13:14:20+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9997995, "num_examples": 8943}], "download_size": 1879848, "dataset_size": 9997995}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-26T12:41:35+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "instruction_data_train_hf"
More Information needed | [
"# Dataset Card for \"instruction_data_train_hf\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"instruction_data_train_hf\"\n\nMore Information needed"
] |
fc36e4d5ed52b7e1710503de6052e2ab0345322b | # Dataset Card for "instruction_data_dev_hf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Coooori/instruction_data_dev_hf | [
"region:us"
] | 2024-01-20T13:14:25+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1205865, "num_examples": 1087}], "download_size": 234027, "dataset_size": 1205865}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-26T12:41:43+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "instruction_data_dev_hf"
More Information needed | [
"# Dataset Card for \"instruction_data_dev_hf\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"instruction_data_dev_hf\"\n\nMore Information needed"
] |
edbc66cd00ba5337d4cce5fbf25793f54fe655e9 | # Dataset Card for "instruction_data_test_hf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Coooori/instruction_data_test_hf | [
"region:us"
] | 2024-01-20T13:14:30+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1185067, "num_examples": 1099}], "download_size": 228178, "dataset_size": 1185067}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-26T12:41:50+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "instruction_data_test_hf"
More Information needed | [
"# Dataset Card for \"instruction_data_test_hf\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"instruction_data_test_hf\"\n\nMore Information needed"
] |
04e5989422b9dbae826df68f349ad4268c5b346e | # Dataset Card for "whalley_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | MartinKu/whalley_dataset | [
"region:us"
] | 2024-01-20T13:21:32+00:00 | {"dataset_info": {"features": [{"name": "TEXT", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1462302, "num_examples": 2682}], "download_size": 823459, "dataset_size": 1462302}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-20T13:45:10+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "whalley_dataset"
More Information needed | [
"# Dataset Card for \"whalley_dataset\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"whalley_dataset\"\n\nMore Information needed"
] |
e32cc5e5cf93d2d8f49c9cced2e07844534b2fff |
# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO_700
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yunconglong/7Bx4_DPO_700](https://huggingface.co/yunconglong/7Bx4_DPO_700) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yunconglong__7Bx4_DPO_700",
"harness_winogrande_5",
split="train")
```
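The aggregated "results" configuration mentioned above can be loaded the same way; a minimal sketch, assuming the `results` config name and the `latest` split used by this card's file index:

```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split tracks the newest results file.
results = load_dataset("open-llm-leaderboard/details_yunconglong__7Bx4_DPO_700",
	"results",
	split="latest")
```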
## Latest results
These are the [latest results from run 2024-01-20T13:27:07.293796](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__7Bx4_DPO_700/blob/main/results_2024-01-20T13-27-07.293796.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6264963262252387,
"acc_stderr": 0.032586235357354824,
"acc_norm": 0.6267812687226082,
"acc_norm_stderr": 0.03325199889771283,
"mc1": 0.5042839657282742,
"mc1_stderr": 0.017502858577371272,
"mc2": 0.6898757991807779,
"mc2_stderr": 0.015194284964225467
},
"harness|arc:challenge|25": {
"acc": 0.6331058020477816,
"acc_stderr": 0.014084133118104296,
"acc_norm": 0.6467576791808873,
"acc_norm_stderr": 0.013967822714840053
},
"harness|hellaswag|10": {
"acc": 0.6800438159729137,
"acc_stderr": 0.00465505930860262,
"acc_norm": 0.8611830312686716,
"acc_norm_stderr": 0.003450488042964998
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895528,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723872,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723872
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.024162780284017724,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.024162780284017724
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871937,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871937
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848043,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848043
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294406999,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294406999
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.01660256461504994,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.01660256461504994
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766002,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766002
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.01272570165695364,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.01272570165695364
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5042839657282742,
"mc1_stderr": 0.017502858577371272,
"mc2": 0.6898757991807779,
"mc2_stderr": 0.015194284964225467
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936654
},
"harness|gsm8k|5": {
"acc": 0.6338134950720242,
"acc_stderr": 0.013270100238748835
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_yunconglong__7Bx4_DPO_700 | [
"region:us"
] | 2024-01-20T13:29:21+00:00 | {"pretty_name": "Evaluation run of yunconglong/7Bx4_DPO_700", "dataset_summary": "Dataset automatically created during the evaluation run of model [yunconglong/7Bx4_DPO_700](https://huggingface.co/yunconglong/7Bx4_DPO_700) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__7Bx4_DPO_700\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T13:27:07.293796](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__7Bx4_DPO_700/blob/main/results_2024-01-20T13-27-07.293796.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6264963262252387,\n \"acc_stderr\": 0.032586235357354824,\n \"acc_norm\": 0.6267812687226082,\n \"acc_norm_stderr\": 0.03325199889771283,\n \"mc1\": 0.5042839657282742,\n \"mc1_stderr\": 0.017502858577371272,\n \"mc2\": 0.6898757991807779,\n \"mc2_stderr\": 0.015194284964225467\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6331058020477816,\n \"acc_stderr\": 0.014084133118104296,\n \"acc_norm\": 0.6467576791808873,\n \"acc_norm_stderr\": 0.013967822714840053\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6800438159729137,\n \"acc_stderr\": 0.00465505930860262,\n \"acc_norm\": 0.8611830312686716,\n \"acc_norm_stderr\": 0.003450488042964998\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n 
\"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895528,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723872,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723872\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.024162780284017724,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.024162780284017724\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871937,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871937\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848043,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848043\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8109833971902938,\n \"acc_stderr\": 0.014000791294406999,\n \"acc_norm\": 0.8109833971902938,\n \"acc_norm_stderr\": 0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n \"acc_stderr\": 0.01660256461504994,\n \"acc_norm\": 0.4402234636871508,\n \"acc_norm_stderr\": 0.01660256461504994\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766002,\n \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766002\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n \"acc_stderr\": 0.01272570165695364,\n \"acc_norm\": 0.45827900912646674,\n \"acc_norm_stderr\": 0.01272570165695364\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.01943177567703731,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.01943177567703731\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5042839657282742,\n \"mc1_stderr\": 0.017502858577371272,\n \"mc2\": 0.6898757991807779,\n \"mc2_stderr\": 0.015194284964225467\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936654\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6338134950720242,\n \"acc_stderr\": 0.013270100238748835\n 
}\n}\n```", "repo_url": "https://huggingface.co/yunconglong/7Bx4_DPO_700", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|arc:challenge|25_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|gsm8k|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hellaswag|10_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T13-27-07.293796.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T13-27-07.293796.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T13-27-07.293796.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T13-27-07.293796.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T13-27-07.293796.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T13_27_07.293796", "path": ["**/details_harness|winogrande|5_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T13-27-07.293796.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_20T13_27_07.293796", "path": ["results_2024-01-20T13-27-07.293796.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T13-27-07.293796.parquet"]}]}]} | 2024-01-20T13:29:42+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO_700
Dataset automatically created during the evaluation run of model yunconglong/7Bx4_DPO_700 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
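
A minimal loading sketch, reconstructed from the pattern the sibling cards in this dump use (the repository name follows the leaderboard's `details_<org>__<model>` naming convention and is inferred rather than quoted from this card):

```python
from datasets import load_dataset

# Repository name inferred from the leaderboard naming convention.
data = load_dataset("open-llm-leaderboard/details_yunconglong__7Bx4_DPO_700",
	"harness_winogrande_5",
	split="train")
```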
## Latest results
These are the latest results from run 2024-01-20T13:27:07.293796 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO_700\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/7Bx4_DPO_700 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T13:27:07.293796(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO_700\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/7Bx4_DPO_700 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T13:27:07.293796(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
c04c0d69682ed4df01e844ac669b4466a1921e34 |
# Dataset Card for Evaluation run of yunconglong/10.7Bx2_DPO_200
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yunconglong/10.7Bx2_DPO_200](https://huggingface.co/yunconglong/10.7Bx2_DPO_200) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yunconglong__10.7Bx2_DPO_200",
"harness_winogrande_5",
split="train")
```
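
The aggregated metrics live in the "results" configuration; a sketch of loading them, assuming the "latest" split name listed in the config metadata of these cards:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" always points to the newest run.
results = load_dataset("open-llm-leaderboard/details_yunconglong__10.7Bx2_DPO_200",
	"results",
	split="latest")
```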
## Latest results
These are the [latest results from run 2024-01-20T13:31:45.156743](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__10.7Bx2_DPO_200/blob/main/results_2024-01-20T13-31-45.156743.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6651670611224345,
"acc_stderr": 0.031427367721252715,
"acc_norm": 0.6668881977957413,
"acc_norm_stderr": 0.03205462104028336,
"mc1": 0.5924112607099143,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.7538120166331955,
"mc2_stderr": 0.014190041419041042
},
"harness|arc:challenge|25": {
"acc": 0.681740614334471,
"acc_stderr": 0.013611993916971451,
"acc_norm": 0.7022184300341296,
"acc_norm_stderr": 0.013363080107244484
},
"harness|hellaswag|10": {
"acc": 0.7027484564827724,
"acc_stderr": 0.004561141293448453,
"acc_norm": 0.8822943636725752,
"acc_norm_stderr": 0.003216006357760382
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810535,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810535
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.025680564640056882,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.025680564640056882
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823078,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823078
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880232,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880232
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328971,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328971
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291943,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291943
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590172,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590172
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997865,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997865
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8774509803921569,
"acc_stderr": 0.023015389732458254,
"acc_norm": 0.8774509803921569,
"acc_norm_stderr": 0.023015389732458254
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.0345727283691767,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.0345727283691767
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.01655860163604104,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.01655860163604104
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818777,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818777
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262196,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262196
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49022164276401564,
"acc_stderr": 0.012767793787729336,
"acc_norm": 0.49022164276401564,
"acc_norm_stderr": 0.012767793787729336
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7536764705882353,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.7536764705882353,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.018850084696468723,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.018850084696468723
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.02752963744017492,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.02752963744017492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801301,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5924112607099143,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.7538120166331955,
"mc2_stderr": 0.014190041419041042
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613983
},
"harness|gsm8k|5": {
"acc": 0.6095526914329037,
"acc_stderr": 0.013437829864668582
}
}
```
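
As a rough reading aid, each `acc_stderr` above can be turned into an approximate 95% confidence interval under a normal approximation (an assumption on our part, not something these cards state); for example, for the Winogrande score:

```python
# Approximate 95% CI under a normal approximation (assumption, not stated by the card).
acc, stderr = 0.819258089976322, 0.010814911009613983  # harness|winogrande|5 above
low, high = acc - 1.96 * stderr, acc + 1.96 * stderr
print(f"winogrande acc ~ {acc:.3f} (95% CI {low:.3f}-{high:.3f})")  # ~0.819 (0.798-0.840)
```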
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-20T13:34:01+00:00 | {"pretty_name": "Evaluation run of yunconglong/10.7Bx2_DPO_200", "dataset_summary": "Dataset automatically created during the evaluation run of model [yunconglong/10.7Bx2_DPO_200](https://huggingface.co/yunconglong/10.7Bx2_DPO_200) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__10.7Bx2_DPO_200\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T13:31:45.156743](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__10.7Bx2_DPO_200/blob/main/results_2024-01-20T13-31-45.156743.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6651670611224345,\n \"acc_stderr\": 0.031427367721252715,\n \"acc_norm\": 0.6668881977957413,\n \"acc_norm_stderr\": 0.03205462104028336,\n \"mc1\": 0.5924112607099143,\n \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.7538120166331955,\n \"mc2_stderr\": 0.014190041419041042\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.681740614334471,\n \"acc_stderr\": 0.013611993916971451,\n \"acc_norm\": 0.7022184300341296,\n \"acc_norm_stderr\": 0.013363080107244484\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7027484564827724,\n \"acc_stderr\": 0.004561141293448453,\n \"acc_norm\": 0.8822943636725752,\n \"acc_norm_stderr\": 0.003216006357760382\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810535,\n \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810535\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695248,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695248\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.025680564640056882,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.025680564640056882\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880232,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880232\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328971,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328971\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291943,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291943\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590172,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590172\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997865,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997865\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8774509803921569,\n \"acc_stderr\": 0.023015389732458254,\n \"acc_norm\": 0.8774509803921569,\n \"acc_norm_stderr\": 0.023015389732458254\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.0345727283691767,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.0345727283691767\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n \"acc_stderr\": 0.01655860163604104,\n \"acc_norm\": 0.4301675977653631,\n \"acc_norm_stderr\": 0.01655860163604104\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818777,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818777\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262196,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262196\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49022164276401564,\n \"acc_stderr\": 0.012767793787729336,\n \"acc_norm\": 0.49022164276401564,\n \"acc_norm_stderr\": 0.012767793787729336\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.018850084696468723,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.018850084696468723\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017492,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017492\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801301,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5924112607099143,\n \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.7538120166331955,\n \"mc2_stderr\": 0.014190041419041042\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613983\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6095526914329037,\n \"acc_stderr\": 0.013437829864668582\n 
}\n}\n```", "repo_url": "https://huggingface.co/yunconglong/10.7Bx2_DPO_200", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|arc:challenge|25_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|gsm8k|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hellaswag|10_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T13-31-45.156743.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T13-31-45.156743.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T13-31-45.156743.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T13-31-45.156743.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T13-31-45.156743.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T13_31_45.156743", "path": ["**/details_harness|winogrande|5_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T13-31-45.156743.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_20T13_31_45.156743", "path": ["results_2024-01-20T13-31-45.156743.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T13-31-45.156743.parquet"]}]}]} | 2024-01-20T13:34:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yunconglong/10.7Bx2_DPO_200
Dataset automatically created during the evaluation run of model yunconglong/10.7Bx2_DPO_200 on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-20T13:31:45.156743(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yunconglong/10.7Bx2_DPO_200\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/10.7Bx2_DPO_200 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T13:31:45.156743(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yunconglong/10.7Bx2_DPO_200\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/10.7Bx2_DPO_200 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T13:31:45.156743(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9f2e0187329972e2eccc2f66817e2296ec4106f9 |
# Synthetic Clinical Notes
This dataset is a post-processed version of [starmpcc/Asclepius-Synthetic-Clinical-Notes](https://huggingface.co/datasets/starmpcc/Asclepius-Synthetic-Clinical-Notes):
- Converted into Alpaca format (`instruction`, `input`, and `output` columns)
- Added embeddings for the `input` and `output` columns using [BAAI/bge-small-en-v1.5](https://huggingface.co/datasets/BAAI/bge-small-en-v1.5) (a minimal reproduction sketch follows the table below)
| Property              | Details                                            |
| --------------------- | -------------------------------------------------- |
| Sample Count          | 158K                                               |
| Token Count           | 648M                                               |
| Origin                | https://figshare.com/authors/Zhengyun_Zhao/16480335|
| Source of raw data    | PubMed Central (PMC) and MIMIC-III                 |
| Processing details | [original](https://huggingface.co/datasets/starmpcc/Asclepius-Synthetic-Clinical-Notes), [paper](https://arxiv.org/pdf/2309.00237.pdf) <a target="_blank" href="https://colab.research.google.com/drive/12nk-nLo46P8GOVqpBIA2wDAYj5SnUGW5?usp=sharing"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/></a> |
| Embedding Model | [BAAI/bge-small-en-v1.5](https://huggingface.co/datasets/BAAI/bge-small-en-v1.5) |
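The sketch below illustrates how the embedding step above could be reproduced with the `sentence-transformers` library. It is a minimal sketch under stated assumptions: the embedding column names (`input_embedding`, `output_embedding`) and the batch size are illustrative and are not guaranteed to match the published schema.

```python
# Minimal sketch of the embedding step, assuming sentence-transformers is installed.
# Column names "input_embedding"/"output_embedding" are illustrative assumptions.
from datasets import load_dataset
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("BAAI/bge-small-en-v1.5")

def add_embeddings(batch):
    # normalize_embeddings=True returns unit-length vectors, so downstream
    # cosine similarity reduces to a plain dot product
    batch["input_embedding"] = model.encode(batch["input"], normalize_embeddings=True).tolist()
    batch["output_embedding"] = model.encode(batch["output"], normalize_embeddings=True).tolist()
    return batch

ds = load_dataset("Technoculture/Synthetic-Clinical-Notes", split="train")
ds = ds.map(add_embeddings, batched=True, batch_size=64)
```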
## Data Diversity
| index | Example Output | GPT-4 Rationale | GPT-4 Diversity Rating |
|-------|----------------|-----------------|------------------------|
| 137083| The coreferential expressions used to refer to the patient's severe bioprosthetic mitral valve stenosis and severe tricuspid regurgitation in the hospital course section of the discharge summary were "the patient had an irregular heartbeat with a diastolic murmur detected by auscultation" and "Transthoracic echocardiography revealed severe bioprosthetic mitral valve stenosis and severe tricuspid regurgitation." | Cardiology, Diagnostic Imaging, Physical Examination | 5 |
| 113558| The coreference resolved in the hospital course section related to the patient's perforation in the sigmoid colon is that the perforation found in the colon was 3-cm long and located 5cm above the anastomosis. This led to a colon segmental resection with loop sigmoid colostomy and subsequent recovery with no complications. | Gastrointestinal Surgery, Perforation Location, Post-surgical Recovery | 5 |
| 97204 | The prescribed biologic medications, Adalimumab and later Certolizumab, were used to treat the resurgence of the patient's tattoo manifestations after tapering of systemic glucocorticoids, but Adalimumab caused an injection site reaction, which prompted a change to Certolizumab. | Pharmacology, Medication Adjustment, Treatment Complications | 5 |
| 53669 | In the hospital course of the discharge summary, coreferences for the patient's respiratory status are resolved using terms such as "her pulmonary clinical signs," "she presented no signs of septic shock," and "her clinical condition finally improved." Coreferences for the patient's treatment are resolved using phrases such as "she was given three doses of spiramycin," "antimicrobial therapy with ceftriaxone was initiated," and "triple antimicrobial therapy with piperacillin-tazobactam, spiramycin, and amikacin was introduced." | Respiratory Infection, Antimicrobial Therapy, Clinical Improvement | 5 |
| 39865 | Using Named Entity Recognition in the discharge summary, the identified named entities related to Stickler syndrome are "Stickler syndrome" and "beaded vitreous phenotype." The identified named entities related to diagnostic testing are "Multiplex Ligation-dependent Probe Amplification (MLPA)" and "exons 41 and 42 [c.3025-3168, p.Gly1009-Val1056]." However, it should be noted that the discharge summary does not provide a comprehensive list of all named entities related to Stickler syndrome and diagnostic testing, and further review of the patient's medical records may be necessary for a complete analysis. | Genetic Testing, Stickler Syndrome, Diagnostic Specificity | 5 |
| 85187 | The patient was diagnosed with metastatic Leydig cell tumour of the spine and underwent surgery through a right subscapular 3rd rib thoracotomy followed by postoperative radiotherapy with radical intent. The patient is advised to follow up regularly as per oncologist's advice and to come back immediately in case of any medical emergency. No discharge medications were given as per the discharge summary. | Oncology, Surgical Approach, Radiotherapy | 5 |
| 99107 | The patient had a complicated problem with their heart's aortic valve and the wall dividing the two chambers of their heart. The valve became detached and the wall had growths on it, likely from an infection. Despite treatment, the patient's condition worsened and they were made comfortable with symptom control and palliative care before passing away. | Cardiac Condition, Palliative Care, End-of-Life | 5 |
| 65981 | The diagnosis for the 10-year-old female patient was a non-displaced scaphoid fracture, and the diagnostic studies used were a dual-energy computed tomography (DECT) scan which showed bone marrow edema (BME) in the scaphoid bone on VNCa images and a confirmatory magnetic resonance imaging (MRI). | Pediatric Orthopedics, Advanced Imaging, Fracture Diagnosis | 5 |
| 68814 | The expanded forms of the abbreviations in the hospital course section are: transnasal endoscopic excision (removal of pituitary adenoma using an endoscope through the nasal cavity) and MRN (medical record number). The diagnosis section abbreviations do not need expansion as they are already spelled out (pituitary adenoma). | Endoscopic Surgery, Pituitary Adenoma, Abbreviation Clarification | 5 |
| 16059 | Based on the given discharge summary, the named entities related to Patient 1's diagnosis of influenza B that can be identified are the diagnosis itself and the prescribed medication, oseltamivir. However, there is no mention of the patient's prior immunization history or any recommendations for future vaccination. Therefore, we cannot fully respond to the healthcare professional's instruction regarding receiving the influenza vaccination to prevent future infections. | Infectious Disease, Influenza B Treatment, Pharmacological Management | 5 |

## Data Lineage
```text
Technoculture/Synthetic-Clinical-Notes
↳ starmpcc/Asclepius-Synthetic-Clinical-Notes
↳ zhengyun21/PMC-Patients [code](https://github.com/zhao-zy15/PMC-Patients)
↳ PubMed Central (PMC)
```
---
> prompt for GPT-4 based annotation on diversity
> ```text
> | index | Example Output |
> |--------|---------------|
> | 137083 | The coreferential expressions used to refer to the patient's severe bioprosthetic mitral valve stenosis and severe tricuspid regurgitation in the hospital course section of the discharge summary were "the patient had an irregular heartbeat with a diastolic murmur detected by auscultation" and "Transthoracic echocardiography revealed severe bioprosthetic mitral valve stenosis and severe tricuspid regurgitation." |
>
> for each row, add 2 columns.
>
> Column 3 named 'GPT-4 Rationale': Rationale for how it is similar and/or diverse with respect to all the other examples in the table.
> Column 4 named 'GPT-4 Diversity Rating': mark for how diverse the example is from all the other examples in the table.
>
> Rating System:
> 0-1: Not Diverse - Almost identical to another example in the table
> 2-3: Very Similar - A somewhat similar example exists in the table
> 4: Fairly Diverse - A fairly dissimilar example from any other example in the table
> 5: Very Diverse - Completely dissimilar to any other example in the table
>
> Return escaped markdown so it can be copied pasted as is.
> ``` | Technoculture/synthetic-clinical-notes-embedded | [
"task_categories:question-answering",
"task_categories:summarization",
"size_categories:100K<n<1M",
"language:en",
"license:mit",
"starmpcc/Asclepius-Synthetic-Clinical-Notes",
"BAAI/bge-small-en-v1.5",
"medical",
"arxiv:2309.00237",
"region:us"
] | 2024-01-20T13:38:35+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["question-answering", "summarization"], "pretty_name": "Synthetic Clinical Notes", "tags": ["starmpcc/Asclepius-Synthetic-Clinical-Notes", "BAAI/bge-small-en-v1.5", "medical"], "dataset_info": {"features": [{"name": "output", "dtype": "string"}, {"name": "task", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "input_embedding", "sequence": "float32"}, {"name": "output_embedding", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 1199998956, "num_examples": 158114}], "download_size": 967764780, "dataset_size": 1199998956}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-11T17:39:23+00:00 | [
"2309.00237"
] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-summarization #size_categories-100K<n<1M #language-English #license-mit #starmpcc/Asclepius-Synthetic-Clinical-Notes #BAAI/bge-small-en-v1.5 #medical #arxiv-2309.00237 #region-us
| Synthetic Clinical Notes
========================
This dataset is a post-processed version of starmpcc/Asclepius-Synthetic-Clinical-Notes:

* Converted into Alpaca format ('instruction', 'input', and 'output')
* Added embeddings for the 'input' and 'output' columns using BAAI/bge-small-en-v1.5
Data Diversity
--------------
!image/png
Data Lineage
------------
---
>
> prompt for GPT-4 based annotation on diversity
>
>
>
| [] | [
"TAGS\n#task_categories-question-answering #task_categories-summarization #size_categories-100K<n<1M #language-English #license-mit #starmpcc/Asclepius-Synthetic-Clinical-Notes #BAAI/bge-small-en-v1.5 #medical #arxiv-2309.00237 #region-us \n"
] |
8a3e3218be45dbca82d9a38b2970fa067f190085 | # Dataset Card for "whalley_dataset_ver1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | MartinKu/whalley_dataset_ver1 | [
"region:us"
] | 2024-01-20T13:47:01+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "TEXT", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1462302, "num_examples": 2682}, {"name": "validation", "num_bytes": 1462302, "num_examples": 2682}, {"name": "test", "num_bytes": 1462302, "num_examples": 2682}], "download_size": 0, "dataset_size": 4386906}} | 2024-01-20T14:12:35+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "whalley_dataset_ver1"
More Information needed | [
"# Dataset Card for \"whalley_dataset_ver1\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"whalley_dataset_ver1\"\n\nMore Information needed"
] |
001c0cc8400fb1b1caaf1e021a60d258be106e8c | # Dataset Card for "MathInstruct-Core-DifficultyAware"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tongyx361/MathInstruct-Core-DifficultyAware | [
"region:us"
] | 2024-01-20T13:59:06+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "difficulty_level", "dtype": "int64"}, {"name": "output", "dtype": "string"}, {"name": "err_rate", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 100411438, "num_examples": 168810}], "download_size": 53068288, "dataset_size": 100411438}} | 2024-01-20T13:59:15+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "MathInstruct-Core-DifficultyAware"
More Information needed | [
"# Dataset Card for \"MathInstruct-Core-DifficultyAware\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"MathInstruct-Core-DifficultyAware\"\n\nMore Information needed"
] |
28307256f57a161ff2ed9d6941647b621d629a4e | # Dataset Card for "whalley_dataset_ver2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | MartinKu/whalley_dataset_ver2 | [
"region:us"
] | 2024-01-20T14:13:21+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "TEXT", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1426305, "num_examples": 2200}, {"name": "validation", "num_bytes": 1426305, "num_examples": 2200}, {"name": "test", "num_bytes": 1426305, "num_examples": 2200}], "download_size": 1887594, "dataset_size": 4278915}} | 2024-01-20T14:16:51+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "whalley_dataset_ver2"
More Information needed | [
"# Dataset Card for \"whalley_dataset_ver2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"whalley_dataset_ver2\"\n\nMore Information needed"
] |
9cf2d68a31fce4e17f821be8b9bbbd10d2733f47 | Meta table on MySQL
```sql
mysql> DESCRIBE danbooru_posts;
+------------------------+-------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+------------------------+-------------+------+-----+---------+----------------+
| id | bigint | NO | PRI | NULL | auto_increment |
| created_at | datetime | YES | | NULL | |
| uploader_id | int | YES | | NULL | |
| score | int | YES | | NULL | |
| source | text | YES | | NULL | |
| md5 | varchar(32) | YES | | NULL | |
| last_comment_bumped_at | datetime | YES | | NULL | |
| rating | varchar(1) | YES | | NULL | |
| image_width | int | YES | | NULL | |
| image_height | int | YES | | NULL | |
| tag_string | text | YES | | NULL | |
| fav_count | int | YES | | NULL | |
| file_ext | varchar(8) | YES | | NULL | |
| last_noted_at | datetime | YES | | NULL | |
| parent_id | int | YES | | NULL | |
| has_children | tinyint(1) | YES | | NULL | |
| approver_id | int | YES | | NULL | |
| tag_count_general | int | YES | | NULL | |
| tag_count_artist | int | YES | | NULL | |
| tag_count_character | int | YES | | NULL | |
| tag_count_copyright | int | YES | | NULL | |
| file_size | int | YES | | NULL | |
| up_score | int | YES | | NULL | |
| down_score | int | YES | | NULL | |
| is_pending | tinyint(1) | YES | | NULL | |
| is_flagged | tinyint(1) | YES | | NULL | |
| is_deleted | tinyint(1) | YES | | NULL | |
| tag_count | int | YES | | NULL | |
| updated_at | datetime | YES | | NULL | |
| is_banned | tinyint(1) | YES | | NULL | |
| pixiv_id | int | YES | | NULL | |
| last_commented_at | datetime | YES | | NULL | |
| has_active_children | tinyint(1) | YES | | NULL | |
| bit_flags | int | YES | | NULL | |
| tag_count_meta | int | YES | | NULL | |
| has_large | tinyint(1) | YES | | NULL | |
| has_visible_children | tinyint(1) | YES | | NULL | |
| media_asset | json | YES | | NULL | |
| tag_string_general | text | YES | | NULL | |
| tag_string_character | text | YES | | NULL | |
| tag_string_copyright | text | YES | | NULL | |
| tag_string_artist | text | YES | | NULL | |
| tag_string_meta | text | YES | | NULL | |
| file_url | text | YES | | NULL | |
| large_file_url | text | YES | | NULL | |
| preview_file_url | text | YES | | NULL | |
+------------------------+-------------+------+-----+---------+----------------+
```
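For illustration, once the dump is restored, a query along these lines pulls the highest-scoring posts; this is a sketch only, and the rating code `g` (general) is an assumption based on Danbooru's rating scheme.

```sql
-- Hypothetical example: ten highest-scoring general-rated posts
-- that are still live, with their artist tags.
SELECT id, score, fav_count, tag_string_artist
FROM danbooru_posts
WHERE rating = 'g'
  AND is_deleted = 0
ORDER BY score DESC
LIMIT 10;
```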
You can restore it with `mysql -u <your user> -p ACG < danbooru.sql`; create the `ACG` database first if it does not exist. | Chars/Danbooru2023-Meta | [
"region:us"
] | 2024-01-20T14:15:13+00:00 | {} | 2024-01-20T15:45:45+00:00 | [] | [] | TAGS
#region-us
| Meta table on MySQL
You can restore it with 'mysql -u <your user> -p ACG < URL'; create the 'ACG' database first if it does not exist. | [] | [
"TAGS\n#region-us \n"
] |
d1ae81e198f9bf53658cd5136d16f35ab7144667 | # Dataset Card for "math_23k_double_standalone"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | zhangshuoming/math_23k_double_standalone | [
"region:us"
] | 2024-01-20T14:31:26+00:00 | {"dataset_info": {"features": [{"name": "text", "struct": [{"name": "asm", "dtype": "string"}, {"name": "c", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 27029874, "num_examples": 21104}], "download_size": 0, "dataset_size": 27029874}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-21T17:51:28+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "math_23k_double_standalone"
More Information needed | [
"# Dataset Card for \"math_23k_double_standalone\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"math_23k_double_standalone\"\n\nMore Information needed"
] |
a429cd99ad2ed36db5d7a6c2787e479e076fa30a | # Dataset Card for "math_23k_double_value_init"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | zhangshuoming/math_23k_double_value_init | [
"region:us"
] | 2024-01-20T14:35:10+00:00 | {"dataset_info": {"features": [{"name": "text", "struct": [{"name": "asm", "dtype": "string"}, {"name": "c", "dtype": "string"}, {"name": "driver", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 22718470, "num_examples": 21104}], "download_size": 0, "dataset_size": 22718470}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-21T17:55:10+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "math_23k_double_value_init"
More Information needed | [
"# Dataset Card for \"math_23k_double_value_init\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"math_23k_double_value_init\"\n\nMore Information needed"
] |
631dcf43cfab6a4cef394ea963257c6dceddf123 | MIT
A Q8_0 LoRA adapter for Mistral-7B-v0.1, fine-tuned on Shakespeare.
| segmond/mistral_7b_v0_1_q8_0_shakespeare_lora | [
"region:us"
] | 2024-01-20T14:36:26+00:00 | {} | 2024-01-20T17:14:02+00:00 | [] | [] | TAGS
#region-us
| MIT
A Q8_0 LoRA adapter for Mistral-7B-v0.1, fine-tuned on Shakespeare.
| [] | [
"TAGS\n#region-us \n"
] |
e4bf8edbc6a0bab6decf3ecbbf904117dde58423 | # lilac/open-assistant-conversations-2
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/OpenAssistant/oasst2](https://huggingface.co/datasets/OpenAssistant/oasst2)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-open-assistant-conversations-2
```
or from python with:
```py
import lilac as ll

ll.download("lilacai/lilac-open-assistant-conversations-2")
```
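Once downloaded, the dataset can be browsed in the local Lilac UI. A minimal sketch, assuming Lilac's local server entry point:

```py
import lilac as ll

ll.start_server()  # assumed entry point that serves the Lilac UI locally
```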
| lilacai/lilac-open-assistant-conversations-2 | [
"Lilac",
"region:us"
] | 2024-01-20T15:01:10+00:00 | {"tags": ["Lilac"]} | 2024-01-20T15:02:06+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/open-assistant-conversations-2
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/open-assistant-conversations-2\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/open-assistant-conversations-2\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
3afab6909fb3db688d6335fdff385f039b1f8f63 | # Dataset Card for "math_23k_train_numeric_double"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | zhangshuoming/math_23k_train_numeric_double | [
"region:us"
] | 2024-01-20T15:02:31+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21505369.00691812, "num_examples": 21086}], "download_size": 2785918, "dataset_size": 21505369.00691812}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-21T06:46:13+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "math_23k_train_numeric_double"
More Information needed | [
"# Dataset Card for \"math_23k_train_numeric_double\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"math_23k_train_numeric_double\"\n\nMore Information needed"
] |
a3de2b24b3a0ebd806eb47c028d5ec97dd7d8437 |
# Dataset Card for Evaluation run of andrijdavid/Macaroni-7b-Tied
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andrijdavid/Macaroni-7b-Tied](https://huggingface.co/andrijdavid/Macaroni-7b-Tied) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andrijdavid__Macaroni-7b-Tied",
"harness_winogrande_5",
split="train")
```
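The aggregated results mentioned above can be loaded the same way; a sketch, assuming the "results" configuration name described earlier:

```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_andrijdavid__Macaroni-7b-Tied",
	"results",
	split="train")
```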
## Latest results
These are the [latest results from run 2024-01-20T15:12:32.316850](https://huggingface.co/datasets/open-llm-leaderboard/details_andrijdavid__Macaroni-7b-Tied/blob/main/results_2024-01-20T15-12-32.316850.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6532995839608818,
"acc_stderr": 0.03214024172738054,
"acc_norm": 0.6526745103539395,
"acc_norm_stderr": 0.03280863311802526,
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.7054055300758924,
"mc2_stderr": 0.014960422273112972
},
"harness|arc:challenge|25": {
"acc": 0.7064846416382252,
"acc_stderr": 0.013307250444941113,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.712109141605258,
"acc_stderr": 0.004518546274738884,
"acc_norm": 0.8813981278629756,
"acc_norm_stderr": 0.0032265867834212906
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977938,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977938
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163224,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163224
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903341,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.01662399851333311,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.01662399851333311
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.7054055300758924,
"mc2_stderr": 0.014960422273112972
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613985
},
"harness|gsm8k|5": {
"acc": 0.7156937073540561,
"acc_stderr": 0.012425078188395982
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_andrijdavid__Macaroni-7b-Tied | [
"region:us"
] | 2024-01-20T15:14:53+00:00 | {"pretty_name": "Evaluation run of andrijdavid/Macaroni-7b-Tied", "dataset_summary": "Dataset automatically created during the evaluation run of model [andrijdavid/Macaroni-7b-Tied](https://huggingface.co/andrijdavid/Macaroni-7b-Tied) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andrijdavid__Macaroni-7b-Tied\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T15:12:32.316850](https://huggingface.co/datasets/open-llm-leaderboard/details_andrijdavid__Macaroni-7b-Tied/blob/main/results_2024-01-20T15-12-32.316850.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6532995839608818,\n \"acc_stderr\": 0.03214024172738054,\n \"acc_norm\": 0.6526745103539395,\n \"acc_norm_stderr\": 0.03280863311802526,\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7054055300758924,\n \"mc2_stderr\": 0.014960422273112972\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7064846416382252,\n \"acc_stderr\": 0.013307250444941113,\n \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.712109141605258,\n \"acc_stderr\": 0.004518546274738884,\n \"acc_norm\": 0.8813981278629756,\n \"acc_norm_stderr\": 0.0032265867834212906\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977938,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977938\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163224,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163224\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8250319284802043,\n \"acc_stderr\": 0.013586619219903341,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903341\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n \"acc_stderr\": 0.01662399851333311,\n \"acc_norm\": 0.44581005586592176,\n \"acc_norm_stderr\": 0.01662399851333311\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7054055300758924,\n \"mc2_stderr\": 0.014960422273112972\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613985\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7156937073540561,\n \"acc_stderr\": 0.012425078188395982\n 
}\n}\n```", "repo_url": "https://huggingface.co/andrijdavid/Macaroni-7b-Tied", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|arc:challenge|25_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|gsm8k|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hellaswag|10_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T15-12-32.316850.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T15-12-32.316850.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T15-12-32.316850.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T15-12-32.316850.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T15-12-32.316850.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T15_12_32.316850", "path": ["**/details_harness|winogrande|5_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T15-12-32.316850.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_20T15_12_32.316850", "path": ["results_2024-01-20T15-12-32.316850.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T15-12-32.316850.parquet"]}]}]} | 2024-01-20T15:15:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of andrijdavid/Macaroni-7b-Tied
Dataset automatically created during the evaluation run of model andrijdavid/Macaroni-7b-Tied on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
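The loading snippet was stripped from this copy of the card; a minimal sketch follows, mirroring the other cards in this dump. The repo id `open-llm-leaderboard/details_andrijdavid__Macaroni-7b-Tied` is an assumption inferred from the leaderboard's `details_<org>__<model>` naming convention, not stated in this card:
```python
from datasets import load_dataset

# Repo id inferred from the details_<org>__<model> convention (assumption).
data = load_dataset("open-llm-leaderboard/details_andrijdavid__Macaroni-7b-Tied",
	"harness_winogrande_5",
	split="train")
```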
## Latest results
These are the latest results from run 2024-01-20T15:12:32.316850 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of andrijdavid/Macaroni-7b-Tied\n\n\n\nDataset automatically created during the evaluation run of model andrijdavid/Macaroni-7b-Tied on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T15:12:32.316850(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of andrijdavid/Macaroni-7b-Tied\n\n\n\nDataset automatically created during the evaluation run of model andrijdavid/Macaroni-7b-Tied on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T15:12:32.316850(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
c3ed0d20f78a2bbfab07ab809f7b0fa18f50a9f2 | # Code Textbook
200K+ Synthetic Textbook Samples generated with various Open-Source LLMs including **Nous Hermes Mixtral 8x7B, OpenHermes-2.5-Mistral, OpenChat and DeepSeek-Coder**. | vilm/code-textbooks | [
"region:us"
] | 2024-01-20T15:23:05+00:00 | {"dataset_info": {"features": [{"name": "max_stars_repo_path", "dtype": "large_string"}, {"name": "max_stars_repo_name", "dtype": "large_string"}, {"name": "id", "dtype": "large_string"}, {"name": "language", "dtype": "large_string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2407352255, "num_examples": 206644}], "download_size": 894488166, "dataset_size": 2407352255}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-20T15:34:37+00:00 | [] | [] | TAGS
#region-us
| # Code Textbook
200K+ Synthetic Textbook Samples generated with various Open-Source LLMs including Nous Hermes Mixtral 8x7B, OpenHermes-2.5-Mistral, OpenChat and DeepSeek-Coder. | [
"# Code Textbook\n\n200K+ Synthetic Textbook Samples generated with various Open-Source LLMs including Nous Hermes Mixtral 8x7B, OpenHermes-2.5-Mistral, OpenChat and DeepSeek-Coder."
] | [
"TAGS\n#region-us \n",
"# Code Textbook\n\n200K+ Synthetic Textbook Samples generated with various Open-Source LLMs including Nous Hermes Mixtral 8x7B, OpenHermes-2.5-Mistral, OpenChat and DeepSeek-Coder."
] |
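Given the features and split listed in the vilm/code-textbooks metadata above (max_stars_repo_path, max_stars_repo_name, id, language, text; a single train split of 206,644 examples), a minimal, hedged loading sketch:
```python
from datasets import load_dataset

# "default" config with a single "train" split, per the row's dataset_info.
ds = load_dataset("vilm/code-textbooks", split="train")

sample = ds[0]
# Columns listed in the dataset_info above.
print(sample["language"], sample["max_stars_repo_name"])
print(sample["text"][:200])
```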
4ef764d415208cf3c1892a719d9e35f52aac5b37 |
# Dataset Card for Evaluation run of tenyx/TenyxChat-8x7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tenyx/TenyxChat-8x7B-v1](https://huggingface.co/tenyx/TenyxChat-8x7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tenyx__TenyxChat-8x7B-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-20T15:44:33.051558](https://huggingface.co/datasets/open-llm-leaderboard/details_tenyx__TenyxChat-8x7B-v1/blob/main/results_2024-01-20T15-44-33.051558.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7101891541491676,
"acc_stderr": 0.030258624657643698,
"acc_norm": 0.7137715225758183,
"acc_norm_stderr": 0.030842789389844256,
"mc1": 0.5018359853121175,
"mc1_stderr": 0.01750338304687705,
"mc2": 0.6541929389144224,
"mc2_stderr": 0.015163572290637445
},
"harness|arc:challenge|25": {
"acc": 0.6715017064846417,
"acc_stderr": 0.013724978465537298,
"acc_norm": 0.697098976109215,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.6890061740689106,
"acc_stderr": 0.004619542392006391,
"acc_norm": 0.8776140211113324,
"acc_norm_stderr": 0.003270612753613399
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7773584905660378,
"acc_stderr": 0.025604233470899098,
"acc_norm": 0.7773584905660378,
"acc_norm_stderr": 0.025604233470899098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.025733641991838994,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.025733641991838994
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8483870967741935,
"acc_stderr": 0.02040261665441676,
"acc_norm": 0.8483870967741935,
"acc_norm_stderr": 0.02040261665441676
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6157635467980296,
"acc_stderr": 0.03422398565657551,
"acc_norm": 0.6157635467980296,
"acc_norm_stderr": 0.03422398565657551
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695482995,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695482995
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.01349265975129515,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.01349265975129515
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6948717948717948,
"acc_stderr": 0.023346335293325887,
"acc_norm": 0.6948717948717948,
"acc_norm_stderr": 0.023346335293325887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465718,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465718
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8025210084033614,
"acc_stderr": 0.02585916412205145,
"acc_norm": 0.8025210084033614,
"acc_norm_stderr": 0.02585916412205145
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.013708749534172636,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.013708749534172636
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250454,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250454
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017016,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017016
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.028380391147094702,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.028380391147094702
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.030833491146281224,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.030833491146281224
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.625,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436193,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436193
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.879948914431673,
"acc_stderr": 0.011622736692041283,
"acc_norm": 0.879948914431673,
"acc_norm_stderr": 0.011622736692041283
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.01657402721951763,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.01657402721951763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.021828596053108395,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.021828596053108395
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7909967845659164,
"acc_stderr": 0.023093140398374224,
"acc_norm": 0.7909967845659164,
"acc_norm_stderr": 0.023093140398374224
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157365,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157365
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5460992907801419,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.5460992907801419,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5443285528031291,
"acc_stderr": 0.012719949543032228,
"acc_norm": 0.5443285528031291,
"acc_norm_stderr": 0.012719949543032228
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7830882352941176,
"acc_stderr": 0.025035845227711274,
"acc_norm": 0.7830882352941176,
"acc_norm_stderr": 0.025035845227711274
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.01716058723504635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.01716058723504635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.026882144922307744,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.026882144922307744
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.02207632610182466,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.02207632610182466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5018359853121175,
"mc1_stderr": 0.01750338304687705,
"mc2": 0.6541929389144224,
"mc2_stderr": 0.015163572290637445
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435093
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.013428382481274249
}
}
```
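The aggregated numbers above are also exposed through the "results" configuration declared in this card's metadata; a short sketch of pulling the latest aggregate (config and split names taken from that metadata):
```python
from datasets import load_dataset

# "results" config and "latest" split are declared in this card's metadata.
results = load_dataset("open-llm-leaderboard/details_tenyx__TenyxChat-8x7B-v1",
	"results",
	split="latest")
print(results[0])  # one row of aggregated metrics for the latest run
```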
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_tenyx__TenyxChat-8x7B-v1 | [
"region:us"
] | 2024-01-20T15:46:52+00:00 | {"pretty_name": "Evaluation run of tenyx/TenyxChat-8x7B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [tenyx/TenyxChat-8x7B-v1](https://huggingface.co/tenyx/TenyxChat-8x7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tenyx__TenyxChat-8x7B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T15:44:33.051558](https://huggingface.co/datasets/open-llm-leaderboard/details_tenyx__TenyxChat-8x7B-v1/blob/main/results_2024-01-20T15-44-33.051558.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7101891541491676,\n \"acc_stderr\": 0.030258624657643698,\n \"acc_norm\": 0.7137715225758183,\n \"acc_norm_stderr\": 0.030842789389844256,\n \"mc1\": 0.5018359853121175,\n \"mc1_stderr\": 0.01750338304687705,\n \"mc2\": 0.6541929389144224,\n \"mc2_stderr\": 0.015163572290637445\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6715017064846417,\n \"acc_stderr\": 0.013724978465537298,\n \"acc_norm\": 0.697098976109215,\n \"acc_norm_stderr\": 0.013428241573185349\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6890061740689106,\n \"acc_stderr\": 0.004619542392006391,\n \"acc_norm\": 0.8776140211113324,\n \"acc_norm_stderr\": 0.003270612753613399\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7773584905660378,\n \"acc_stderr\": 0.025604233470899098,\n \"acc_norm\": 0.7773584905660378,\n \"acc_norm_stderr\": 0.025604233470899098\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03309615177059006\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451208,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451208\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.025733641991838994,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.025733641991838994\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8483870967741935,\n \"acc_stderr\": 0.02040261665441676,\n \"acc_norm\": 0.8483870967741935,\n \"acc_norm_stderr\": 0.02040261665441676\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6157635467980296,\n \"acc_stderr\": 0.03422398565657551,\n \"acc_norm\": 0.6157635467980296,\n \"acc_norm_stderr\": 0.03422398565657551\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695482995,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695482995\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.01349265975129515,\n \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.01349265975129515\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6948717948717948,\n \"acc_stderr\": 0.023346335293325887,\n \"acc_norm\": 0.6948717948717948,\n \"acc_norm_stderr\": 0.023346335293325887\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465718,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465718\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8025210084033614,\n \"acc_stderr\": 0.02585916412205145,\n \"acc_norm\": 0.8025210084033614,\n \"acc_norm_stderr\": 0.02585916412205145\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8844036697247707,\n \"acc_stderr\": 0.013708749534172636,\n \"acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.013708749534172636\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.03362277436608043,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03362277436608043\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250454,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250454\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017016,\n \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017016\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n \"acc_stderr\": 0.028380391147094702,\n \"acc_norm\": 0.7668161434977578,\n \"acc_norm_stderr\": 0.028380391147094702\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525975,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525975\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.030833491146281224,\n \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.030833491146281224\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n \"acc_stderr\": 0.017456987872436193,\n \"acc_norm\": 0.9230769230769231,\n \"acc_norm_stderr\": 0.017456987872436193\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.879948914431673,\n \"acc_stderr\": 0.011622736692041283,\n \"acc_norm\": 0.879948914431673,\n \"acc_norm_stderr\": 0.011622736692041283\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n \"acc_stderr\": 0.01657402721951763,\n \"acc_norm\": 0.4335195530726257,\n \"acc_norm_stderr\": 0.01657402721951763\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.021828596053108395,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.021828596053108395\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7909967845659164,\n \"acc_stderr\": 0.023093140398374224,\n \"acc_norm\": 0.7909967845659164,\n \"acc_norm_stderr\": 0.023093140398374224\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157365,\n \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157365\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291474,\n \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291474\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5443285528031291,\n \"acc_stderr\": 0.012719949543032228,\n \"acc_norm\": 0.5443285528031291,\n \"acc_norm_stderr\": 0.012719949543032228\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7830882352941176,\n \"acc_stderr\": 0.025035845227711274,\n \"acc_norm\": 0.7830882352941176,\n \"acc_norm_stderr\": 0.025035845227711274\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.01716058723504635,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.01716058723504635\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.026882144922307744,\n \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.026882144922307744\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.02207632610182466,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.02207632610182466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5018359853121175,\n \"mc1_stderr\": 0.01750338304687705,\n \"mc2\": 0.6541929389144224,\n \"mc2_stderr\": 0.015163572290637445\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435093\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \"acc_stderr\": 
0.013428382481274249\n }\n}\n```", "repo_url": "https://huggingface.co/tenyx/TenyxChat-8x7B-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|arc:challenge|25_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|gsm8k|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hellaswag|10_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T15-44-33.051558.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T15-44-33.051558.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T15-44-33.051558.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T15-44-33.051558.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T15-44-33.051558.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T15_44_33.051558", "path": ["**/details_harness|winogrande|5_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T15-44-33.051558.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_20T15_44_33.051558", "path": ["results_2024-01-20T15-44-33.051558.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T15-44-33.051558.parquet"]}]}]} | 2024-01-20T15:47:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of tenyx/TenyxChat-8x7B-v1
Dataset automatically created during the evaluation run of model tenyx/TenyxChat-8x7B-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
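A minimal sketch (the repository id below follows the leaderboard's usual `details_<org>__<model>` naming convention and is an assumption, as is the choice of config):

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's details naming convention;
# any config listed in this card (e.g. "harness_winogrande_5") works here
data = load_dataset("open-llm-leaderboard/details_tenyx__TenyxChat-8x7B-v1",
                    "harness_winogrande_5",
                    split="train")
```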
## Latest results
These are the latest results from run 2024-01-20T15:44:33.051558 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of tenyx/TenyxChat-8x7B-v1\n\n\n\nDataset automatically created during the evaluation run of model tenyx/TenyxChat-8x7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T15:44:33.051558(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of tenyx/TenyxChat-8x7B-v1\n\n\n\nDataset automatically created during the evaluation run of model tenyx/TenyxChat-8x7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T15:44:33.051558(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
dc803be4af7b242e718d3a9e86b7e520e7219729 |
# Simple Math: 2+2=4, 4-1=3 (LoLo: Learning Only Logical Operations)
Just like my teacher gave me homework, I thought maybe we can also add some of these basics to the training of our models.
It was created with very simple code that is in the repo; if you add more complex operations and so on, **please share the code** :D thank you!
Current Code Version: 20240127.fblgit (A modification over @win10 for progressive and DPO operation)
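The actual generator lives in the repo; as a rough, hedged sketch of the idea (not the repo's script — the operations and ranges below are assumptions), a minimal version could look like:

```python
import random

# Minimal sketch of a simple-math generator: instruction/output pairs
# of basic arithmetic, matching the dataset's two fields
ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b}

samples = []
for _ in range(10):
    a, b = random.randint(0, 99), random.randint(0, 99)
    op = random.choice(list(ops))
    samples.append({"instruction": f"{a} {op} {b}", "output": str(ops[op](a, b))})

print(samples[0])  # e.g. {'instruction': '2 + 2', 'output': '4'}
```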

## Does it Work?
### 34BEAGLES Evaluation:
```
hf (pretrained=/data/models/UNA-34Beagles-v1-final,dtype=bfloat16,trust_remote_code=True), gen_kwargs: (None), limit: None, num_fewshot: None, batch_size: auto (8)
| Tasks |Version|Filter|n-shot| Metric |Value | |Stderr|
|--------------|-------|------|-----:|--------|-----:|---|-----:|
|arc_challenge |Yaml |none | 25|acc |0.7039|± |0.0133|
| | |none | 25|acc_norm|0.7321|± |0.0129|
|truthfulqa_mc2|Yaml |none | 0|acc |0.7387|± |0.0141|
hf (pretrained=/data/models/UNA-34Beagles-v1-final,dtype=bfloat16,trust_remote_code=True), gen_kwargs: (None), limit: None, num_fewshot: None, batch_size: auto
|Tasks|Version| Filter |n-shot| Metric |Value | |Stderr|
|-----|-------|----------|-----:|-----------|-----:|---|-----:|
|gsm8k|Yaml |get-answer| 5|exact_match|0.6399|± |0.0132|
| Groups |Version|Filter|n-shot|Metric|Value | |Stderr|
|------------------|-------|------|-----:|------|-----:|---|-----:|
|mmlu |N/A |none | 0|acc |0.7477|± |0.1079|
| - humanities |N/A |none | 0|acc |0.7188|± |0.0855|
| - other |N/A |none | 0|acc |0.7950|± |0.1057|
| - social_sciences|N/A |none | 0|acc |0.8297|± |0.0664|
| - stem |N/A |none | 0|acc |0.6641|± |0.1291|
```
### 34BEAGLES-MATH Evaluation
```
hf (pretrained=/data/models/34BeaglesMath-v1,dtype=bfloat16,trust_remote_code=True), gen_kwargs: (None), limit: None, num_fewshot: None, batch_size: auto
|Tasks|Version| Filter |n-shot| Metric |Value | |Stderr|
|-----|-------|----------|-----:|-----------|-----:|---|-----:|
|gsm8k|Yaml |get-answer| 5|exact_match|0.6505|± |0.0131|
hf (pretrained=/data/models/34BeaglesMath-v1,dtype=bfloat16,trust_remote_code=True), gen_kwargs: (None), limit: None, num_fewshot: None, batch_size: auto (8)
| Tasks |Version|Filter|n-shot| Metric |Value | |Stderr|
|--------------|-------|------|-----:|--------|-----:|---|-----:|
|arc_challenge |Yaml |none | 25|acc |0.7090|± |0.0133|
| | |none | 25|acc_norm|0.7329|± |0.0129|
|truthfulqa_mc2|Yaml |none | 0|acc |0.7378|± |0.0141|
| Groups |Version|Filter|n-shot|Metric|Value | |Stderr|
|------------------|-------|------|-----:|------|-----:|---|-----:|
|mmlu |N/A |none | 0|acc |0.7524|± |0.1045|
| - humanities |N/A |none | 0|acc |0.7307|± |0.0846|
| - other |N/A |none | 0|acc |0.7937|± |0.1029|
| - social_sciences|N/A |none | 0|acc |0.8274|± |0.0667|
| - stem |N/A |none | 0|acc |0.6708|± |0.1236|
```
But it gets better: when increasing length and complexity, the scores are even higher:
```
|Tasks|Version| Filter |n-shot| Metric |Value | |Stderr|
|-----|-------|----------|-----:|-----------|-----:|---|-----:|
|gsm8k|Yaml |get-answer| 5|exact_match|0.6611|± | 0.013|
```
That is a 3.20% GSM8K improvement compared to its base model.
## Note to contributors:
**Thank you to those contributing to the experiment with beautiful commits and good spirit!**
* Feel free to contribute to the readme Evaluation tests.
* Let's aim to build an ablation & paper together. All contributors will be cited.
## Versions
```
27.01.24 Added new code to generate the dataset (seed 42); it now also generates DPO pairs.
24.01.24 Added gradual complexity on a separate script
20-23.01.24 Multiple contributions with operations and increased complexity on the main generator script.
```
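Since the 27.01.24 version the script also emits DPO-style pairs. The exact schema is defined by the generator code in the repo; purely as a hedged illustration (the field names and values below are assumptions following the common DPO convention, not taken from the repo):

```python
# Hypothetical DPO record (field names and values assumed, not the repo's schema)
dpo_example = {
    'prompt': '25 + 17',
    'chosen': '42',    # correct result
    'rejected': '41',  # plausible but incorrect result
}
print(dpo_example)
```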
## Citations
If you use Simple Math to train your model, please cite it in the model card or the paper.
```
@misc{simplemath,
title={Simple-Math: 2+2=4 4-1=3},
author={Xavier Murias},
year={2024},
publisher = {Juanako.AI},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/datasets/fblgit/simple-math}},
}
``` | fblgit/simple-math | [
"task_categories:text-generation",
"task_categories:question-answering",
"size_categories:100K<n<1M",
"license:cc-by-nc-nd-4.0",
"math",
"finance",
"region:us"
] | 2024-01-20T15:54:15+00:00 | {"license": "cc-by-nc-nd-4.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation", "question-answering"], "pretty_name": "Simple Math", "dataset_info": {"features": [{"name": "output", "dtype": "string"}, {"name": "instruction", "dtype": "string"}], "splits": [{"name": "arithmetic.float2_train", "num_bytes": 645500.3, "num_examples": 19000}, {"name": "arithmetic.float2_valid", "num_bytes": 33973.7, "num_examples": 1000}, {"name": "arithmetic.float3_train", "num_bytes": 1890863.85, "num_examples": 47500}, {"name": "arithmetic.float3_valid", "num_bytes": 99519.15, "num_examples": 2500}, {"name": "arithmetic.float34_train", "num_bytes": 9321513.05, "num_examples": 218500}, {"name": "arithmetic.float34_valid", "num_bytes": 490605.95, "num_examples": 11500}, {"name": "arithmetic.float4_train", "num_bytes": 21671996.6, "num_examples": 475000}, {"name": "arithmetic.float4_valid", "num_bytes": 1140631.4, "num_examples": 25000}], "download_size": 27928049, "dataset_size": 35294604}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "tags": ["math", "finance"]} | 2024-01-27T12:40:28+00:00 | [] | [] | TAGS
#task_categories-text-generation #task_categories-question-answering #size_categories-100K<n<1M #license-cc-by-nc-nd-4.0 #math #finance #region-us
|
# Simple Math: 2+2=4, 4-1=3 (LoLo: Learning Only Logical Operations)
Just like my teacher gave me homework, I thought maybe we can also add some of these basics to the training of our models.
It was created with very simple code that is in the repo; if you add more complex operations and so on, please share the code :D thank you!
Current Code Version: URL (A modification over @win10 for progressive and DPO operation)
!LoLo: Learning Only Logical Operations
## Does it Work?
### 34BEAGLES Evaluation:
### 34BEAGLES-MATH Evaluation
But it gets better: when increasing length and complexity, the scores are even higher:
That is a 3.20% GSM8K improvement compared to its base model.
## Note to contributors:
Thank you to those contributing to the experiment with beautiful commits and good spirit!
* Feel free to contribute to the readme Evaluation tests.
* Let's aim to build an ablation & paper together. All contributors will be cited.
## Versions
## Citations
If you use Simple Math to train your model, please cite it in the model card or the paper.
| [
"# Simple Math: 2+2=4 -1=3 (LoLo: Learning Only Logical Operations)\n\nJust like my teacher gave me homework, i thought maybe we can also add some of these basics on the trainings of our models.\n\nIt was created with very simple code that is in the repo, if you add more complex operations and so.. please share the code :D thank you\n\nCurrent Code Version: URL (A modification over @win10 for progressive and DPO operation)\n!LoLo: Learning Only Logical Operations",
"## Does it Works?",
"### 34BEAGLES Evaluation:",
"### 34BEAGLES-MATH Evaluation\n\n\nBut it gets better, because when increasing length and complexity, the marks are even superior:\n\nOn a 3.20% GSM Improvement compared to its base model.",
"## Note to contributors:\nthank you to those contributing on the experiment with beautiful commits and good spirit\n\n* Feel free to contribute on the readme Evaluation tests.\n* Lets aim to build an ablation & paper together. All contributors will be cited.",
"## Versions\n\n\ns\nIf you use Simple Math o train your model, please cite on the modelcard or the paper."
] | [
"TAGS\n#task_categories-text-generation #task_categories-question-answering #size_categories-100K<n<1M #license-cc-by-nc-nd-4.0 #math #finance #region-us \n",
"# Simple Math: 2+2=4 -1=3 (LoLo: Learning Only Logical Operations)\n\nJust like my teacher gave me homework, i thought maybe we can also add some of these basics on the trainings of our models.\n\nIt was created with very simple code that is in the repo, if you add more complex operations and so.. please share the code :D thank you\n\nCurrent Code Version: URL (A modification over @win10 for progressive and DPO operation)\n!LoLo: Learning Only Logical Operations",
"## Does it Works?",
"### 34BEAGLES Evaluation:",
"### 34BEAGLES-MATH Evaluation\n\n\nBut it gets better, because when increasing length and complexity, the marks are even superior:\n\nOn a 3.20% GSM Improvement compared to its base model.",
"## Note to contributors:\nthank you to those contributing on the experiment with beautiful commits and good spirit\n\n* Feel free to contribute on the readme Evaluation tests.\n* Lets aim to build an ablation & paper together. All contributors will be cited.",
"## Versions\n\n\ns\nIf you use Simple Math o train your model, please cite on the modelcard or the paper."
] |
a0cb33af47ca8e8ce5fdcb583841c08811c7ced2 | # Small RedPajama-v2
500K Samples from RedPajama-v2 | vilm/RedPajama-v2-small | [
"region:us"
] | 2024-01-20T16:21:03+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2554107597, "num_examples": 500000}], "download_size": 1479056742, "dataset_size": 2554107597}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-20T16:39:06+00:00 | [] | [] | TAGS
#region-us
| # Small RedPajama-v2
500K Samples from RedPajama-v2 | [
"# Small RedPajama-v2\n500K Samples from RedPajama-v2"
] | [
"TAGS\n#region-us \n",
"# Small RedPajama-v2\n500K Samples from RedPajama-v2"
] |
7b5057c0537ac6a25a243d448b6a601a968addd0 |
## Worded Math
- Version 1.1
- Updated for general improvements
1 Million examples of word-based math (in English) with number results.
Created from [fblgit/simple-math](https://huggingface.co/datasets/fblgit/simple-math).
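For illustration, a generated record looks roughly like this (the concrete values are hypothetical):

```python
{'instruction': 'twelve multiplied by minus three point five', 'output': '-42.0'}
```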
Using this Python code (requires `inflect`):
```py
import random
import inflect
import math

inf = inflect.engine()

# Define the number of samples you want to generate
num_samples = 1000000

# Define the range for the random numbers
min_value = -999.9
max_value = 999.9

# Define the arithmetic operations and their worded spellings
operations = ["+", "-", "*", "/"]
div = ["divided by", "divided into"]
plus = ["plus", "added to"]
minus = ["minus", "subtracted from"]
times = ["times", "multiplied by"]

# The first fifth of the samples is integer-only; after that, every other sample uses floats
splitted = num_samples / 5

# Generate data
train_data = []
for i in range(num_samples):
    # Cap the scaling index so |num1| and |num2| stay below roughly 10 million
    if i > 100000:
        ti = 100000
    else:
        ti = i
    multfactor = math.trunc((ti / 10) + 1)
    # Use floats for every other sample once past the first fifth; integers otherwise
    if i % 2 == 0 and i >= splitted:
        num1 = float("%.3f" % random.uniform(min_value * multfactor, max_value * multfactor))
        num2 = float("%.3f" % random.uniform(min_value * multfactor, max_value * multfactor))
    else:
        num1 = math.trunc(random.uniform(min_value * multfactor, max_value * multfactor))
        num2 = math.trunc(random.uniform(min_value * multfactor, max_value * multfactor))
    # Re-draw zeros so division is always defined and operands are non-trivial
    while num2 == 0.0:
        num2 = float("%.3f" % random.uniform(min_value, max_value))
    while num1 == 0.0:
        num1 = float("%.3f" % random.uniform(min_value, max_value))
    operation = random.choice(operations)
    if operation == "/":
        result = num1 / num2
        opp = random.choice(div)
    elif operation == '-':
        result = num1 - num2
        opp = random.choice(minus)
    elif operation == '*':
        result = num1 * num2
        opp = random.choice(times)
    elif operation == '+':
        result = num1 + num2
        opp = random.choice(plus)
    output = round(result, 4)
    # Word the operands; the result stays numeric half of the time
    num1 = inf.number_to_words(num1)
    num2 = inf.number_to_words(num2)
    if random.randint(0, 1) == 1:
        output = inf.number_to_words(output)
    else:
        output = str(output)
    instruction = f"{num1} {opp} {num2}"
    train_data.append({'instruction': instruction, 'output': output})

# Write the train/test splits to JSON files
import json

# Hold out every 40th example (alternating with its neighbour) as test data
test_data = []
to_pop = []
for re in range(num_samples):
    if re % 40 == 0:
        if (re / 40) % 2 == 0:
            test_data.append(train_data[re])
            to_pop.append(re)
        else:
            test_data.append(train_data[re - 1])
            to_pop.append(re - 1)

# Remove the held-out test examples from the train data (indices shift as we pop)
popi = 0
for pop in to_pop:
    train_data.pop(pop - popi)
    popi += 1

# Output test data
test_out_file = 'worded-math-test-v1.1.json'
with open(test_out_file, 'w') as f:
    json.dump(test_data, f)

# Output train data
train_out_file = 'worded-math-train-v1.1.json'
with open(train_out_file, 'w') as f:
    json.dump(train_data, f)
``` | distantquant/worded-math | [
"size_categories:100K<n<1M",
"language:en",
"license:cc-by-4.0",
"region:us"
] | 2024-01-20T16:25:01+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["100K<n<1M"], "pretty_name": "Worded Math"} | 2024-01-21T20:26:25+00:00 | [] | [
"en"
] | TAGS
#size_categories-100K<n<1M #language-English #license-cc-by-4.0 #region-us
|
## Worded Math
- Version 1.1
- Updated for general improvements
1 Million examples of word-based math (in English) with number results.
Created from fblgit/simple-math.
Using this python code (requires 'inflect'):
| [
"## Worded Math\n\n- Version 1.1\n- Updated for general improvements\n\n1 Million examples of word-based math (in English) with number results.\n\nCreated from fglbit/simple-math.\n\nUsing this python code (requires 'inflect'):"
] | [
"TAGS\n#size_categories-100K<n<1M #language-English #license-cc-by-4.0 #region-us \n",
"## Worded Math\n\n- Version 1.1\n- Updated for general improvements\n\n1 Million examples of word-based math (in English) with number results.\n\nCreated from fglbit/simple-math.\n\nUsing this python code (requires 'inflect'):"
] |
75cf3110077527cbab8451a2acffee4a72374b92 | # Extra Small RedPajama-v2
250K Samples from RedPajama-v2 | vilm/RedPajama-v2-xsmall | [
"region:us"
] | 2024-01-20T16:30:12+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1288941618, "num_examples": 250000}], "download_size": 745507877, "dataset_size": 1288941618}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-20T16:40:36+00:00 | [] | [] | TAGS
#region-us
| # Extra Small RedPajama-v2
250K Samples from RedPajama-v2 | [
"# Extra Small RedPajama-v2\n250K Samples from RedPajama-v2"
] | [
"TAGS\n#region-us \n",
"# Extra Small RedPajama-v2\n250K Samples from RedPajama-v2"
] |
5724b1ec52f7a7b1615ff7bad90a9275238e97f0 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | Zainabsa99/mitre_attack | [
"region:us"
] | 2024-01-20T16:33:48+00:00 | {} | 2024-01-20T19:42:37+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
452d98229ce74dc4954309c7968e2a96d77bdbb6 | Using https://huggingface.co/cagliostrolab/animagine-xl-3.0, I created samples for the top 7500 artist tags based on dataset occurrences of the tags, reaching down to tags with ~40 occurrences.
Images are prefixed with the occurrence count to make it easier to sort by higher occurrences, as tags with more occurrences are more likely to reproduce styles better.
I have attached an image with the generation metadata containing the generation settings I used for the samples.
When inserting tags into prompts, I would always escape parentheses AND replace underscores with spaces.
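A minimal helper sketch for this (the function name and sample tag are illustrative, not part of the dataset):

```python
def normalize_tag(tag: str) -> str:
    # Replace underscores with spaces, then escape parentheses so the
    # prompt parser does not read them as attention-weighting syntax.
    tag = tag.replace("_", " ")
    return tag.replace("(", "\\(").replace(")", "\\)")

print(normalize_tag("some_artist_(alias)"))  # -> some artist \(alias\)
```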
**NOTE: While I would classify the samples as almost all SFW, some might be *a bit* ecchi, depending on your standards.**
Zip version available at https://huggingface.co/datasets/deus-ex-machina/animagine-xl-3.0-artist-comparison/blob/zip/images/images.zip
**Raw generation metadata example**
```
hews, 1girl, solo, looking at viewer, standing, cowboy shot
Negative prompt: (worst quality, low quality:1.2), lowres, jpeg artifacts, (blurry:0.7), @_@, greyscale, nude, panties, underwear, pussy, nipples, cleavage, ass, micro, mini, bottomless
Steps: 28, Sampler: Euler a, CFG scale: 8, Seed: 407081055, Size: 915x1144, Model hash: 1449e5b0b9, Model: animagine-xl-3.0, Denoising strength: 0.5, Hires upscale: 1.35, Hires steps: 14, Hires upscaler: 4x_foolhardy_Remacri, Version: v1.7.0-375-gf939bce8
``` | deus-ex-machina/animagine-xl-3.0-artist-comparison | [
"task_categories:text-to-image",
"size_categories:1K<n<10K",
"language:en",
"license:apache-2.0",
"sample",
"example",
"comparison",
"stable-diffusion",
"stable-diffusion-xl",
"animagine",
"region:us"
] | 2024-01-20T16:38:01+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-to-image"], "tags": ["sample", "example", "comparison", "stable-diffusion", "stable-diffusion-xl", "animagine"], "viewer": false} | 2024-01-20T19:03:29+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-to-image #size_categories-1K<n<10K #language-English #license-apache-2.0 #sample #example #comparison #stable-diffusion #stable-diffusion-xl #animagine #region-us
| Using URL I created samples for the top 7500 artist tags based on dataset occurrences of the tags, reaching down to tags with ~40 occurrences.
Images are prefixed with the occurrence count to make it easier to sort by higher occurrences, as tags with more occurrences are more likely to reproduce styles better.
I have attached an image with the generation metadata containing the generation settings I used for the samples.
When inserting tags into prompts, I would always escape parentheses AND replace underscores with spaces.
NOTE: While I would classify the samples as almost all SFW, some might be *a bit* ecchi, depending on your standards.
Zip version available at URL
Raw generation metadata example
| [] | [
"TAGS\n#task_categories-text-to-image #size_categories-1K<n<10K #language-English #license-apache-2.0 #sample #example #comparison #stable-diffusion #stable-diffusion-xl #animagine #region-us \n"
] |
02186378694618d1585eab68ceafae69c4ace9ab | # The Stack smol-XL
A cleaned version of The Stack smol-XL | vilm/the-stack-smol-xl-cleaned | [
"region:us"
] | 2024-01-20T16:52:23+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1147766215, "num_examples": 205173}], "download_size": 368132773, "dataset_size": 1147766215}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-20T16:55:20+00:00 | [] | [] | TAGS
#region-us
| # The Stack smol-XL
A cleaned version of The Stack smol-XL | [
"# The Stack smol-XL\nA cleaned version of The Stack smol-XL"
] | [
"TAGS\n#region-us \n",
"# The Stack smol-XL\nA cleaned version of The Stack smol-XL"
] |
c99b44c68cf198b679bb9f34391d34971bc45b84 | This dataset comprises synthetic client notes from nursing care residents, specifically created for natural language processing (NLP) projects.
Its primary application is in the development of models aimed at predicting agitation scores in healthcare settings, such as nursing homes.
While the dataset serves well for training purposes, it is important to note that it lacks the variation and nuance typically found in real
client notes. Therefore, while beneficial for initial model training, it may not fully represent the complexities encountered in actual
clinical environments.
### Data Collection
- **Data type:** Text (synthetic notes)
- **Source:** Generated by OpenAI API
- **Language:** Dutch
- **Period:** January 2024
- **Method:** Generated by OpenAI's language model 'gpt-3.5-turbo'
### Dataset Characteristics
- **Size:** 8,699 rows (train: 16 clients, 5788 notes, valid: 4 clients, 1455 notes, test: 4 clients, 1456 notes)
- **Format:** csv
- **Fields:**
- ct_id: Client ID
- datum: Date
- discipline: Role of the author of the note
- rapportage: Note text
- onrust: Boolean value set to True if the agitation score exceeds 50
- onrustscore: Level of agitation described in the nursing care note.
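A minimal loading sketch (the CSV file names are assumptions; adjust them to the actual layout of this repository):

```python
import pandas as pd

# Assumed file names -- adjust to the actual layout of this repository.
splits = {name: pd.read_csv(f"{name}.csv") for name in ("train", "valid", "test")}

train = splits["train"]
# Recompute the binary label from the raw score ('onrust' is True iff onrustscore > 50).
train["onrust_check"] = train["onrustscore"] > 50
print(train[["ct_id", "datum", "discipline", "onrustscore", "onrust_check"]].head())
```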
## Intended Use
- **Primary Use:** Fine-tuning LLMs for applications in healthcare, focusing on nursing homes.
- **Use Cases:** Predicting agitation scores.
## Ethical Considerations
- **Bias:** The dataset, generated by AI, may exhibit language and contextual biases, potentially influencing model predictions and affecting their fairness and accuracy.
## Versioning
- **Version:** 1.0
- **Date:** 20-01-2024
| ekrombouts/GenCareAI | [
"task_categories:text-classification",
"size_categories:1K<n<10K",
"language:nl",
"license:gpl",
"medical",
"synthetic",
"region:us"
] | 2024-01-20T17:00:22+00:00 | {"language": ["nl"], "license": "gpl", "size_categories": ["1K<n<10K"], "task_categories": ["text-classification"], "pretty_name": "Synthetic client notes", "tags": ["medical", "synthetic"]} | 2024-01-21T09:25:38+00:00 | [] | [
"nl"
] | TAGS
#task_categories-text-classification #size_categories-1K<n<10K #language-Dutch #license-gpl #medical #synthetic #region-us
This dataset comprises synthetic client notes about nursing care residents, specifically created for natural language processing (NLP) projects.
Its primary application is in the development of models aimed at predicting agitation scores in healthcare settings, such as nursing homes.
While the dataset serves well for training purposes, it is important to note that it lacks the variation and nuance typically found in real
client notes. Therefore, while beneficial for initial model training, it may not fully represent the complexities encountered in actual
clinical environments.
### Data Collection
- Data type: Text (synthetic notes)
- Source: Generated by OpenAI API
- Language: Dutch
- Period: January 2024
- Method: Generated by OpenAI's language model 'gpt-3.5-turbo'
### Dataset Characteristics
- Size: 8,699 rows (train: 16 clients, 5788 notes, valid: 4 clients, 1455 notes, test: 4 clients, 1456 notes)
- Format: csv
- Fields:
- ct_id: Client ID
- datum: Date
- discipline: Role of the author of the note
- rapportage: Note text
- onrust: Boolean value set to True if the agitation score exceeds 50
- onrustscore: Level of agitation described in the nursing care note.
## Intended Use
- Primary Use: Fine-tuning LLMs for applications in healthcare, focusing on nursing homes.
- Use Cases: Predicting agitation scores.
## Ethical Considerations
- Bias: The dataset, generated by AI, may exhibit language and contextual biases, potentially influencing model predictions and affecting their fairness and accuracy.
## Versioning
- Version: 1.0
- Date: 20-01-2024
| [
"### Data Collection\n\n- Data type: Text (synthetic notes)\n- Source: Generated by OpenAI API\n- Language: Dutch\n- Period: January 2024\n- Method: Generated by OpenAI's language model 'gpt-3.5-turbo'",
"### Dataset Characteristics\n\n- Size: 8,699 rows (train: 16 clients, 5788 notes, valid: 4 clients, 1455 notes, test: 4 clients, 1456 notes)\n- Format: csv\n- Fields:\n - ct_id: Client ID\n - datum: Date\n - discipline: Role of the author of the note\n - rapportage: Note text\n - onrust: Boolean value set to True if the agitation score exceeds 50\n - onrustscore: Level of agitation described in the nursing care note.",
"## Intended Use\n\n- Primary Use: Fine-tuning LLMs for applications in healthcare, focusing on nursing homes.\n- Use Cases: Predicting agitation scores.",
"## Ethical Considerations\n\n- Bias: The dataset, generated by AI, may exhibit language and contextual biases, potentially influencing model predictions and affecting their fairness and accuracy",
"## Versioning\n\n- Version: 1.0\n- Date: 20-01-2024"
] | [
"TAGS\n#task_categories-text-classification #size_categories-1K<n<10K #language-Dutch #license-gpl #medical #synthetic #region-us \n",
"### Data Collection\n\n- Data type: Text (synthetic notes)\n- Source: Generated by OpenAI API\n- Language: Dutch\n- Period: January 2024\n- Method: Generated by OpenAI's language model 'gpt-3.5-turbo'",
"### Dataset Characteristics\n\n- Size: 8,699 rows (train: 16 clients, 5788 notes, valid: 4 clients, 1455 notes, test: 4 clients, 1456 notes)\n- Format: csv\n- Fields:\n - ct_id: Client ID\n - datum: Date\n - discipline: Role of the author of the note\n - rapportage: Note text\n - onrust: Boolean value set to True if the agitation score exceeds 50\n - onrustscore: Level of agitation described in the nursing care note.",
"## Intended Use\n\n- Primary Use: Fine-tuning LLMs for applications in healthcare, focusing on nursing homes.\n- Use Cases: Predicting agitation scores.",
"## Ethical Considerations\n\n- Bias: The dataset, generated by AI, may exhibit language and contextual biases, potentially influencing model predictions and affecting their fairness and accuracy",
"## Versioning\n\n- Version: 1.0\n- Date: 20-01-2024"
] |
2ac5d29ddd61c62078db0e10c2abcc47925dc35f |
Conversion of [databricks/databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) dataset to be used in pretraining.
Python code used for conversion:
```python
from datasets import load_dataset
import pandas
dataset = load_dataset("databricks/databricks-dolly-15k", split="train")
def format(columns):
    # Join each instruction and its response into one plain-text document.
    instruction = columns["instruction"].strip()
    answer = columns["response"].strip()
    return f"{instruction}\n\n{answer}"
pandas.DataFrame({"text": [format(columns) for columns in dataset]}).to_csv("train.csv", index=False)
```
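If needed, the resulting `train.csv` can be loaded back with the `datasets` library, for example:

```python
from datasets import load_dataset

# Load the converted file back as a plain-text pretraining corpus.
pretrain = load_dataset("csv", data_files="train.csv", split="train")
print(pretrain[0]["text"][:200])
```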
| Felladrin/pretrain-databricks-dolly-15k | [
"source_datasets:databricks/databricks-dolly-15k",
"license:cc-by-sa-3.0",
"region:us"
] | 2024-01-20T17:06:10+00:00 | {"license": "cc-by-sa-3.0", "source_datasets": ["databricks/databricks-dolly-15k"]} | 2024-01-20T19:02:59+00:00 | [] | [] | TAGS
#source_datasets-databricks/databricks-dolly-15k #license-cc-by-sa-3.0 #region-us
|
Conversion of databricks/databricks-dolly-15k dataset to be used in pretraining.
Python code used for conversion:
| [] | [
"TAGS\n#source_datasets-databricks/databricks-dolly-15k #license-cc-by-sa-3.0 #region-us \n"
] |
6795302736a58e59fb97e6dbce3edf9774d7bf12 |
# ALFRED Dataset for ABP
We provide the ALFRED dataset used for <a href="https://bhkim94.github.io/projects/ABP" target="_new">ABP</a>, including ResNet-18 features of egocentric and surrounding views, annotations, etc.
The surrounding views are from the four navigable actions defined in ALFRED: RotateLeft (90°), LookUp (15°), LookDown (15°), and RotateRight (90°).
The file structure is almost identical to the ALFRED dataset, so refer to <a href="https://github.com/askforalfred/alfred">ALFRED</a> for more details.
## Download the dataset
Move to the root (denoted by ALFRED_ROOT below) of the ABP (or related work) repo and clone this repository by following the commands below.\
**Note**: This dataset is quite large (~1.6T).
```
cd $ALFRED_ROOT/data
git clone https://huggingface.co/byeonghwikim/abp_dataset json_feat_2.1.0
```
After downloading the dataset, you can directly load a surrounding-view feature file; the expected output is shown below.
```
>>> import torch
>>> filename = 'train/look_at_obj_in_light-AlarmClock-None-DeskLamp-301/trial_T20190907_174127_043461/feat_conv_panoramic.pt'
>>> im = torch.load(filename)  # [5, T, 512, 7, 7], where T is the trajectory length
>>> im.shape
torch.Size([5, T, 512, 7, 7])
```
Dimension 0 of the feature tensor indexes the view directions as follows (a short indexing sketch follows the list).
<ul>
<li>0: left view (RotateLeft)</li>
<li>1: up view (LookUp)</li>
<li>2: front (egocentric) view (no action)</li>
<li>3: down view (LookDown)</li>
<li>4: right view (RotateRight)</li>
</ul>
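For instance, a minimal indexing sketch (the view-name dictionary below is a convenience for illustration, not part of the dataset):
```
import torch

# Index of each view direction along dim 0 of the panoramic feature tensor.
VIEWS = {"left": 0, "up": 1, "front": 2, "down": 3, "right": 4}

filename = 'train/look_at_obj_in_light-AlarmClock-None-DeskLamp-301/trial_T20190907_174127_043461/feat_conv_panoramic.pt'
feats = torch.load(filename)          # [5, T, 512, 7, 7]

front = feats[VIEWS["front"]]         # [T, 512, 7, 7], egocentric features
left_at_t0 = feats[VIEWS["left"], 0]  # [512, 7, 7], left view at step 0
```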
Inspired by <a href="https://bhkim94.github.io/projects/MOCA">MOCA</a>, we apply image augmentation to the agent's visual observation.
We apply two types of image augmentation: 1) swapping color channels of images and 2) AutoAugment.
- No augmentation: (feat_conv_panoramic.pt)
- Swapping color channels: (feat_conv_colorSwap1_panoramic.pt, feat_conv_colorSwap2_panoramic.pt)
- AutoAugment: (feat_conv_onlyAutoAug1_panoramic.pt ~ feat_conv_onlyAutoAug4_panoramic.pt)
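One plausible way to sample among these variants during training (a sketch, not necessarily the exact pipeline used by the authors):
```
import random

aug_files = (["feat_conv_panoramic.pt",
              "feat_conv_colorSwap1_panoramic.pt",
              "feat_conv_colorSwap2_panoramic.pt"]
             + [f"feat_conv_onlyAutoAug{i}_panoramic.pt" for i in range(1, 5)])

fname = random.choice(aug_files)  # e.g. pick one variant per trajectory per epoch
```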
## Related work that uses this dataset
<ul>
<li>
<a href="https://bhkim94.github.io/projects/CL-ALFRED/">
Online Continual Learning for Interactive Instruction Following Agents
</a>
<br>
<a href="https://bhkim94.github.io/" target="_new">
Byeonghwi Kim
</a><sup>*</sup>,
<a href="" target="_new">
Minhyuk Seo
</a><sup>*</sup>,
<a href="http://ppolon.github.io/" target="_new">
Jonghyun Choi
</a>
<br>
ICLR 2024
</li>
<li>
<a href="https://bhkim94.github.io/projects/MCR-Agent" target="_new">
Multi-Level Compositional Reasoning for Interactive Instruction Following
</a>
<br>
<a href="https://www.linkedin.com/in/suvaansh-bhambri-1784bab7/" target="_new">
Suvaansh Bhambri
</a><sup>*</sup>,
<a href="https://bhkim94.github.io/" target="_new">
Byeonghwi Kim
</a><sup>*</sup>,
<a href="http://ppolon.github.io/" target="_new">
Jonghyun Choi
</a>
<br>
AAAI 2023 (Oral)
</li>
<li>
<a href="https://bhkim94.github.io/projects/MOCA" target="_new">
Factorizing Perception and Policy for Interactive Instruction Following
</a>
<br>
<a href="https://kunalmessi10.github.io/" target="_new">
Kunal Pratap Singh
</a><sup>*</sup>,
<a href="https://www.linkedin.com/in/suvaansh-bhambri-1784bab7/" target="_new">
Suvaansh Bhambri
</a><sup>*</sup>,
<a href="https://bhkim94.github.io/" target="_new">
Byeonghwi Kim
</a><sup>*</sup>,
<a href="http://roozbehm.info/" target="_new">
Roozbeh Mottaghi
</a>,
<a href="http://ppolon.github.io/" target="_new">
Jonghyun Choi
</a>.
<br>
ICCV 2021
</li>
<li>
<a href="https://bhkim94.github.io/projects/ABP" target="_new">
Agent with the Big Picture: Perceiving Surroundings for Interactive Instruction Following
</a>
<br>
<a href="https://bhkim94.github.io/" target="_new">
Byeonghwi Kim
</a>,
<a href="https://www.linkedin.com/in/suvaansh-bhambri-1784bab7/" target="_new">
Suvaansh Bhambri
</a>,
<a href="https://kunalmessi10.github.io/" target="_new">
Kunal Pratap Singh
</a>,
<a href="http://roozbehm.info/" target="_new">
Roozbeh Mottaghi
</a>,
<a href="http://ppolon.github.io/" target="_new">
Jonghyun Choi
</a>.
<br>
Embodied AI Workshop @ CVPR 2021
</li>
</ul>
## Citation
If you find this repository useful, please cite it as follows.
```
@inproceedings{kim2021agent,
author = {Kim, Byeonghwi and Bhambri, Suvaansh and Singh, Kunal Pratap and Mottaghi, Roozbeh and Choi, Jonghyun},
title = {Agent with the Big Picture: Perceiving Surroundings for Interactive Instruction Following},
booktitle = {Embodied AI Workshop @ CVPR 2021},
year = {2021},
}
``` | byeonghwikim/abp_dataset | [
"license:mit",
"region:us"
] | 2024-01-20T17:09:16+00:00 | {"license": "mit"} | 2024-01-30T12:27:34+00:00 | [] | [] | TAGS
#license-mit #region-us
|
# ALFRED Dataset for ABP
We provide the ALFRED dataset used for <a href="URL target="_new">ABP</a> including ResNet-18 features of egocentric and surrounding views, annotations, etc.
The surrounding views are from the four navigable actions defined in ALFRED: RotateLeft (90°), LookUp (15°), LookDown (15°), and RotateRight (90°).
The file structure is almost identical to the ALFRED dataset, so refer to <a href="URL for more details.
## Download the dataset
Move to the root (denoted by ALFRED_ROOT below) of the ABP (or related work) repo and clone this repository by following the commands below.\
Note: This dataset is quite large (~1.6T).
After downloading the dataset, you can directly load a surrounding-view feature file; the expected output is shown below.
Dimension 0 of the feature tensor indexes the view directions as below.
<ul>
<li>0: left view (RotateLeft)</li>
<li>1: up view (LookUp)</li>
<li>2: front (egocentric) view (no action)</li>
<li>3: down view (LookDown)</li>
<li>4: right view (RotateRight)</li>
</ul>
Inspired by <a href="URL we apply image augmentation to the agent's visual observation.
We apply two types of image augmentation: 1) swapping color channels of images and 2) AutoAugment.
- No augmentation: (feat_conv_panoramic.pt)
- Swapping color channels: (feat_conv_colorSwap1_panoramic.pt, feat_conv_colorSwap2_panoramic.pt)
- AutoAugment: (feat_conv_onlyAutoAug1_panoramic.pt ~ feat_conv_onlyAutoAug4_panoramic.pt)
## Related work that uses this dataset
<ul>
<li>
<a href="URL
Online Continual Learning for Interactive Instruction Following Agents
</a>
<br>
<a href="URL target="_new">
Byeonghwi Kim
</a><sup>*</sup>,
<a href="" target="_new">
Minhyuk Seo
</a><sup>*</sup>,
<a href="URL target="_new">
Jonghyun Choi
</a>
<br>
ICLR 2024
</li>
<li>
<a href="URL target="_new">
Multi-Level Compositional Reasoning for Interactive Instruction Following
</a>
<br>
<a href="URL target="_new">
Suvaansh Bhambri
</a><sup>*</sup>,
<a href="URL target="_new">
Byeonghwi Kim
</a><sup>*</sup>,
<a href="URL target="_new">
Jonghyun Choi
</a>
<br>
AAAI 2023 (Oral)
</li>
<li>
<a href="URL target="_new">
Factorizing Perception and Policy for Interactive Instruction Following
</a>
<br>
<a href="URL target="_new">
Kunal Pratap Singh
</a><sup>*</sup>,
<a href="URL target="_new">
Suvaansh Bhambri
</a><sup>*</sup>,
<a href="URL target="_new">
Byeonghwi Kim
</a><sup>*</sup>,
<a href="URL target="_new">
Roozbeh Mottaghi
</a>,
<a href="URL target="_new">
Jonghyun Choi
</a>.
<br>
ICCV 2021
</li>
<li>
<a href="URL target="_new">
Agent with the Big Picture: Perceiving Surroundings for Interactive Instruction Following
</a>
<br>
<a href="URL target="_new">
Byeonghwi Kim
</a>,
<a href="URL target="_new">
Suvaansh Bhambri
</a>,
<a href="URL target="_new">
Kunal Pratap Singh
</a>,
<a href="URL target="_new">
Roozbeh Mottaghi
</a>,
<a href="URL target="_new">
Jonghyun Choi
</a>.
<br>
Embodied AI Workshop @ CVPR 2021
</li>
</ul>
If you find this repository useful, please cite it as follows.
| [
"# ALFRED Dataset for ABP\nWe provide the ALFRED dataset used for <a href=\"URL target=\"_new\">ABP</a> including ResNet-18 features of egocentric and surrounding views, annotations, etc.\nThe surrdounding views are from four navigable actions defined in ALFRED: RotateLeft (90°), LookUp(15°), LookDown(15°), and RotateRight(90°).\nThe file structure is almost identical to the ALFRED dataset, so refer to <a href=\"URL for more details.",
"## Download the dataset\nMove to the root (denoted by ALFRED_ROOT below) of the ABP (or related work) repo and clone this repository by following the commands below.\\\nNote: This dataset is quite large (~1.6T).\n\n\nAfter downloading the dataset, you may directly load a surrounding feature and the expected outcome is as below.\n\n\nThe 0-dimension of the feature corresponds to the respective view directions as below.\n<ul>\n <li>0: left view (RotateLeft)</li>\n <li>1: up view (LookUp)</li>\n <li>2: front (egocentric) view (no action)</li>\n <li>3: down view (LookDown)</li>\n <li>4: right view (RotateRight)</li>\n</ul>\n\nInspired by <a href=\"URL we apply image augmentation to the agent's visual observation.\nWe apply two types of image augmentation: 1) swapping color channels of images and 2) AutoAugment.\n- No augmentation: (feat_conv_panoramic.pt)\n- Swapping color channels: (feat_conv_colorSwap1_panoramic.pt, feat_conv_colorSwap2_panoramic.pt)\n- AutoAugment: (feat_conv_onlyAutoAug1_panoramic.pt ~ feat_conv_onlyAutoAug4_panoramic.pt)",
"## Related work that uses this dataset\n<ul>\n <li>\n <a href=\"URL\n Online Continual Learning for Interactive Instruction Following Agents\n </a>\n <br>\n <a href=\"URL target=\"_new\">\n Byeonghwi Kim\n </a><sup>*</sup>,\n <a href=\"\" target=\"_new\">\n Minhyuk Seo\n </a><sup>*</sup>,\n <a href=\"URL target=\"_new\">\n Jonghyun Choi\n </a>\n <br>\n ICLR 2024\n </li>\n\n <li>\n <a href=\"URL target=\"_new\">\n Multi-Level Compositional Reasoning for Interactive Instruction Following\n </a>\n <br>\n <a href=\"URL target=\"_new\">\n Suvaansh Bhambri\n </a><sup>*</sup>,\n <a href=\"URL target=\"_new\">\n Byeonghwi Kim\n </a><sup>*</sup>,\n <a href=\"URL target=\"_new\">\n Jonghyun Choi\n </a>\n <br>\n AAAI 2023 (Oral)\n </li>\n \n <li>\n <a href=\"URL target=\"_new\">\n Factorizing Perception and Policy for Interactive Instruction Following\n </a>\n <br>\n <a href=\"URL target=\"_new\">\n Kunal Pratap Singh\n </a><sup>*</sup>,\n <a href=\"URL target=\"_new\">\n Suvaansh Bhambri\n </a><sup>*</sup>,\n <a href=\"URL target=\"_new\">\n Byeonghwi Kim\n </a><sup>*</sup>,\n <a href=\"URL target=\"_new\">\n Roozbeh Mottaghi\n </a>,\n <a href=\"URL target=\"_new\">\n Jonghyun Choi\n </a>.\n <br>\n ICCV 2021\n </li>\n \n <li>\n <a href=\"URL target=\"_new\">\n Agent with the Big Picture: Perceiving Surroundings for Interactive Instruction Following\n </a>\n <br>\n <a href=\"URL target=\"_new\">\n Byeonghwi Kim\n </a>,\n <a href=\"URL target=\"_new\">\n Suvaansh Bhambri\n </a>,\n <a href=\"URL target=\"_new\">\n Kunal Pratap Singh\n </a>,\n <a href=\"URL target=\"_new\">\n Roozbeh Mottaghi\n </a>,\n <a href=\"URL target=\"_new\">\n Jonghyun Choi\n </a>.\n <br>\n Embodied AI Workshop @ CVPR 2021\n </li>\n</ul>\n\nIf you find this repository useful, please cite this repository."
] | [
"TAGS\n#license-mit #region-us \n",
"# ALFRED Dataset for ABP\nWe provide the ALFRED dataset used for <a href=\"URL target=\"_new\">ABP</a> including ResNet-18 features of egocentric and surrounding views, annotations, etc.\nThe surrdounding views are from four navigable actions defined in ALFRED: RotateLeft (90°), LookUp(15°), LookDown(15°), and RotateRight(90°).\nThe file structure is almost identical to the ALFRED dataset, so refer to <a href=\"URL for more details.",
"## Download the dataset\nMove to the root (denoted by ALFRED_ROOT below) of the ABP (or related work) repo and clone this repository by following the commands below.\\\nNote: This dataset is quite large (~1.6T).\n\n\nAfter downloading the dataset, you may directly load a surrounding feature and the expected outcome is as below.\n\n\nThe 0-dimension of the feature corresponds to the respective view directions as below.\n<ul>\n <li>0: left view (RotateLeft)</li>\n <li>1: up view (LookUp)</li>\n <li>2: front (egocentric) view (no action)</li>\n <li>3: down view (LookDown)</li>\n <li>4: right view (RotateRight)</li>\n</ul>\n\nInspired by <a href=\"URL we apply image augmentation to the agent's visual observation.\nWe apply two types of image augmentation: 1) swapping color channels of images and 2) AutoAugment.\n- No augmentation: (feat_conv_panoramic.pt)\n- Swapping color channels: (feat_conv_colorSwap1_panoramic.pt, feat_conv_colorSwap2_panoramic.pt)\n- AutoAugment: (feat_conv_onlyAutoAug1_panoramic.pt ~ feat_conv_onlyAutoAug4_panoramic.pt)",
"## Related work that uses this dataset\n<ul>\n <li>\n <a href=\"URL\n Online Continual Learning for Interactive Instruction Following Agents\n </a>\n <br>\n <a href=\"URL target=\"_new\">\n Byeonghwi Kim\n </a><sup>*</sup>,\n <a href=\"\" target=\"_new\">\n Minhyuk Seo\n </a><sup>*</sup>,\n <a href=\"URL target=\"_new\">\n Jonghyun Choi\n </a>\n <br>\n ICLR 2024\n </li>\n\n <li>\n <a href=\"URL target=\"_new\">\n Multi-Level Compositional Reasoning for Interactive Instruction Following\n </a>\n <br>\n <a href=\"URL target=\"_new\">\n Suvaansh Bhambri\n </a><sup>*</sup>,\n <a href=\"URL target=\"_new\">\n Byeonghwi Kim\n </a><sup>*</sup>,\n <a href=\"URL target=\"_new\">\n Jonghyun Choi\n </a>\n <br>\n AAAI 2023 (Oral)\n </li>\n \n <li>\n <a href=\"URL target=\"_new\">\n Factorizing Perception and Policy for Interactive Instruction Following\n </a>\n <br>\n <a href=\"URL target=\"_new\">\n Kunal Pratap Singh\n </a><sup>*</sup>,\n <a href=\"URL target=\"_new\">\n Suvaansh Bhambri\n </a><sup>*</sup>,\n <a href=\"URL target=\"_new\">\n Byeonghwi Kim\n </a><sup>*</sup>,\n <a href=\"URL target=\"_new\">\n Roozbeh Mottaghi\n </a>,\n <a href=\"URL target=\"_new\">\n Jonghyun Choi\n </a>.\n <br>\n ICCV 2021\n </li>\n \n <li>\n <a href=\"URL target=\"_new\">\n Agent with the Big Picture: Perceiving Surroundings for Interactive Instruction Following\n </a>\n <br>\n <a href=\"URL target=\"_new\">\n Byeonghwi Kim\n </a>,\n <a href=\"URL target=\"_new\">\n Suvaansh Bhambri\n </a>,\n <a href=\"URL target=\"_new\">\n Kunal Pratap Singh\n </a>,\n <a href=\"URL target=\"_new\">\n Roozbeh Mottaghi\n </a>,\n <a href=\"URL target=\"_new\">\n Jonghyun Choi\n </a>.\n <br>\n Embodied AI Workshop @ CVPR 2021\n </li>\n</ul>\n\nIf you find this repository useful, please cite this repository."
] |
0e8a74a84e8fee6ce959528ec6e1a70d93b9aa6f | # RefinedWeb 1M Medium
Curated RefinedWeb with medium context length (2048 <= ctx_len <= 8192) | vilm/refinedweb-1m-medium | [
"region:us"
] | 2024-01-20T17:09:44+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5454844691, "num_examples": 1000000}], "download_size": 3346600355, "dataset_size": 5454844691}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-20T17:15:34+00:00 | [] | [] | TAGS
#region-us
| # RefinedWeb 1M Medium
Curated RefinedWeb with medium context length (2048 <= ctx_len <= 8192) | [
"# RefinedWeb 1M Medium\nCurated RefinedWeb with medium context length (2048 <= ctx_len <= 8192)"
] | [
"TAGS\n#region-us \n",
"# RefinedWeb 1M Medium\nCurated RefinedWeb with medium context length (2048 <= ctx_len <= 8192)"
] |
643cfa84162c1a88791702c49bf0548be01f378f |
# Dataset Card for Evaluation run of ConvexAI/Metabird-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ConvexAI/Metabird-7B](https://huggingface.co/ConvexAI/Metabird-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ConvexAI__Metabird-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-20T17:15:55.821392](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Metabird-7B/blob/main/results_2024-01-20T17-15-55.821392.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6562196818092919,
"acc_stderr": 0.031888120668163725,
"acc_norm": 0.6572328630765639,
"acc_norm_stderr": 0.03253749439366192,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.579354979250579,
"mc2_stderr": 0.015314953137607924
},
"harness|arc:challenge|25": {
"acc": 0.6680887372013652,
"acc_stderr": 0.013760988200880541,
"acc_norm": 0.6953924914675768,
"acc_norm_stderr": 0.01344952210993249
},
"harness|hellaswag|10": {
"acc": 0.6965743875721968,
"acc_stderr": 0.0045879786255824785,
"acc_norm": 0.8754232224656443,
"acc_norm_stderr": 0.0032956349076664645
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.04966570903978529,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.04966570903978529
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.022755204959542946,
"acc_norm": 0.8,
"acc_norm_stderr": 0.022755204959542946
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113114,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660836,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45139664804469276,
"acc_stderr": 0.016643307372315876,
"acc_norm": 0.45139664804469276,
"acc_norm_stderr": 0.016643307372315876
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675602,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675602
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.579354979250579,
"mc2_stderr": 0.015314953137607924
},
"harness|winogrande|5": {
"acc": 0.8303078137332282,
"acc_stderr": 0.010549542647363698
},
"harness|gsm8k|5": {
"acc": 0.6285064442759667,
"acc_stderr": 0.013309839075706485
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ConvexAI__Metabird-7B | [
"region:us"
] | 2024-01-20T17:18:23+00:00 | {"pretty_name": "Evaluation run of ConvexAI/Metabird-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [ConvexAI/Metabird-7B](https://huggingface.co/ConvexAI/Metabird-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__Metabird-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T17:15:55.821392](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Metabird-7B/blob/main/results_2024-01-20T17-15-55.821392.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6562196818092919,\n \"acc_stderr\": 0.031888120668163725,\n \"acc_norm\": 0.6572328630765639,\n \"acc_norm_stderr\": 0.03253749439366192,\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.579354979250579,\n \"mc2_stderr\": 0.015314953137607924\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6680887372013652,\n \"acc_stderr\": 0.013760988200880541,\n \"acc_norm\": 0.6953924914675768,\n \"acc_norm_stderr\": 0.01344952210993249\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6965743875721968,\n \"acc_stderr\": 0.0045879786255824785,\n \"acc_norm\": 0.8754232224656443,\n \"acc_norm_stderr\": 0.0032956349076664645\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 
0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.022755204959542946,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.022755204959542946\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 
0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113114,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113114\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660836,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660836\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n 
\"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45139664804469276,\n \"acc_stderr\": 0.016643307372315876,\n \"acc_norm\": 0.45139664804469276,\n \"acc_norm_stderr\": 0.016643307372315876\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675602,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675602\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.579354979250579,\n \"mc2_stderr\": 0.015314953137607924\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8303078137332282,\n \"acc_stderr\": 0.010549542647363698\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6285064442759667,\n \"acc_stderr\": 0.013309839075706485\n }\n}\n```", "repo_url": "https://huggingface.co/ConvexAI/Metabird-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|arc:challenge|25_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|gsm8k|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hellaswag|10_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T17-15-55.821392.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T17-15-55.821392.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T17-15-55.821392.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T17-15-55.821392.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T17-15-55.821392.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T17-15-55.821392.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["**/details_harness|winogrande|5_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T17-15-55.821392.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T17_15_55.821392", "path": ["results_2024-01-20T17-15-55.821392.parquet"]}, {"split": "latest", "path": 
["results_2024-01-20T17-15-55.821392.parquet"]}]}]} | 2024-01-20T17:18:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ConvexAI/Metabird-7B
Dataset automatically created during the evaluation run of model ConvexAI/Metabird-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
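A minimal sketch (the repo and config names below follow the standard Open LLM Leaderboard naming pattern used by the other cards in this dump; `details_ConvexAI__Metabird-7B` and `harness_winogrande_5` are assumed from that convention rather than quoted from this card):

```python
from datasets import load_dataset

# Per-sample details for one task config; the "train" split points to the latest run.
data = load_dataset("open-llm-leaderboard/details_ConvexAI__Metabird-7B",
                    "harness_winogrande_5",
                    split="train")
```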
## Latest results
These are the latest results from run 2024-01-20T17:15:55.821392 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ConvexAI/Metabird-7B\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Metabird-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T17:15:55.821392(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ConvexAI/Metabird-7B\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Metabird-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T17:15:55.821392(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1326845b4e45e0ff3797356d3171ab216771cf4d |
* Top 1% conversations of https://huggingface.co/datasets/OpenAssistant/oasst2
* language-filtered: en
* generated using https://github.com/blancsw/deep_4_all/blob/main/datasets/oasst/convert.py
* assistant answers replaced with answers by [Mixtral-8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1)
* _Note_: This is an unfiltered dataset; it certainly contains some very bad answers. | g-ronimo/oasst2_top1_en_answers-mixtral | [
"license:apache-2.0",
"synthetic",
"region:us"
] | 2024-01-20T17:21:03+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "conversation", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 14166584, "num_examples": 5419}], "download_size": 7059605, "dataset_size": 14166584}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["synthetic"]} | 2024-02-05T05:27:34+00:00 | [] | [] | TAGS
#license-apache-2.0 #synthetic #region-us
|
* Top 1% conversations of URL
* language-filtered: en
* generated using URL
* assistant answers replaced with answers by Mixtral-8x7B
* _Note_: This is an unfiltered dataset; it certainly contains some very bad answers. | [] | [
"TAGS\n#license-apache-2.0 #synthetic #region-us \n"
] |
c4ae9e2058b3e6e8fb2a50d0f811f2891384994c |
* Top 1% conversations of https://huggingface.co/datasets/OpenAssistant/oasst2
* language-filtered: en
* generated using https://github.com/blancsw/deep_4_all/blob/main/datasets/oasst/convert.py
* assistant answers replaced with answers by [Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
* _Note_: This is an unfiltered dataset; it certainly contains some very bad answers.
| g-ronimo/oasst2_top1_en_answers-mistral | [
"license:apache-2.0",
"synthetic",
"region:us"
] | 2024-01-20T17:21:55+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "conversation", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 15564639, "num_examples": 5419}], "download_size": 8427747, "dataset_size": 15564639}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["synthetic"]} | 2024-02-05T05:27:58+00:00 | [] | [] | TAGS
#license-apache-2.0 #synthetic #region-us
|
* Top 1% conversations of URL
* language-filtered: en
* generated using URL
* assistant answers replaced with answers by Mistral-7B-Instruct-v0.2
* _Note_: This is an unfiltered dataset; it certainly contains some very bad answers.
| [] | [
"TAGS\n#license-apache-2.0 #synthetic #region-us \n"
] |
099d3ada6648cef2fa0ba9cd999ede795571135e | # Dataset Card for "noisy_vctk_16k_extract_unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Codec-SUPERB/noisy_vctk_16k_extract_unit | [
"region:us"
] | 2024-01-20T18:40:48+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "academicodec_hifi_16k_320d", "path": "data/academicodec_hifi_16k_320d-*"}, {"split": "academicodec_hifi_16k_320d_large_uni", "path": "data/academicodec_hifi_16k_320d_large_uni-*"}, {"split": "academicodec_hifi_24k_320d", "path": "data/academicodec_hifi_24k_320d-*"}, {"split": "audiodec_24k_320d", "path": "data/audiodec_24k_320d-*"}, {"split": "dac_16k", "path": "data/dac_16k-*"}, {"split": "dac_24k", "path": "data/dac_24k-*"}, {"split": "dac_44k", "path": "data/dac_44k-*"}, {"split": "encodec_24k", "path": "data/encodec_24k-*"}, {"split": "funcodec_en_libritts_16k_gr1nq32ds320", "path": "data/funcodec_en_libritts_16k_gr1nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_gr8nq32ds320", "path": "data/funcodec_en_libritts_16k_gr8nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds320", "path": "data/funcodec_en_libritts_16k_nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds640", "path": "data/funcodec_en_libritts_16k_nq32ds640-*"}, {"split": "funcodec_zh_en_16k_nq32ds320", "path": "data/funcodec_zh_en_16k_nq32ds320-*"}, {"split": "funcodec_zh_en_16k_nq32ds640", "path": "data/funcodec_zh_en_16k_nq32ds640-*"}, {"split": "speech_tokenizer_16k", "path": "data/speech_tokenizer_16k-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "unit", "sequence": {"sequence": "int64"}}], "splits": [{"name": "academicodec_hifi_16k_320d", "num_bytes": 116057288, "num_examples": 24792}, {"name": "academicodec_hifi_16k_320d_large_uni", "num_bytes": 116057288, "num_examples": 24792}, {"name": "academicodec_hifi_24k_320d", "num_bytes": 173451464, "num_examples": 24792}, {"name": "audiodec_24k_320d", "num_bytes": 370177480, "num_examples": 24792}, {"name": "dac_16k", "num_bytes": 743941448, "num_examples": 24792}, {"name": "dac_24k", "num_bytes": 2140233288, "num_examples": 24792}, {"name": "dac_44k", "num_bytes": 641562920, "num_examples": 24792}, {"name": "encodec_24k", "num_bytes": 87719016, "num_examples": 24792}, {"name": "funcodec_en_libritts_16k_gr1nq32ds320", "num_bytes": 926382152, "num_examples": 24792}, {"name": "funcodec_en_libritts_16k_gr8nq32ds320", "num_bytes": 926382152, "num_examples": 24792}, {"name": "funcodec_en_libritts_16k_nq32ds320", "num_bytes": 925795400, "num_examples": 24792}, {"name": "funcodec_en_libritts_16k_nq32ds640", "num_bytes": 466669128, "num_examples": 24792}, {"name": "funcodec_zh_en_16k_nq32ds320", "num_bytes": 925795400, "num_examples": 24792}, {"name": "funcodec_zh_en_16k_nq32ds640", "num_bytes": 466669128, "num_examples": 24792}, {"name": "speech_tokenizer_16k", "num_bytes": 232351304, "num_examples": 24792}], "download_size": 1435458632, "dataset_size": 9259244856}} | 2024-01-20T18:43:17+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "noisy_vctk_16k_extract_unit"
More Information needed | [
"# Dataset Card for \"noisy_vctk_16k_extract_unit\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"noisy_vctk_16k_extract_unit\"\n\nMore Information needed"
] |
95c15c678a2cabc2516fa052d85d048caf95eb25 |
# Dataset Card for Evaluation run of freecs/Zero-7B-test-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [freecs/Zero-7B-test-2](https://huggingface.co/freecs/Zero-7B-test-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_freecs__Zero-7B-test-2",
"harness_winogrande_5",
split="train")
```
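Similarly, the aggregated "results" configuration mentioned above can be loaded the same way. A sketch, assuming the split layout shown in the configs metadata of these evaluation datasets (a timestamped split plus "latest"):

```python
from datasets import load_dataset

# Aggregated metrics for the whole run; "latest" points to the newest results file.
results = load_dataset("open-llm-leaderboard/details_freecs__Zero-7B-test-2",
                       "results",
                       split="latest")
```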
## Latest results
These are the [latest results from run 2024-01-20T18:46:57.239901](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__Zero-7B-test-2/blob/main/results_2024-01-20T18-46-57.239901.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6310493818362459,
"acc_stderr": 0.032481623842244296,
"acc_norm": 0.6340213840269038,
"acc_norm_stderr": 0.03313260403421946,
"mc1": 0.42717258261933905,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.5995330460127621,
"mc2_stderr": 0.015385793036833406
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.014224250973257186,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.6450906193985262,
"acc_stderr": 0.0047750796365670966,
"acc_norm": 0.8477394941246763,
"acc_norm_stderr": 0.003585389636472374
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7290322580645161,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.7290322580645161,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.031584153240477114,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.031584153240477114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396993,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396993
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977945,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977945
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612907,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612907
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124063,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124063
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917669,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368982,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.01594930879023364,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.01594930879023364
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667885,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667885
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4589308996088657,
"acc_stderr": 0.012727084826799798,
"acc_norm": 0.4589308996088657,
"acc_norm_stderr": 0.012727084826799798
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983576,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.036845294917747066,
"acc_norm": 0.84,
"acc_norm_stderr": 0.036845294917747066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42717258261933905,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.5995330460127621,
"mc2_stderr": 0.015385793036833406
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625849
},
"harness|gsm8k|5": {
"acc": 0.5360121304018196,
"acc_stderr": 0.013736715929950318
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_freecs__Zero-7B-test-2 | [
"region:us"
] | 2024-01-20T18:49:19+00:00 | {"pretty_name": "Evaluation run of freecs/Zero-7B-test-2", "dataset_summary": "Dataset automatically created during the evaluation run of model [freecs/Zero-7B-test-2](https://huggingface.co/freecs/Zero-7B-test-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freecs__Zero-7B-test-2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T18:46:57.239901](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__Zero-7B-test-2/blob/main/results_2024-01-20T18-46-57.239901.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6310493818362459,\n \"acc_stderr\": 0.032481623842244296,\n \"acc_norm\": 0.6340213840269038,\n \"acc_norm_stderr\": 0.03313260403421946,\n \"mc1\": 0.42717258261933905,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.5995330460127621,\n \"mc2_stderr\": 0.015385793036833406\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6143344709897611,\n \"acc_stderr\": 0.014224250973257186,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6450906193985262,\n \"acc_stderr\": 0.0047750796365670966,\n \"acc_norm\": 0.8477394941246763,\n \"acc_norm_stderr\": 0.003585389636472374\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n 
\"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7290322580645161,\n \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.7290322580645161,\n \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.031584153240477114,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.031584153240477114\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396993,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396993\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977945,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977945\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612907,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612907\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124063,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124063\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917669,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368982,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368982\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n \"acc_stderr\": 0.01594930879023364,\n \"acc_norm\": 0.34972067039106147,\n \"acc_norm_stderr\": 0.01594930879023364\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667885,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667885\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n \"acc_stderr\": 0.012727084826799798,\n \"acc_norm\": 0.4589308996088657,\n \"acc_norm_stderr\": 0.012727084826799798\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983576,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983576\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.036845294917747066,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.036845294917747066\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42717258261933905,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.5995330460127621,\n \"mc2_stderr\": 0.015385793036833406\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625849\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5360121304018196,\n \"acc_stderr\": 
0.013736715929950318\n }\n}\n```", "repo_url": "https://huggingface.co/freecs/Zero-7B-test-2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|arc:challenge|25_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|gsm8k|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hellaswag|10_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T18-46-57.239901.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T18-46-57.239901.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T18-46-57.239901.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T18-46-57.239901.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T18-46-57.239901.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T18_46_57.239901", "path": ["**/details_harness|winogrande|5_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T18-46-57.239901.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_20T18_46_57.239901", "path": ["results_2024-01-20T18-46-57.239901.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T18-46-57.239901.parquet"]}]}]} | 2024-01-20T18:49:40+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of freecs/Zero-7B-test-2
Dataset automatically created during the evaluation run of model freecs/Zero-7B-test-2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
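A minimal loading example, reproducing the snippet embedded in this card's metadata; `harness_winogrande_5` is one of the 63 available configurations, and the `results` config can be substituted to get the aggregated metrics instead:

```python
from datasets import load_dataset

# Per-sample details for one evaluation task; swap the config name
# (e.g. "results") to load the aggregated metrics of the run instead.
data = load_dataset("open-llm-leaderboard/details_freecs__Zero-7B-test-2",
                    "harness_winogrande_5",
                    split="train")
```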
## Latest results
These are the latest results from run 2024-01-20T18:46:57.239901 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of freecs/Zero-7B-test-2\n\n\n\nDataset automatically created during the evaluation run of model freecs/Zero-7B-test-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T18:46:57.239901(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of freecs/Zero-7B-test-2\n\n\n\nDataset automatically created during the evaluation run of model freecs/Zero-7B-test-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-20T18:46:57.239901(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
24eba86dbefd1e4aa321105e7ed8de39fcf21d41 | # Dataset Card for "Quant-Trading-Instruct"
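The card itself is a placeholder, but this dump's metadata lists the dataset's features — `context`, `answer`, and `question`, all strings — and a single `train` split, so a minimal loading sketch (field names taken from that metadata, not from any official documentation) looks like:

```python
from datasets import load_dataset

# Fields per this dump's metadata: context, answer, question (all strings).
ds = load_dataset("lumalik/Quant-Trading-Instruct", split="train")
example = ds[0]
print(example["question"])
print(example["answer"])
```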
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | lumalik/Quant-Trading-Instruct | [
"region:us"
] | 2024-01-20T18:53:55+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "question", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1479263, "num_examples": 386}], "download_size": 485412, "dataset_size": 1479263}} | 2024-01-20T18:54:03+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Quant-Trading-Instruct"
More Information needed | [
"# Dataset Card for \"Quant-Trading-Instruct\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Quant-Trading-Instruct\"\n\nMore Information needed"
] |
206f3575579f1187548c6f47042ae9174c0a51fc |
# CC0 Stock Images Dataset
This dataset contains a collection of stock images that are covered by the Creative Commons Zero (CC0) License, meaning they are free for personal and commercial use with no attribution required. It is designed to support a variety of computer vision tasks such as image tagging, categorization, and machine learning model training.
## Disclaimer
While every effort has been made to ensure the reliability and correctness of the data presented, the dataset is provided "as is" without any guarantee. If you find any issues, please report them to the dataset maintainers.
## Dataset Structure
The dataset includes the following features:
- `image`: The raw bytes of the image, which can be read using image processing libraries like PIL or OpenCV.
- `tags`: A string containing comma-separated tags related to the content of the image.
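To make the record layout concrete, here is a minimal sketch of reading one example. It assumes the loading command shown under "Accessing the Dataset" below, and that the `datasets` library decodes the `image` feature for you (the raw-bytes fallback is defensive):

```python
from io import BytesIO

from datasets import load_dataset
from PIL import Image

# Load the dataset (same command as in "Accessing the Dataset" below).
dataset = load_dataset("KoalaAI/StockImages-CC0")

sample = dataset["train"][0]

# The `image` feature is usually decoded into a PIL image by the
# `datasets` library; if raw bytes come back instead, open them manually.
img = sample["image"]
if isinstance(img, (bytes, bytearray)):
    img = Image.open(BytesIO(img))
print(img.size, img.mode)

# Tags are stored as a single comma-separated string.
tags = [t.strip() for t in sample["tags"].split(",")]
print(tags)
```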
## Size of the Dataset
The size of the dataset is _1000_ images. (To be updated with more soon)
## Use Cases
This dataset can be used for a variety of purposes, including but not limited to:
- Training and evaluating image classification models.
- Developing and testing image tagging algorithms.
- Visual data analysis and machine learning research.
- Creating artwork and design projects.
## License
All images in this dataset are available under the CC0 License. You can copy, modify, distribute, and perform the work, even for commercial purposes, all without asking permission.
## Acknowledgements
This dataset has been compiled from various sources that provide free stock images under the CC0 License. We extend our gratitude to the photographers and creators who have contributed their work to the public domain.
## Accessing the Dataset
This dataset is hosted on the Hugging Face Hub. You can access and download the dataset using the Hugging Face `datasets` library with the following command:
```python
from datasets import load_dataset
dataset = load_dataset('KoalaAI/StockImages-CC0')
```
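As a follow-up, here is a hedged sketch of narrowing the dataset to images whose tag string mentions a keyword. The keyword `nature` is purely illustrative and is not guaranteed to appear in the tags:

```python
from datasets import load_dataset

dataset = load_dataset("KoalaAI/StockImages-CC0")

# Keep only records whose comma-separated tag string mentions "nature".
# Note: filtering passes full examples (decoded images), so it can be slow.
nature_only = dataset["train"].filter(
    lambda example: "nature" in example["tags"].lower()
)
print(f"{nature_only.num_rows} of {dataset['train'].num_rows} images matched")
```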
## Contributing
We welcome contributions to this dataset, whether it's adding more images, improving the tags, or any other improvements you can offer. Please follow the standard procedures for contributing to datasets on the Hugging Face Hub. | KoalaAI/StockImages-CC0 | [
"task_categories:image-to-text",
"task_categories:image-to-image",
"task_categories:text-to-image",
"size_categories:1K<n<10K",
"language:en",
"license:cc0-1.0",
"cc0",
"public domain",
"copyright-free",
"stock photos",
"images",
"region:us"
] | 2024-01-20T19:21:21+00:00 | {"language": ["en"], "license": "cc0-1.0", "size_categories": ["1K<n<10K"], "task_categories": ["image-to-text", "image-to-image", "text-to-image"], "pretty_name": "Stock Images CC0 (public domain)", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "tags", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 893124622.125, "num_examples": 3999}], "download_size": 888910102, "dataset_size": 893124622.125}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["cc0", "public domain", "copyright-free", "stock photos", "images"]} | 2024-02-05T15:23:27+00:00 | [] | [
"en"
] | TAGS
#task_categories-image-to-text #task_categories-image-to-image #task_categories-text-to-image #size_categories-1K<n<10K #language-English #license-cc0-1.0 #cc0 #public domain #copyright-free #stock photos #images #region-us
|
# CC0 Stock Images Dataset
This dataset contains a collection of stock images that are covered by the Creative Commons Zero (CC0) License, meaning they are free for personal and commercial use with no attribution required. It is designed to support a variety of computer vision tasks such as image tagging, categorization, and machine learning model training.
## Disclaimer
While every effort has been made to ensure the reliability and correctness of the data presented, the dataset is provided "as is" without any guarantee. If you find any issues, please report them to the dataset maintainers.
## Dataset Structure
The dataset includes the following features:
- 'image': The raw bytes of the image, which can be read using image processing libraries like PIL or OpenCV.
- 'tags': A string containing comma-separated tags related to the content of the image.
## Size of the Dataset
The size of the dataset is _1000_ images. (To be updated with more soon)
## Use Cases
This dataset can be used for a variety of purposes, including but not limited to:
- Training and evaluating image classification models.
- Developing and testing image tagging algorithms.
- Visual data analysis and machine learning research.
- Creating artwork and design projects.
## License
All images in this dataset are available under the CC0 License. You can copy, modify, distribute, and perform the work, even for commercial purposes, all without asking permission.
## Acknowledgements
This dataset has been compiled from various sources that provide free stock images under the CC0 License. We extend our gratitude to the photographers and creators who have contributed their work to the public domain.
## Accessing the Dataset
This dataset is hosted on the Hugging Face Hub. You can access and download the dataset using the Hugging Face 'datasets' library with the following command:
## Contributing
We welcome contributions to this dataset, whether it's adding more images, improving the tags, or any other improvements you can offer. Please follow the standard procedures for contributing to datasets on the Hugging Face Hub. | [
"# CC0 Stock Images Dataset\n\nThis dataset contains a collection of stock images that are covered by the Creative Commons Zero (CC0) License, meaning they are free for personal and commercial use with no attribution required. It is designed to support a variety of computer vision tasks such as image tagging, categorization, and machine learning model training.",
"## Disclaimer\nWhile every effort has been made to ensure the reliability and correctness of the data presented, the dataset is provided \"as is\" without any guarantee. If you find any issues, please report them to the dataset maintainers.",
"## Dataset Structure\n\nThe dataset includes the following features:\n\n- 'image': The raw bytes of the image, which can be read using image processing libraries like PIL or OpenCV.\n- 'tags': A string containing comma-separated tags related to the content of the image.",
"## Size of the Dataset\n\nThe size of the dataset is _1000_ images. (To be updated with more soon)",
"## Use Cases\n\nThis dataset can be used for a variety of purposes, including but not limited to:\n\n- Training and evaluating image classification models.\n- Developing and testing image tagging algorithms.\n- Visual data analysis and machine learning research.\n- Creating artwork and design projects.",
"## License\n\nAll images in this dataset are available under the CC0 License. You can copy, modify, distribute, and perform the work, even for commercial purposes, all without asking permission.",
"## Acknowledgements\n\nThis dataset has been compiled from various sources that provide free stock images under the CC0 License. We extend our gratitude to the photographers and creators who have contributed their work to the public domain.",
"## Accessing the Dataset\n\nThis dataset is hosted on the Hugging Face Hub. You can access and download the dataset using the Hugging Face 'datasets' library with the following command:",
"## Contributing\nWe welcome contributions to this dataset, whether it's adding more images, improving the tags, or any other improvements you can offer. Please follow the standard procedures for contributing to datasets on the Hugging Face Hub."
] | [
"TAGS\n#task_categories-image-to-text #task_categories-image-to-image #task_categories-text-to-image #size_categories-1K<n<10K #language-English #license-cc0-1.0 #cc0 #public domain #copyright-free #stock photos #images #region-us \n",
"# CC0 Stock Images Dataset\n\nThis dataset contains a collection of stock images that are covered by the Creative Commons Zero (CC0) License, meaning they are free for personal and commercial use with no attribution required. It is designed to support a variety of computer vision tasks such as image tagging, categorization, and machine learning model training.",
"## Disclaimer\nWhile every effort has been made to ensure the reliability and correctness of the data presented, the dataset is provided \"as is\" without any guarantee. If you find any issues, please report them to the dataset maintainers.",
"## Dataset Structure\n\nThe dataset includes the following features:\n\n- 'image': The raw bytes of the image, which can be read using image processing libraries like PIL or OpenCV.\n- 'tags': A string containing comma-separated tags related to the content of the image.",
"## Size of the Dataset\n\nThe size of the dataset is _1000_ images. (To be updated with more soon)",
"## Use Cases\n\nThis dataset can be used for a variety of purposes, including but not limited to:\n\n- Training and evaluating image classification models.\n- Developing and testing image tagging algorithms.\n- Visual data analysis and machine learning research.\n- Creating artwork and design projects.",
"## License\n\nAll images in this dataset are available under the CC0 License. You can copy, modify, distribute, and perform the work, even for commercial purposes, all without asking permission.",
"## Acknowledgements\n\nThis dataset has been compiled from various sources that provide free stock images under the CC0 License. We extend our gratitude to the photographers and creators who have contributed their work to the public domain.",
"## Accessing the Dataset\n\nThis dataset is hosted on the Hugging Face Hub. You can access and download the dataset using the Hugging Face 'datasets' library with the following command:",
"## Contributing\nWe welcome contributions to this dataset, whether it's adding more images, improving the tags, or any other improvements you can offer. Please follow the standard procedures for contributing to datasets on the Hugging Face Hub."
] |
e4e4d36f0b06dd3b0176b844af28ef4e0b297647 | # Dataset Card for "noisy_vctk_16k_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Codec-SUPERB/noisy_vctk_16k_synth | [
"region:us"
] | 2024-01-20T19:58:16+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "original", "path": "data/original-*"}, {"split": "academicodec_hifi_16k_320d", "path": "data/academicodec_hifi_16k_320d-*"}, {"split": "academicodec_hifi_16k_320d_large_uni", "path": "data/academicodec_hifi_16k_320d_large_uni-*"}, {"split": "academicodec_hifi_24k_320d", "path": "data/academicodec_hifi_24k_320d-*"}, {"split": "audiodec_24k_320d", "path": "data/audiodec_24k_320d-*"}, {"split": "dac_16k", "path": "data/dac_16k-*"}, {"split": "dac_24k", "path": "data/dac_24k-*"}, {"split": "dac_44k", "path": "data/dac_44k-*"}, {"split": "encodec_24k", "path": "data/encodec_24k-*"}, {"split": "funcodec_en_libritts_16k_gr1nq32ds320", "path": "data/funcodec_en_libritts_16k_gr1nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_gr8nq32ds320", "path": "data/funcodec_en_libritts_16k_gr8nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds320", "path": "data/funcodec_en_libritts_16k_nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds640", "path": "data/funcodec_en_libritts_16k_nq32ds640-*"}, {"split": "funcodec_zh_en_16k_nq32ds320", "path": "data/funcodec_zh_en_16k_nq32ds320-*"}, {"split": "funcodec_zh_en_16k_nq32ds640", "path": "data/funcodec_zh_en_16k_nq32ds640-*"}, {"split": "speech_tokenizer_16k", "path": "data/speech_tokenizer_16k-*"}]}], "dataset_info": {"features": [{"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "id", "dtype": "string"}], "splits": [{"name": "original", "num_bytes": 2298657432.0, "num_examples": 24792}, {"name": "academicodec_hifi_16k_320d", "num_bytes": 2292445936.0, "num_examples": 24792}, {"name": "academicodec_hifi_16k_320d_large_uni", "num_bytes": 2292445936.0, "num_examples": 24792}, {"name": "academicodec_hifi_24k_320d", "num_bytes": 3433060336.0, "num_examples": 24792}, {"name": "audiodec_24k_320d", "num_bytes": 3440000656.0, "num_examples": 24792}, {"name": "dac_16k", "num_bytes": 2294433456.0, "num_examples": 24792}, {"name": "dac_24k", "num_bytes": 3029679147.736, "num_examples": 24792}, {"name": "dac_44k", "num_bytes": 5564292936.952, "num_examples": 24792}, {"name": "encodec_24k", "num_bytes": 3029728087.144, "num_examples": 24792}, {"name": "funcodec_en_libritts_16k_gr1nq32ds320", "num_bytes": 2019057663.064, "num_examples": 24792}, {"name": "funcodec_en_libritts_16k_gr8nq32ds320", "num_bytes": 2019057663.064, "num_examples": 24792}, {"name": "funcodec_en_libritts_16k_nq32ds320", "num_bytes": 2019057663.064, "num_examples": 24792}, {"name": "funcodec_en_libritts_16k_nq32ds640", "num_bytes": 2019057663.064, "num_examples": 24792}, {"name": "funcodec_zh_en_16k_nq32ds320", "num_bytes": 2019057663.064, "num_examples": 24792}, {"name": "funcodec_zh_en_16k_nq32ds640", "num_bytes": 2019057663.064, "num_examples": 24792}, {"name": "speech_tokenizer_16k", "num_bytes": 2028915656.44, "num_examples": 24792}], "download_size": 44767386741, "dataset_size": 41818005558.656006}} | 2024-01-20T20:45:23+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "noisy_vctk_16k_synth"
More Information needed | [
"# Dataset Card for \"noisy_vctk_16k_synth\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"noisy_vctk_16k_synth\"\n\nMore Information needed"
] |
3aeb0f4ef0a45171bf518b36c35534c4a3cae4a3 |
# Dataset Card for Evaluation run of Danielbrdz/Barcenas-Tiny-1.1b-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Danielbrdz/Barcenas-Tiny-1.1b-DPO](https://huggingface.co/Danielbrdz/Barcenas-Tiny-1.1b-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Danielbrdz__Barcenas-Tiny-1.1b-DPO",
"harness_winogrande_5",
split="train")
```
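Once loaded, the split behaves like any other `datasets` split. A minimal, schema-agnostic sketch of inspecting it follows; the printed columns depend on the harness output and are not documented by this card:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Danielbrdz__Barcenas-Tiny-1.1b-DPO",
    "harness_winogrande_5",
    split="train",
)

# Inspect the split without assuming any particular schema.
print(data.num_rows, "rows")
print(data.column_names)
```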
## Latest results
These are the [latest results from run 2024-01-20T20:17:56.012496](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-Tiny-1.1b-DPO/blob/main/results_2024-01-20T20-17-56.012496.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.255499272307308,
"acc_stderr": 0.030734913787339856,
"acc_norm": 0.2563954397502199,
"acc_norm_stderr": 0.03147886224993774,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.37445718658128274,
"mc2_stderr": 0.013750963148166805
},
"harness|arc:challenge|25": {
"acc": 0.3464163822525597,
"acc_stderr": 0.013905011180063251,
"acc_norm": 0.3626279863481229,
"acc_norm_stderr": 0.014049106564955005
},
"harness|hellaswag|10": {
"acc": 0.4565823541127266,
"acc_stderr": 0.0049709334202319285,
"acc_norm": 0.6120294761999602,
"acc_norm_stderr": 0.004862919176408069
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.15555555555555556,
"acc_stderr": 0.03130948364878316,
"acc_norm": 0.15555555555555556,
"acc_norm_stderr": 0.03130948364878316
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198823,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198823
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918407,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918407
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2161290322580645,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.2161290322580645,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22167487684729065,
"acc_stderr": 0.029225575892489617,
"acc_norm": 0.22167487684729065,
"acc_norm_stderr": 0.029225575892489617
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.032876667586034886,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.032876667586034886
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.02937661648494564,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.02937661648494564
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.030031147977641545,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.030031147977641545
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.021840866990423077,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.021840866990423077
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882392,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882392
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.032365852526021574,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.032365852526021574
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695053,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695053
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.27350427350427353,
"acc_stderr": 0.029202540153431166,
"acc_norm": 0.27350427350427353,
"acc_norm_stderr": 0.029202540153431166
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2796934865900383,
"acc_stderr": 0.01605079214803655,
"acc_norm": 0.2796934865900383,
"acc_norm_stderr": 0.01605079214803655
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.022598703804321624,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.022598703804321624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23016759776536314,
"acc_stderr": 0.014078339253425814,
"acc_norm": 0.23016759776536314,
"acc_norm_stderr": 0.014078339253425814
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02564686309713791,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02564686309713791
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2198581560283688,
"acc_stderr": 0.024706141070705484,
"acc_norm": 0.2198581560283688,
"acc_norm_stderr": 0.024706141070705484
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23859191655801826,
"acc_stderr": 0.010885929742002202,
"acc_norm": 0.23859191655801826,
"acc_norm_stderr": 0.010885929742002202
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20955882352941177,
"acc_stderr": 0.024723110407677062,
"acc_norm": 0.20955882352941177,
"acc_norm_stderr": 0.024723110407677062
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528037,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528037
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348387,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348387
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.036643147772880864,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.036643147772880864
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.37445718658128274,
"mc2_stderr": 0.013750963148166805
},
"harness|winogrande|5": {
"acc": 0.6093133385951065,
"acc_stderr": 0.013712536036556653
},
"harness|gsm8k|5": {
"acc": 0.02047005307050796,
"acc_stderr": 0.003900413385915718
}
}
```
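For readers who want to recompute an aggregate from a dump like the one above, here is a minimal sketch. It assumes the JSON block has been saved to a local `results.json`, and the plain macro-average over the `hendrycksTest` subtasks is an illustration rather than the exact leaderboard formula:

```python
import json

# `results.json` is assumed to hold the JSON block shown above.
with open("results.json") as f:
    results = json.load(f)

# Macro-average the accuracy over the MMLU (hendrycksTest) subtasks.
mmlu_accs = [
    task["acc"]
    for name, task in results.items()
    if name.startswith("harness|hendrycksTest-")
]
print(f"MMLU macro-average over {len(mmlu_accs)} subtasks: "
      f"{sum(mmlu_accs) / len(mmlu_accs):.4f}")
```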
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Danielbrdz__Barcenas-Tiny-1.1b-DPO | [
"region:us"
] | 2024-01-20T20:19:44+00:00 | {"pretty_name": "Evaluation run of Danielbrdz/Barcenas-Tiny-1.1b-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [Danielbrdz/Barcenas-Tiny-1.1b-DPO](https://huggingface.co/Danielbrdz/Barcenas-Tiny-1.1b-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Danielbrdz__Barcenas-Tiny-1.1b-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T20:17:56.012496](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-Tiny-1.1b-DPO/blob/main/results_2024-01-20T20-17-56.012496.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.255499272307308,\n \"acc_stderr\": 0.030734913787339856,\n \"acc_norm\": 0.2563954397502199,\n \"acc_norm_stderr\": 0.03147886224993774,\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.37445718658128274,\n \"mc2_stderr\": 0.013750963148166805\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3464163822525597,\n \"acc_stderr\": 0.013905011180063251,\n \"acc_norm\": 0.3626279863481229,\n \"acc_norm_stderr\": 0.014049106564955005\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4565823541127266,\n \"acc_stderr\": 0.0049709334202319285,\n \"acc_norm\": 0.6120294761999602,\n \"acc_norm_stderr\": 0.004862919176408069\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.15555555555555556,\n \"acc_stderr\": 0.03130948364878316,\n \"acc_norm\": 0.15555555555555556,\n \"acc_norm_stderr\": 0.03130948364878316\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n \"acc_stderr\": 0.030631145539198823,\n \"acc_norm\": 0.2023121387283237,\n \"acc_norm_stderr\": 0.030631145539198823\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918407,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918407\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2161290322580645,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.2161290322580645,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.22167487684729065,\n \"acc_stderr\": 0.029225575892489617,\n \"acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.029225575892489617\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.032876667586034886,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.032876667586034886\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21717171717171718,\n \"acc_stderr\": 0.02937661648494564,\n \"acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.02937661648494564\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.030031147977641545,\n \"acc_norm\": 
0.22279792746113988,\n \"acc_norm_stderr\": 0.030031147977641545\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.021840866990423077,\n \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.021840866990423077\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882392,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882392\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3425925925925926,\n \"acc_stderr\": 0.032365852526021574,\n \"acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.032365852526021574\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695053,\n \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695053\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.34080717488789236,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n \"acc_stderr\": 0.029202540153431166,\n \"acc_norm\": 0.27350427350427353,\n \"acc_norm_stderr\": 0.029202540153431166\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 
0.04512608598542127\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2796934865900383,\n \"acc_stderr\": 0.01605079214803655,\n \"acc_norm\": 0.2796934865900383,\n \"acc_norm_stderr\": 0.01605079214803655\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.022598703804321624,\n \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.022598703804321624\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23016759776536314,\n \"acc_stderr\": 0.014078339253425814,\n \"acc_norm\": 0.23016759776536314,\n \"acc_norm_stderr\": 0.014078339253425814\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02564686309713791,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02564686309713791\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2198581560283688,\n \"acc_stderr\": 0.024706141070705484,\n \"acc_norm\": 0.2198581560283688,\n \"acc_norm_stderr\": 0.024706141070705484\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n \"acc_stderr\": 0.010885929742002202,\n \"acc_norm\": 0.23859191655801826,\n \"acc_norm_stderr\": 0.010885929742002202\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20955882352941177,\n \"acc_stderr\": 0.024723110407677062,\n \"acc_norm\": 0.20955882352941177,\n \"acc_norm_stderr\": 0.024723110407677062\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528037,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528037\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348387,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348387\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n \"acc_stderr\": 0.036643147772880864,\n \"acc_norm\": 0.3313253012048193,\n \"acc_norm_stderr\": 0.036643147772880864\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.37445718658128274,\n \"mc2_stderr\": 0.013750963148166805\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6093133385951065,\n \"acc_stderr\": 
0.013712536036556653\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02047005307050796,\n \"acc_stderr\": 0.003900413385915718\n }\n}\n```", "repo_url": "https://huggingface.co/Danielbrdz/Barcenas-Tiny-1.1b-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|arc:challenge|25_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|gsm8k|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hellaswag|10_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T20-17-56.012496.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T20-17-56.012496.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T20-17-56.012496.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T20-17-56.012496.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T20-17-56.012496.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["**/details_harness|winogrande|5_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-20T20-17-56.012496.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T20_17_56.012496", "path": ["results_2024-01-20T20-17-56.012496.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T20-17-56.012496.parquet"]}]}]} | 2024-01-20T20:20:04+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Danielbrdz/Barcenas-Tiny-1.1b-DPO
Dataset automatically created during the evaluation run of model Danielbrdz/Barcenas-Tiny-1.1b-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
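A minimal sketch, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming convention (the exact repository id is not stated in this card):

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's
# details_<org>__<model> naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_Danielbrdz__Barcenas-Tiny-1.1b-DPO",
    "harness_winogrande_5",  # one of the 63 task configurations listed above
    split="train",
)
```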
## Latest results
These are the latest results from run 2024-01-20T20:17:56.012496 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
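To read only the aggregated metrics rather than the per-task details, the "results" configuration with its "latest" split can be loaded the same way (again assuming the repository id sketched above):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always
# points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Danielbrdz__Barcenas-Tiny-1.1b-DPO",
    "results",
    split="latest",
)
```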
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
2c5cb58835b40f48fea042b71397b2a53562b71f |
Conversion of the [Amod/mental_health_counseling_conversations](https://huggingface.co/datasets/Amod/mental_health_counseling_conversations) dataset for use in pretraining.
Python code used for conversion:
```python
from datasets import load_dataset
import pandas
import re

# Load the source counseling conversations.
dataset = load_dataset("Amod/mental_health_counseling_conversations", split="train")

def format(columns):
    # Collapse runs of whitespace in each counselor response into single spaces.
    return re.sub(r'\s+', ' ', columns["Response"]).strip()

text = [format(columns) for columns in dataset]

# Drop empty strings and write a single "text" column for pretraining.
pandas.DataFrame({"text": list(filter(None, text))}).to_csv("train.csv", index=False)
```
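The resulting `train.csv` can then be loaded back as a plain-text pretraining corpus (a sketch; the file name matches the output of the script above):

```python
from datasets import load_dataset

# Load the converted single-column CSV produced by the script above.
pretrain = load_dataset("csv", data_files="train.csv", split="train")
print(pretrain[0]["text"])
```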
| Felladrin/pretrain-mental-health-counseling-conversations | [
"source_datasets:Amod/mental_health_counseling_conversations",
"license:openrail",
"region:us"
] | 2024-01-20T21:31:43+00:00 | {"license": "openrail", "source_datasets": ["Amod/mental_health_counseling_conversations"]} | 2024-01-23T21:49:49+00:00 | [] | [] | TAGS
#source_datasets-Amod/mental_health_counseling_conversations #license-openrail #region-us